dspy
feature(dsp): Remove default max_tokens param from OpenAI
When sending data to models with large context windows, this default imposes an artificial cap on outbound text, limiting it to at most 8000 tokens. You are unable to set a larger max_tokens value because of this built-in limit, and with a small value it is hard to get more complex prompts to work for knowledge graph generation.
OpenAI itself leaves max_tokens unset (effectively infinite) by default, which is optimal.
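For context, here is a minimal sketch of the pattern this PR targets; the structure and values are illustrative assumptions, not the actual dspy source:

```python
# Illustrative sketch only -- not the actual dspy source. It shows the
# pattern the PR removes: an LM wrapper that bakes a max_tokens default
# into every request unless the caller overrides it.
DEFAULT_KWARGS = {
    "temperature": 0.0,
    "max_tokens": 150,  # assumed default; acts as a hard cap on every completion
}

def build_request_kwargs(**overrides):
    """Merge caller-supplied overrides over the wrapper defaults."""
    return {**DEFAULT_KWARGS, **overrides}
```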
Hi @Vasilije1990, thanks for the PR. Unfortunately, this can't be merged since it will break existing caches. Why can't you set max_tokens within your LM configuration?
Hi @arnavsinghvi11 My understanding was that DSPy overrides the max_tokens setting on the API. This is why I wanted to make it non-explicit. I would gladly drop this if I am wrong. How do you mean to set it on the API?
Hi @Vasilije1990, your understanding is correct: you can pass dspy.OpenAI(max_tokens=250, ...) to override the existing default arguments, which doesn't require removing the currently set defaults. The existing caches that make the DSPy intro.ipynb notebook runnable without an api_key rely on this default behavior, and removing the default definitions would break them. Let me know if that makes sense so we can close this PR.
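To make the suggested workaround concrete, here is a short sketch of overriding the default at configuration time, based on the dspy.OpenAI(max_tokens=250, ...) call mentioned above; the model name and token value are example choices, not recommendations:

```python
import dspy

# Override the wrapper's default max_tokens at construction time instead of
# removing the default from the library (which would invalidate existing caches).
lm = dspy.OpenAI(
    model="gpt-3.5-turbo",  # example model; substitute your own
    max_tokens=4000,        # example value; raise this for long generations
)

# Make this LM the default for all DSPy modules in the program.
dspy.settings.configure(lm=lm)
```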