
feature(dsp) Remove default max token param from OpenAI

Open Vasilije1990 opened this issue 4 months ago • 3 comments

When sending data to models with large context windows, the hard-coded default imposes an artificial cap on outbound text, limiting it to at most 8000 tokens. This built-in limit prevents setting a large max_tokens value, and with a small value it is hard to get more complex prompts to work for knowledge graph generation.

OpenAI itself defaults max_tokens to infinity, which is optimal.

Vasilije1990 avatar Apr 19 '24 12:04 Vasilije1990

Hi @Vasilije1990, thanks for the PR. Unfortunately, this can't be merged, since it would break existing caches. Why can't you set max_tokens within your LM configuration instead?

arnavsinghvi11 avatar Apr 28 '24 00:04 arnavsinghvi11
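The maintainer's suggestion, overriding the default at LM construction time rather than removing it from the library, can be sketched generically. The class below is illustrative, not DSPy's actual internals; the only assumption taken from the thread is that a small library default applies unless the caller passes their own max_tokens:

```python
# Minimal sketch of an LM wrapper whose constructor default caps output
# length, and how a caller-supplied max_tokens overrides it.
# (Class name, default value, and kwargs layout are illustrative.)

DEFAULT_MAX_TOKENS = 150  # a small library default, as discussed in the issue


class LM:
    def __init__(self, model, **kwargs):
        self.model = model
        # The default applies only when the caller does not override it:
        # caller-supplied kwargs win because they are merged last.
        self.kwargs = {"max_tokens": DEFAULT_MAX_TOKENS, **kwargs}


# Default configuration: output is capped at the library default.
capped = LM("gpt-4")

# Caller override: a larger budget for long generations such as
# knowledge-graph extraction.
roomy = LM("gpt-4", max_tokens=8000)

print(capped.kwargs["max_tokens"])  # 150
print(roomy.kwargs["max_tokens"])   # 8000
```

In DSPy at the time, the equivalent would have been something along the lines of `dspy.OpenAI(model="gpt-4", max_tokens=8000)` passed to `dspy.settings.configure(lm=...)`, which also explains the cache concern: cached completions are keyed on the request parameters, so changing the default would invalidate existing caches.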