
Support context_window config for OpenAILike / Ollama when creating an LLM

Mini256 opened this issue 1 year ago • 0 comments

Although the current LLM creation page supports setting context_window in the JSON config, this setting is very easy to overlook. Omitting it leads to a reported error related to the token limit.
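For illustration, a JSON config like the one below would carry the setting. This is a hypothetical sketch, not autoflow's exact config schema; the `api_base` value and key names are assumptions:

```json
{
  "api_base": "http://localhost:11434/v1",
  "context_window": 32768
}
```

If the `context_window` key is left out, the provider falls back to a small default and long prompts fail with a token-limit error.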

The OpenAI LLM class maintains the context_window of each model internally, but third-party providers such as OpenAILike / Ollama do not; they fall back to LlamaIndex's default value (3900). (As a workaround, we currently raise the default value to 200000.)
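The fallback behavior described above can be sketched in a few lines. This is a minimal illustration, not autoflow's actual implementation; the helper name `resolve_context_window` is hypothetical, while the default of 3900 is the LlamaIndex value mentioned in this issue:

```python
import json

# Default context window LlamaIndex uses when the provider does not
# report one (the value mentioned in this issue).
DEFAULT_CONTEXT_WINDOW = 3900


def resolve_context_window(config_json: str,
                           default: int = DEFAULT_CONTEXT_WINDOW) -> int:
    """Return context_window from a JSON LLM config, or the default.

    Hypothetical helper for illustration only.
    """
    config = json.loads(config_json or "{}")
    return int(config.get("context_window", default))


# If the user forgets context_window, the small default silently applies,
# which later surfaces as a token-limit error on long prompts.
print(resolve_context_window('{"api_base": "http://localhost:11434/v1"}'))  # 3900
print(resolve_context_window('{"context_window": 32768}'))                  # 32768
```

Maintaining per-model defaults (as the OpenAI class does) or raising the fallback, as proposed here, both avoid the silent 3900 cap.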

Mini256 · Dec 26 '24 03:12