autoflow
Support context_window config for OpenAILike / Ollama when creating an LLM
Although the current LLM creation page supports setting context_window in the JSON config, this option is easy to overlook. Leaving it unset leads to errors related to the token limit.
The OpenAI LLM class maintains the context_window of each model internally, but third-party providers such as OpenAILike / Ollama do not; they fall back to LlamaIndex's default value (3900). (As a temporary workaround, the default has currently been raised to 200000.)
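For illustration, a minimal sketch of passing context_window explicitly when constructing these LLMs through LlamaIndex (model names, URLs, and window sizes below are placeholder assumptions, not autoflow's actual wiring):

```python
# Sketch: set context_window explicitly on OpenAILike / Ollama so they do not
# fall back to LlamaIndex's default context window (3900 tokens).
from llama_index.llms.ollama import Ollama
from llama_index.llms.openai_like import OpenAILike

# Hypothetical self-hosted OpenAI-compatible endpoint.
openai_like_llm = OpenAILike(
    model="my-self-hosted-model",        # placeholder model name
    api_base="http://localhost:8000/v1", # placeholder endpoint
    api_key="fake",
    context_window=32768,  # without this, LlamaIndex assumes 3900 tokens
    is_chat_model=True,
)

# Hypothetical local Ollama instance.
ollama_llm = Ollama(
    model="llama3",                       # placeholder model name
    base_url="http://localhost:11434",
    context_window=8192,  # should match the model's actual context length
)
```

Setting context_window to the model's real limit lets LlamaIndex's token accounting size prompts correctly, instead of truncating at the 3900-token default or trusting an inflated 200000 value.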