spacy-llm
GPT-4 `context_length` is not working
I have the following entry in my config file:
```ini
[components.llm.model]
@llm_models = "spacy.GPT-4.v3"
name = "gpt-4-turbo"
config = {"temperature": 0.0}
context_length = 110000
```
The `context_length` value seems to be read correctly from the config, since it suppresses the sharding `context_length` error. However, I am still getting context length errors back from OpenAI, saying the corresponding text exceeds the model's token limit.
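One possible explanation worth checking: OpenAI counts prompt tokens plus completion tokens against the model's context window, so a shard sized right up to `context_length` can still overflow once the response is added. The sketch below is a hypothetical pre-check (not spacy-llm's actual sharding logic) that uses a rough characters-per-token heuristic; the `CONTEXT_LENGTH` value matches the config above, and the 4-chars-per-token ratio and output reservation are assumptions.

```python
# Heuristic pre-check for prompt size before sending text to the API.
# Assumption: ~4 characters per token (a rough average for English text).
# This is a sketch, not spacy-llm's internal sharding implementation.

CONTEXT_LENGTH = 110_000  # value from the config above
CHARS_PER_TOKEN = 4       # rough heuristic; a real tokenizer is more accurate

def estimated_tokens(text: str) -> int:
    """Cheap token estimate without loading a tokenizer."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_context(text: str, reserved_for_output: int = 4096) -> bool:
    """True if the prompt likely fits once response tokens are reserved.

    OpenAI counts prompt + completion tokens against the context window,
    so headroom for the model's answer must be reserved as well.
    """
    return estimated_tokens(text) + reserved_for_output <= CONTEXT_LENGTH

# A ~1,000,000-character document blows past the limit even though
# sharding by context_length alone might have accepted smaller pieces.
doc_text = "word " * 200_000
print(fits_context(doc_text))  # False
```

If shards sized near the limit still trigger API errors, lowering `context_length` well below the model's advertised window (to leave room for the prompt template and the response) may help narrow down whether this is the cause.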