
GPT4 context_length is not working

Open cubesnyc opened this issue 2 months ago • 0 comments

With the following entry in my config file:

[components.llm.model]
@llm_models = "spacy.GPT-4.v3"
name = "gpt-4-turbo"
config = {"temperature": 0.0}
context_length = 110000

The context_length value seems to be read correctly from the config, since it suppresses the sharding context_length error. However, I am still getting context-length errors back from the OpenAI API saying that the corresponding text exceeds the model's token limit.
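One way to narrow this down is to pre-check each document's approximate token count against the configured limit before running the pipeline, to see which texts trigger the API error. This is a minimal sketch, not part of spacy-llm; the 4-characters-per-token ratio is a rough heuristic for English text, not OpenAI's real tokenizer, and the `reserved_for_completion` margin is an assumed value:

```python
# Hypothetical pre-check for prompts against a configured context_length.
# Assumes ~4 characters per token (rough English-text heuristic) and reserves
# headroom for the model's completion; neither number comes from spacy-llm.

def estimated_tokens(text: str) -> int:
    """Rough token estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)


def fits_context(text: str, context_length: int = 110000,
                 reserved_for_completion: int = 4096) -> bool:
    """Return True if the text likely fits, leaving room for the completion."""
    return estimated_tokens(text) + reserved_for_completion <= context_length


if __name__ == "__main__":
    short_text = "word " * 50
    print(fits_context(short_text))
```

Texts that fail this check would be candidates for sharding (or for a larger `context_length`); an exact count would require the model's actual tokenizer (e.g. the cl100k_base encoding).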

cubesnyc avatar Apr 15 '24 19:04 cubesnyc