
Error: This model's maximum context length is 4097 tokens. However, your messages resulted in 8285 tokens. Please reduce the length of the messages.

Open irthomasthomas opened this issue 5 months ago • 0 comments

Hi, I'm seeing the following error when using the default model (gpt-3.5-turbo):

Error: This model's maximum context length is 4097 tokens. However, your messages resulted in 8285 tokens. Please reduce the length of the messages.

Does llm default to the new gpt-3.5-turbo-1106 model? I've noticed that in the models section of the configuration file there are separate entries for gpt-4 and gpt-4-1106-preview, but only one entry for gpt-3.5-turbo. Does this mean I need to add the 1106 model to the OpenAI models YAML file myself?
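If adding the model does turn out to be necessary, an entry along these lines might work. This is a sketch based on llm's extra-openai-models.yaml mechanism for registering additional OpenAI models; the exact field names and the alias chosen here are assumptions to verify against the llm documentation:

```yaml
# Hypothetical entry for extra-openai-models.yaml in the llm config
# directory -- field names should be checked against the llm docs.
- model_id: gpt-3.5-turbo-1106
  model_name: gpt-3.5-turbo-1106
  aliases: ["3.5-1106"]
```

With something like this in place, `llm models` should list the new entry, and `llm -m gpt-3.5-turbo-1106 "..."` would target the 1106 model (which has a larger context window than the original 4k gpt-3.5-turbo) explicitly rather than relying on the default.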

Thanks.

irthomasthomas · Jan 20, 2024