AutoGPT
Command browse_website returned: Error: This model's maximum context length is 4097 tokens. However, your messages resulted in 5023 tokens. Please reduce the length of the messages.
The model token length limit defined in .env does not seem to propagate to all relevant parts of the code.
I am using gpt-3.5-turbo and have set the token limits to 4000 in .env.
I am using the latest stable branch, on Windows, invoked with run.bat.
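For reference, a minimal .env sketch of the settings involved; the variable names below are assumed from the stock .env.template of that release and may differ:

# LLM model / token settings (sketch; variable names assumed, not confirmed)
FAST_TOKEN_LIMIT=4000
SMART_TOKEN_LIMIT=4000
# Chunk size used by browse_website when storing/summarizing page text
BROWSE_CHUNK_MAX_LENGTH=3000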
################################################################################
AUTO-GPT - GENERAL SETTINGS
################################################################################
BROWSE_CHUNK_MAX_LENGTH - When browsing a website, define the length of the chunk stored in memory
BROWSE_CHUNK_MAX_LENGTH=3000 <----------- set this to 3000, or under 4000; it's probably set to 8172 or something
Nope. Already at 3000.
read_file has the same issue.
I put mine to 2k and no errors since.
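The arithmetic behind that observation: the chunk is only part of what gets sent, so the chunk budget has to leave headroom for the summarization instructions and the reply. A minimal sketch of that budget check, not AutoGPT's actual code; the overhead figures are illustrative assumptions:

import tiktoken  # pip install tiktoken

CONTEXT_WINDOW = 4097      # gpt-3.5-turbo
PROMPT_OVERHEAD = 600      # rough allowance for the summarization prompt (assumption)
COMPLETION_RESERVE = 1000  # rough allowance for the model's reply (assumption)

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

def chunk_fits(chunk_text: str) -> bool:
    # The request fails when the messages alone exceed the window, so the
    # chunk plus overhead must stay within CONTEXT_WINDOW.
    used = len(enc.encode(chunk_text)) + PROMPT_OVERHEAD + COMPLETION_RESERVE
    return used <= CONTEXT_WINDOW

# With ~3000-token chunks: 3000 + 600 + 1000 = 4600 > 4097 -> the error above.
# With ~2000-token chunks: 2000 + 600 + 1000 = 3600 <= 4097 -> fits.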
Closing as duplicate of #796. Please post there if the issue is still relevant.