
Maximum context length exceeded after `execute_shell`

Open gtx-cyber opened this issue 1 year ago • 4 comments

⚠️ Search for existing issues first ⚠️

  • [X] I have searched the existing issues, and there is no existing issue for my problem

Which Operating System are you using?

Linux

Which version of Auto-GPT are you using?

Latest Release

GPT-3 or GPT-4?

GPT-3.5

Steps to reproduce 🕹

The error came from installing a library within the AutoGPT process while running:

NEXT ACTION: COMMAND = execute_shell ARGUMENTS = {'command_line': 'pip install en_core_web_sm'}
Executing command 'pip install en_core_web_sm' in working directory '/home/appuser/auto_gpt_workspace'

Current behavior 😯

openai.error.InvalidRequestError: This model's maximum context length is 8191 tokens, however you requested 9956 tokens (9956 in your prompt; 0 for the completion). Please reduce your prompt; or completion length.

And the program terminates

Expected behavior 🤔

It should automatically reduce the token count instead of terminating.
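A minimal sketch of that kind of guard, assuming the oversized text (e.g. the captured `pip install` output) is available as a string before it is passed along to the OpenAI API; the `truncate_to_token_budget` helper and the budget value are illustrative, not AutoGPT's actual code:

```python
# Hypothetical sketch: clip text to a token budget before it is sent to the
# model, instead of letting the API call fail with InvalidRequestError.
import tiktoken

def truncate_to_token_budget(text: str, budget: int = 6000,
                             model: str = "gpt-3.5-turbo") -> str:
    """Return `text` cut down to at most `budget` tokens for `model`."""
    encoding = tiktoken.encoding_for_model(model)
    tokens = encoding.encode(text)
    if len(tokens) <= budget:
        return text
    # Keep the tail of the output, which usually holds the final status or
    # error message of the command.
    return "[truncated]\n" + encoding.decode(tokens[-budget:])

# e.g. shell_output = truncate_to_token_budget(shell_output, budget=2000)
```

The same idea applies whether the limit being hit is the chat model's context window or the embedding model's 8,191-token input limit; the budget just has to leave room for the rest of the request.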

Your prompt 📝

# Paste your prompt here

Your Logs 📒

<insert your logs here>

gtx-cyber · Apr 25 '23 18:04

I hit the same with:

NEXT ACTION:  COMMAND = execute_shell ARGUMENTS = {'command_line': 'pip list --outdated'}
Executing command 'pip list --outdated' in working directory '/Users/../Auto-GPT-0.2.2/auto_gpt_workspace'
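Same failure mode: the entire `pip list --outdated` output gets fed back into the model. A minimal sketch of one way to cap captured shell output before it reaches the prompt, assuming the command is run through `subprocess`; the helper and the 4,000-character cap are illustrative, not Auto-GPT 0.2.2's actual implementation:

```python
# Hypothetical sketch: capture shell output and clip it before it ever
# reaches the agent's prompt or memory.
import subprocess

MAX_OUTPUT_CHARS = 4000  # illustrative cap, not a project setting

def run_command_clipped(command_line: str) -> str:
    """Run a shell command and return at most MAX_OUTPUT_CHARS of output."""
    result = subprocess.run(
        command_line, shell=True, capture_output=True, text=True
    )
    output = (result.stdout or "") + (result.stderr or "")
    if len(output) > MAX_OUTPUT_CHARS:
        # Keep head and tail so both the command echo and the final
        # status/error message survive the clipping.
        half = MAX_OUTPUT_CHARS // 2
        output = output[:half] + "\n[... output clipped ...]\n" + output[-half:]
    return output

# e.g. run_command_clipped("pip list --outdated")
```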

sorokinvj · Apr 25 '23 19:04

I have experienced the same issue

perrosnk · Apr 25 '23 19:04

FYI, I reran mine after the same kind of crash, and when prompted I told it to "decrease token size because you keep erroring out", and it worked afterwards (I also manually accepted each prompt for a few rounds before giving it -n).

brngdsn · Apr 25 '23 23:04

I get error "NEXT ACTION: COMMAND = search_files ARGUMENTS = {'directory': '.'} Traceback (most recent call last): File "", line 198, in run_module_as_main File "", line 88, in run_code File "C:\Users\Dell User\OneDrive\Desktop\darren\Auto-GPT\autogpt_main.py", line 53, in main() File "C:\Users\Dell User\OneDrive\Desktop\darren\Auto-GPT\autogpt_main.py", line 49, in main agent.start_interaction_loop() File "C:\Users\Dell User\OneDrive\Desktop\darren\Auto-GPT\autogpt\agent\agent.py", line 170, in start_interaction_loop self.memory.add(memory_to_add) File "C:\Users\Dell User\OneDrive\Desktop\darren\Auto-GPT\autogpt\memory\local.py", line 76, in add embedding = create_embedding_with_ada(text) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\Dell User\OneDrive\Desktop\darren\Auto-GPT\autogpt\llm_utils.py", line 137, in create_embedding_with_ada return openai.Embedding.create( ^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\Dell User\AppData\Roaming\Python\Python311\site-packages\openai\api_resources\embedding.py", line 33, in create response = super().create(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\Dell User\AppData\Roaming\Python\Python311\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 153, in create response, _, api_key = requestor.request( ^^^^^^^^^^^^^^^^^^ File "C:\Users\Dell User\AppData\Roaming\Python\Python311\site-packages\openai\api_requestor.py", line 226, in request resp, got_stream = self._interpret_response(result, stream) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\Dell User\AppData\Roaming\Python\Python311\site-packages\openai\api_requestor.py", line 619, in _interpret_response self._interpret_response_line( File "C:\Users\Dell User\AppData\Roaming\Python\Python311\site-packages\openai\api_requestor.py", line 682, in _interpret_response_line raise self.handle_error_response( openai.error.InvalidRequestError: This model's maximum context length is 8191 tokens, however you requested 163109 tokens (163109 in your prompt; 0 for the completion). Please reduce your prompt; or completion length.

C:\Users\Dell User\OneDrive\Desktop\darren\Auto-GPT>" I think this is about tokens any help would be appriciated.

DMTarmey · Apr 26 '23 07:04
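For the traceback above: the failure happens inside create_embedding_with_ada, and text-embedding-ada-002 accepts at most 8,191 input tokens, so a 163,109-token memory entry has to be split (or clipped) before it is embedded. A minimal sketch of that chunking with tiktoken; the `chunk_for_embedding` helper is illustrative, not the project's code:

```python
# Hypothetical sketch: split an oversized memory entry into chunks that fit
# the 8,191-token input limit of text-embedding-ada-002 before embedding.
import tiktoken

EMBEDDING_MODEL = "text-embedding-ada-002"
MAX_INPUT_TOKENS = 8191

def chunk_for_embedding(text: str, max_tokens: int = MAX_INPUT_TOKENS) -> list[str]:
    """Split `text` into pieces of at most `max_tokens` tokens each."""
    encoding = tiktoken.encoding_for_model(EMBEDDING_MODEL)
    tokens = encoding.encode(text)
    return [
        encoding.decode(tokens[i:i + max_tokens])
        for i in range(0, len(tokens), max_tokens)
    ]

# Each chunk is then small enough to embed on its own, e.g. with the
# legacy openai 0.x client used by Auto-GPT at the time:
#   import openai
#   for chunk in chunk_for_embedding(memory_to_add):
#       openai.Embedding.create(input=[chunk], model=EMBEDDING_MODEL)
```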