MetaGPT
Limit error when running program
When running the program, I get the following error. Is there a way to resolve the limitations?
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, you requested 4264 tokens (1264 in the messages, 3000 in the completion). Please reduce the length of the messages or completion.
Modify the config in config/key.yaml.
Change OPENAI_API_MODEL to gpt-3.5-turbo-16k or gpt-4, and change MAX_TOKENS to 5000.
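For reference, a minimal sketch of the two relevant entries in config/key.yaml after the change (other settings, such as the API key, stay as they are):

  OPENAI_API_MODEL: "gpt-3.5-turbo-16k"
  MAX_TOKENS: 5000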
Thank you for your response. I tried switching to the turbo model as you suggested and setting MAX_TOKENS to 5000, but now I get the following error. Please advise, and thanks for your help.
  result = await fn(*args, **kwargs)
  File "C:\MetaGPT\metagpt\actions\action.py", line 62, in _aask_v1
    instruct_content = output_class(**parsed_data)
  File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 7 validation errors for prd
Requirement Pool -> 0
  value is not a valid tuple (type=type_error.tuple)
Requirement Pool -> 1
  value is not a valid tuple (type=type_error.tuple)
Requirement Pool -> 2
  value is not a valid tuple (type=type_error.tuple)
Requirement Pool -> 3
  value is not a valid tuple (type=type_error.tuple)
Requirement Pool -> 4
  value is not a valid tuple (type=type_error.tuple)
Requirement Pool -> 5
  value is not a valid tuple (type=type_error.tuple)
Requirement Pool -> 6
  value is not a valid tuple (type=type_error.tuple)
This problem comes from the poor instruction following of gpt-3.5-turbo: it often fails to return its output in the exact structure the action expects (here, a list of pairs for the Requirement Pool). gpt-4 generally does not have this problem.
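For illustration, here is a minimal pydantic v1 sketch of why this error appears. The Prd class and field name below are a simplified, hypothetical stand-in for the generated prd output class (the real field is called "Requirement Pool"); when the model returns plain strings where a list of pairs is expected, each item fails tuple validation exactly as in the traceback above.

from typing import List, Tuple
from pydantic import BaseModel, ValidationError  # pydantic v1, as in the traceback

# Hypothetical, simplified stand-in for the generated "prd" output class;
# each entry in the pool is expected to be a pair such as (requirement, priority).
class Prd(BaseModel):
    requirement_pool: List[Tuple[str, str]]

# If the model returns plain strings instead of two-element pairs,
# every item fails tuple validation:
try:
    Prd(requirement_pool=["Add a login page", "Add a scoreboard"])
except ValidationError as err:
    print(err)  # each item reports: value is not a valid tuple (type=type_error.tuple)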