auto-codebase-documenter
INFO:openai:error_code=context_length_exceeded error_message="This model's maximum context length is 4097 tokens. However, your messages resulted in 4651 tokens. Please reduce the length of the messages." error_param=messages error_type=invalid_request_error message='OpenAI API error received' stream_error=False
The token count overshoots the model's limit. How can we fix that?
It's here. It also seems to be hardcoded to gpt-3.5 for now :)
I believe that will be an issue for large files. It's a limitation of the underlying LLM being used; however, I will look to add some checks, appropriate warnings, and error handling for when that error is encountered in the next version.
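A pre-flight check along these lines could catch the overflow before the API call. This is only a sketch: the 4-characters-per-token ratio is a rough assumption (a real check would use the model's actual tokenizer, e.g. tiktoken), and the function names are hypothetical, not part of auto-codebase-documenter.

```python
# Hypothetical pre-flight length check before calling the OpenAI API.
# Assumption: ~4 characters per token, a crude heuristic; a real
# implementation would count tokens with the model's own tokenizer.
MAX_CONTEXT_TOKENS = 4097  # gpt-3.5-turbo limit from the error message above


def estimate_tokens(text: str) -> int:
    """Very rough token estimate based on character count."""
    return max(1, len(text) // 4)


def fits_in_context(messages: list[str], reserved_for_reply: int = 256) -> bool:
    """Warn the caller before the API rejects an oversized request."""
    used = sum(estimate_tokens(m) for m in messages)
    return used + reserved_for_reply <= MAX_CONTEXT_TOKENS
```

A file that fails this check could then be skipped with a warning, or split into chunks that are documented separately.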
In terms of the model being used, that can be overridden with a custom documenter_config.yaml file via the gpt_model attribute. gpt-4 should give you a larger input context.
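For reference, the override described above might look like this; only the `gpt_model` attribute is confirmed by this thread, so any other keys your config file contains would sit alongside it.

```yaml
# documenter_config.yaml — override the default (hardcoded) model
gpt_model: gpt-4
```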
Watch out for a new version dropping soon with these fixes.