I had the same issue; your code fixed that problem!
I'm facing the same error (using `stable`) when working with code files; Auto-GPT crashes when trying to read large files: `openai.error.InvalidRequestError: This model's maximum context length is 8191 tokens, however...`
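The crash happens because the whole file is pushed into a single prompt, which blows past the model's context window. A minimal, hypothetical workaround sketch (not Auto-GPT's actual code; the 8,000-character limit is a placeholder, a real fix would count tokens) is to split the file before each call:

```python
from typing import List

def split_into_chunks(text: str, max_chars: int = 8000) -> List[str]:
    """Naively split text by character count so each piece fits the context window."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def read_large_file(path: str) -> List[str]:
    """Read a file and return it as a list of context-sized chunks."""
    with open(path, "r", encoding="utf-8") as f:
        content = f.read()
    return split_into_chunks(content)
```

Each chunk can then be summarized or processed separately instead of sending the entire file at once.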
Where in Azure do you find `gpt35-deployment-id-for-azure`? I have deployed the model but I can't find that ID... any thoughts?
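It isn't an ID you look up in the portal; it's the deployment name you chose when you created the deployment under Deployments in Azure OpenAI Studio. That name goes into Auto-GPT's `azure.yaml`. A sketch of that file, based on the `azure.yaml.template` shipped with the repo around this version (double-check the field names against your copy):

```yaml
azure_api_type: azure
azure_api_base: https://<your-resource-name>.openai.azure.com
azure_api_version: 2023-03-15-preview
azure_model_map:
  fast_llm_model_deployment_id: <your-gpt-35-turbo-deployment-name>
  smart_llm_model_deployment_id: <your-gpt-4-deployment-name>
  embedding_model_deployment_id: <your-embedding-deployment-name>
```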
Is there a working solution for this issue?
> I think by default we want agents to work this way; that you can end a session for whatever reason and then return where it left off.

I couldn't...
> Following @BaltekLabs's [solution](https://github.com/Significant-Gravitas/Auto-GPT/issues/430#issuecomment-1506816266), I wanted to add some code to Auto-GPT to do it. First question is whether `auto-gpt.json` has all the conversation in it or we...
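Before wiring anything in, a quick hypothetical way to check what `auto-gpt.json` actually contains (the path is just wherever your install writes the file):

```python
import json

# Illustrative only: peek inside auto-gpt.json to see whether the full
# conversation history is stored there.
with open("auto-gpt.json", "r", encoding="utf-8") as f:
    memory = json.load(f)

print(type(memory).__name__)
if isinstance(memory, dict):
    for key, value in memory.items():
        size = len(value) if hasattr(value, "__len__") else value
        print(f"{key}: {type(value).__name__} ({size})")
```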
To bypass the issue I've commented out the following lines in the `write_to_file` function in `/workspaces/Auto-GPT-0.2.2/autogpt/commands/file_operations.py`: `# if check_duplicate_operation("write", filename):` `# return "Error: File has already been updated."`
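For context, a standalone sketch of what the patched function roughly looks like; the guard lines are the ones from 0.2.2, but the surrounding body is simplified (the real `file_operations.py` has its own workspace path handling and an operations log):

```python
import os

WORKING_DIRECTORY = "auto_gpt_workspace"  # placeholder for the real workspace path

def write_to_file(filename: str, text: str) -> str:
    """Sketch of write_to_file with the duplicate-operation guard disabled."""
    # Workaround: skip the check that returned
    # "Error: File has already been updated."
    # if check_duplicate_operation("write", filename):
    #     return "Error: File has already been updated."
    try:
        filepath = os.path.join(WORKING_DIRECTORY, filename)
        os.makedirs(os.path.dirname(filepath) or ".", exist_ok=True)
        with open(filepath, "w", encoding="utf-8") as f:
            f.write(text)
        return "File written to successfully."
    except Exception as e:
        return f"Error: {e}"
```

Note this only skips the duplicate check; the log entry that normally records the write is left out of the sketch.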
A GPU would be very useful.
more info:

Distributor ID: Ubuntu
Description: Ubuntu 18.04.3 LTS
Release: 18.04
Codename: bionic

yowsup-cli v3.2.0
yowsup v3.2.3
consonance v0.1.3-1
dissononce v0.34.3
python-axolotl v0.2.2
cryptography v2.8
protobuf v3.11.3
Python 3.6.9

```...
another clean setup, same problem:

Distributor ID: Ubuntu
Description: Ubuntu 18.04.3 LTS
Release: 18.04
Codename: bionic

yowsup-cli v3.2.0
yowsup v3.2.3
consonance v0.1.3-1
dissononce v0.34.3
python-axolotl v0.2.2
cryptography v2.8
protobuf v3.11.3...