Brad Nickel
Per #363, I didn't send any files to the LLM and got this message. Anything having to do with binary files was generated by gpt-pilot, if that was the cause,...
I got this message when I ran the curl command: {"error":"Unexpected endpoint or method. (GET /v1/chat/completions)"}. In the options, this is the only one that is off: Cross-Origin Resource Sharing (CORS).
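The "(GET /v1/chat/completions)" part of that error is the clue: OpenAI-style chat endpoints only accept POST, and a bare `curl <url>` with no `-d` or `-X` flag sends GET. A minimal sketch of a well-formed request using only the standard library, assuming LM Studio's default port 1234 and a placeholder model name:

```python
import json
import urllib.request

# Assumed default LM Studio server address; adjust port if yours differs.
url = "http://localhost:1234/v1/chat/completions"

payload = {
    "model": "local-model",  # placeholder name for illustration
    "messages": [{"role": "user", "content": "Hello"}],
}

# Building the request with an explicit POST method and JSON body is
# exactly what a bare curl GET was missing.
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it once the server is running.
```

The curl equivalent would be passing `-X POST`, a `Content-Type: application/json` header, and the JSON body via `-d`, rather than hitting the URL with a plain GET.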
Thanks. I turned that on and ran it again, and I am still getting the same message when I attempt to chat.
Just running from the command line in Terminal. Installed via pip.
zsh: /opt/homebrew/bin/pip: bad interpreter: /opt/homebrew/opt/[email protected]/bin/python3.11: no such file or directory

Name: urllib3
Version: 1.26.15
Summary: HTTP library with thread-safe connection pooling, file post, and more.
Home-page: https://urllib3.readthedocs.io/
Author: Andrey Petrov...
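That "bad interpreter" error means the `pip` script's shebang line points at a Python that no longer exists (here, a Homebrew 3.11 that was removed or upgraded away). Running `python3 -m pip ...` sidesteps the stale script entirely; reinstalling or relinking Python fixes it for good. A small sketch of checking a script's shebang for exactly this condition (the file paths in the test are made up for illustration):

```python
import os


def shebang_target(script_path):
    """Return the interpreter path from a script's shebang line, or None."""
    with open(script_path, "rb") as f:
        first = f.readline().decode("utf-8", "replace").strip()
    if first.startswith("#!"):
        return first[2:].split()[0]
    return None


def shebang_is_stale(script_path):
    """True if the shebang names an interpreter that no longer exists,
    which is exactly the condition zsh reports as "bad interpreter"."""
    target = shebang_target(script_path)
    return target is not None and not os.path.exists(target)
```

Running `shebang_is_stale("/opt/homebrew/bin/pip")` on the affected machine should return True until Python is reinstalled.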
Thanks. I actually did that about 10 minutes after I sent you that message. I now have a fresh install of 3.12. Sadly, though, I am still getting the same...
Is there a particular LLM that you guys know works well with LM Studio and OI? I am using variants of Mistral and Code Llama, but that may be an issue...
I am going to try again and will let you know.
Really would like to make this work. Any ideas @Notnaton or @ericrallen ?
OK, here is where I am: I have Open Interpreter installed in a Conda env running Python 3.11, and I have the LM Studio server running. I run interpreter --local and get the instructions to run LM Studio, etc. Then I...
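Before troubleshooting interpreter --local any further, it's worth confirming that something is actually listening on the LM Studio server's port. A minimal sketch, assuming LM Studio's default of localhost:1234 (adjust if yours differs):

```python
import socket


def server_reachable(host="localhost", port=1234, timeout=2.0):
    """Return True if something is accepting TCP connections on host:port.

    This only proves a server is listening; it does not validate that it
    speaks the OpenAI-compatible API.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If `server_reachable()` returns False while LM Studio's server tab says it is running, the port number or bind address in LM Studio is the first thing to double-check.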