gpt-pilot
[Bug]: There was a problem with request to openai API: 'choices'
Version
Visual Studio Code extension
Operating System
Windows 10
What happened?
ENDPOINT=OPENROUTER
MODEL_NAME=anthropic/claude-3-opus
I create a new app and it seems to make a few successful calls, but then I get this error:
There was a problem with request to openai API: 'choices'
I have this too on Windows 11, trying to run through LM Studio (OpenAI prefs).
This is because you are using: http://localhost:5001/v1
You should use: http://localhost:5001/v1/chat/completions
This 'choices' error occurs because LM Studio doesn't receive the request on the expected endpoint, and its log shows: [2024-03-19 20:18:18.629] [ERROR] Unexpected endpoint or method. (POST /). Returning 200 anyway
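For anyone pointing gpt-pilot at a local LM Studio server, a minimal .env sketch of the fix described above might look like this (the OPENAI_ENDPOINT variable name and port 5001 are assumptions based on this thread; check your own setup):

```shell
# Hypothetical .env for a local LM Studio server.
# The full chat-completions path must be included; pointing at the
# /v1 base alone makes LM Studio log "Unexpected endpoint or method".
OPENAI_ENDPOINT=http://localhost:5001/v1/chat/completions
```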
Our recent release with Anthropic API support messed up OpenRouter Anthropic support. We've since made a new release with a fix. @BorisMolch @nessystudio can you please try again and see if the latest GPT Pilot release fixes it?
(if using through the Pythagora extension, it should autoupdate on start, you can check Settings to verify the latest version)
This error is not related to LM Studio on localhost, or to the Anthropic API messing up OpenRouter.
What happens is that, for whatever reason, GPT Pilot strips part of the LLM model name from .env if the name contains a slash.
With OpenRouter you need to declare model names with a provider prefix, like this: anthropic/claude-3-opus, perplexity/pplx-70b-chat, meta-llama/codellama-34b-instruct, etc.
However, when GPT Pilot makes an API request it strips away the provider part of the model name, so instead of sending "meta-llama/codellama-34b-instruct" it sends "codellama-34b-instruct", which results in a "model not available" error like this:
There was a problem with request to openai API: API responded with status code: 404. Request token size: 23 tokens. Response text: {"error":{"message":"Model codellama-34b-instruct is not available","code":404},"user_id":"user_xxxxxxxxxxxxxxx"}
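The reported behavior is consistent with the model name being split on its first slash and only the remainder being sent. A minimal Python sketch of that suspected behavior (the function name is hypothetical, not gpt-pilot's actual code):

```python
def strip_provider(model_name: str) -> str:
    """Suspected gpt-pilot behavior: drop everything up to and
    including the first slash in the model name."""
    # rsplit is not used: only the FIRST path segment is removed,
    # which is why duplicating the provider prefix works around it.
    return model_name.split("/", 1)[-1]

# OpenRouter needs "meta-llama/codellama-34b-instruct", but after
# stripping only "codellama-34b-instruct" is sent -> 404.
print(strip_provider("meta-llama/codellama-34b-instruct"))
```

This also explains the workaround below: with a doubled prefix like mistralai/mistralai/mixtral-8x7b-instruct, stripping one segment still leaves a valid OpenRouter model ID.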
Writing the model as mistralai/mistralai/mixtral-8x7b-instruct (duplicating the provider prefix) worked for me with OpenRouter.
Great! I can confirm this workaround works. Thanks!
https://github.com/Pythagora-io/gpt-pilot/wiki/Using-GPT-Pilot-with-Anthropic-models