
[Bug]: There was a problem with request to openai API: 'choices'

Open BorisMolch opened this issue 1 year ago • 6 comments

Version

Visual Studio Code extension

Operating System

Windows 10

What happened?

ENDPOINT=OPENROUTER

MODEL_NAME=anthropic/claude-3-opus

I create a new app and it seems to make a few successful calls, but then I get this error:

There was a problem with request to openai API: 'choices'
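
For context, the bare 'choices' in that message is just Python's string form of a KeyError on a missing "choices" key in the API response. A minimal sketch of how that can happen (illustrative only, not gpt-pilot's actual code):

```python
import requests

def get_completion(endpoint: str, payload: dict) -> str:
    try:
        response = requests.post(endpoint, json=payload, timeout=60)
        data = response.json()
        # If the server returns an error object or any non-OpenAI-shaped body,
        # there is no "choices" key here and this raises KeyError('choices').
        return data["choices"][0]["message"]["content"]
    except Exception as err:
        # str(KeyError('choices')) is "'choices'", which is what ends up after
        # "There was a problem with request to openai API: " in the message above.
        raise RuntimeError(f"There was a problem with request to openai API: {err}")
```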

BorisMolch avatar Mar 17 '24 23:03 BorisMolch

I have this too on Windows 11, trying to run through LM Studio (OpenAI prefs).

nessystudio avatar Mar 19 '24 00:03 nessystudio

This is because you are using "http://localhost:5001/v1" as the endpoint.

You should use: http://localhost:5001/v1/chat/completions

The 'choices' error happens because LM Studio doesn't receive the request on the route it expects and logs: [2024-03-19 20:18:18.629] [ERROR] Unexpected endpoint or method. (POST /). Returning 200 anyway
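
For reference, a hedged sketch of what the corrected local setup could look like in the .env (OPENAI_ENDPOINT is my assumption for the variable name; the port and model values are placeholders for whatever LM Studio is serving):

```
ENDPOINT=OPENAI
# Assumed variable name; point it at the full chat completions route, not the server root.
OPENAI_ENDPOINT=http://localhost:5001/v1/chat/completions
# Placeholder; use the model identifier LM Studio reports for the loaded model.
MODEL_NAME=local-model
```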

SuperMalinge avatar Mar 19 '24 19:03 SuperMalinge

Our recent release with Anthropic API support broke OpenRouter Anthropic support. We've since made a new release with a fix. @BorisMolch @nessystudio can you please try again and see if the latest GPT Pilot release fixes it?

(If you're using it through the Pythagora extension, it should auto-update on start; you can check Settings to verify you have the latest version.)

senko avatar Mar 23 '24 02:03 senko

This error is not related to LM Studio on localhost, or to the Anthropic API support breaking OpenRouter.

What happens is that, for whatever reason, GPT-pilot strips part of the LLM model name from .env if it contains a slash.

With OpenRouter you need to declare model names with a provider prefix, like this: anthropic/claude-3-opus, perplexity/pplx-70b-chat, meta-llama/codellama-34b-instruct, etc.

However, when GPT-pilot makes an API request it strips away the provider part of the model name, so instead of sending "meta-llama/codellama-34b-instruct" it sends "codellama-34b-instruct", which results in a "model not available" error like this:

There was a problem with request to openai API: API responded with status code: 404. Request token size: 23 tokens. Response text: {"error":{"message":"Model codellama-34b-instruct is not available","code":404},"user_id":"user_xxxxxxxxxxxxxxx"}
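
A minimal sketch of the kind of behavior being described (illustrative only, not the actual GPT-pilot code): if the configured model name is split on "/" and only the last segment is kept, the provider prefix that OpenRouter requires is lost.

```python
# Illustrative only: how keeping just the last path segment of a
# provider-qualified model ID drops the prefix OpenRouter needs.
model_name = "meta-llama/codellama-34b-instruct"  # value from .env

sent_model = model_name.split("/")[-1]
print(sent_model)   # codellama-34b-instruct -> OpenRouter answers with a 404
print(model_name)   # meta-llama/codellama-34b-instruct -> the ID OpenRouter expects
```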

marfal avatar Apr 10 '24 09:04 marfal

Writing the model as mistralai/mistralai/mixtral-8x7b-instruct worked for me with OpenRouter.
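
Presumably this works because only one leading path segment gets stripped, so doubling the provider prefix leaves the correct OpenRouter ID in the request, e.g.:

```
# After one leading "mistralai/" is stripped, the request still carries
# the full provider-qualified ID that OpenRouter expects.
MODEL_NAME=mistralai/mistralai/mixtral-8x7b-instruct
```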

MarioCodarin avatar Apr 11 '24 11:04 MarioCodarin

Writing the model as mistralai/mistralai/mixtral-8x7b-instruct worked for me with OpenRouter.

Great! I can confirm this workaround works. Thanks!

marfal avatar Apr 12 '24 12:04 marfal

https://github.com/Pythagora-io/gpt-pilot/wiki/Using-GPT-Pilot-with-Anthropic-models

techjeylabs avatar Apr 19 '24 17:04 techjeylabs