gpt-pilot
Fix OPENROUTER problem with the model names
If you try to use OpenRouter with any model, you get an error:
{"error":{"message":"Model llama-3-70b-instruct:nitro is not available","code":404}, ...
because OpenRouter needs the provider name together with the model name in one string.
This commit fixes that for OpenRouter, so you can set:
MODEL_NAME="meta-llama/llama-3-70b-instruct:nitro"
in the config and it will work.
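The fix boils down to routing model-name strings: litellm only dispatches a request to OpenRouter when the model string carries the `openrouter/` prefix. A minimal sketch of that normalization (the helper name is hypothetical, not the actual function in the commit):

```python
def normalize_openrouter_model(model_name: str) -> str:
    """Prepend the "openrouter/" prefix litellm uses to route
    requests to OpenRouter, if it is not already present."""
    if not model_name.startswith("openrouter/"):
        return f"openrouter/{model_name}"
    return model_name

# The provider part of the name ("meta-llama/") is kept, since
# OpenRouter expects provider and model in one string.
print(normalize_openrouter_model("meta-llama/llama-3-70b-instruct:nitro"))
# → openrouter/meta-llama/llama-3-70b-instruct:nitro
```

With this in place, a config value like `MODEL_NAME="meta-llama/llama-3-70b-instruct:nitro"` can be passed through unchanged by the user and still reach the right provider.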
I have a similar but different experience with OpenRouter. If the model is not in this list: https://docs.litellm.ai/docs/providers/openrouter it will not work "out of the box" with just the model name:
MODEL_NAME=gpt-4 - works
MODEL_NAME=openai/gpt-4 - works
MODEL_NAME=claude-3-opus - does not work
MODEL_NAME=anthropic/claude-3-opus - does not work
MODEL_NAME=openrouter/anthropic/claude-3-opus - works