OpenRouter API setting does not work
In the section OpenRouter.ai API Settings, I set an API key and model. However, when chatting with a long note, it does not work: the output given by Copilot is empty. I can see three requests to OpenRouter on its website, and the costs are deducted. I also see the following message repeated three times in the console:
```
Failed to calculate number of tokens, falling back to approximate count Error: Unknown model
    at getEncodingNameForModel (plugin:copilot:19507:13)
    at encodingForModel (plugin:copilot:19655:22)
    at ProxyChatOpenAI.getNumTokens (plugin:copilot:20175:36)
    at eval (plugin:copilot:86391:36)
    at Array.map (<anonymous>)
    at ProxyChatOpenAI.getNumTokensFromMessages (plugin:copilot:86389:57)
    at ProxyChatOpenAI.getEstimatedTokenCountFromPrompt (plugin:copilot:86348:30)
    at ProxyChatOpenAI._generate (plugin:copilot:86300:43)
    at async Promise.allSettled (index 0)
    at async ProxyChatOpenAI._generateUncached (plugin:copilot:85542:21)
```
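For context, the warning appears to come from the token-counting path: js-tiktoken only knows official OpenAI model names, so an OpenRouter-style id makes the lookup throw and the count falls back to an approximation. A rough sketch of that behaviour (the model ids and the chars-per-token ratio are illustrative assumptions, not the plugin's actual code):

```ts
import { encodingForModel, TiktokenModel } from "js-tiktoken";

// Count tokens with tiktoken when the model name is known to it,
// otherwise fall back to a crude ~4 chars/token estimate
// (assumption: this mirrors what the plugin's dependency does).
function countTokens(text: string, model: string): number {
  try {
    const enc = encodingForModel(model as TiktokenModel);
    return enc.encode(text).length;
  } catch (e) {
    console.warn(
      "Failed to calculate number of tokens, falling back to approximate count",
      e
    );
    return Math.ceil(text.length / 4);
  }
}

countTokens("hello world", "gpt-3.5-turbo");        // exact count
countTokens("hello world", "openai/gpt-3.5-turbo"); // "Unknown model" internally -> approximate
```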
I also tried to set the following:
However, chatting with a long note did not work with those settings either. In pure chat mode, on the other hand, it works as expected.
From the error, I see "Unknown model". You need to use the exact model name as listed by OpenRouter.
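For anyone hitting the same thing: OpenRouter model ids are vendor-prefixed (e.g. openai/gpt-3.5-turbo or anthropic/claude-2), not the bare OpenAI names. A quick way to verify an id outside the plugin is to call the OpenRouter endpoint directly; this is just a sanity-check sketch, and the model id is only an example:

```ts
// Sanity-check an OpenRouter model id directly against the API.
const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "openai/gpt-3.5-turbo", // vendor-prefixed id, exactly as listed on openrouter.ai/models
    messages: [{ role: "user", content: "ping" }],
  }),
});
console.log((await res.json()).choices?.[0]?.message?.content);
```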
I tested OpenRouter via the OpenAI Proxy Base URL as well, and it works for me. Note that whenever you use this, you must put your OpenRouter API key in the OpenAI API Key field and pick any OpenAI model from the model picker as a placeholder (the model actually used is the one set in OpenAI Proxy Model Name).
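If it helps, the same proxy setup can be reproduced with the plain OpenAI SDK by overriding the base URL; the plugin's UI fields map roughly to these options (a minimal sketch under that assumption, not the plugin's actual code):

```ts
import OpenAI from "openai";

// OpenRouter speaks the OpenAI wire protocol, so pointing the SDK's
// baseURL at it corresponds to the plugin's "OpenAI Proxy Base URL" setting.
const client = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1", // OpenAI Proxy Base URL
  apiKey: process.env.OPENROUTER_API_KEY,  // goes in the OpenAI API Key field
});

const completion = await client.chat.completions.create({
  model: "openai/gpt-3.5-turbo", // the OpenAI Proxy Model Name; the picker value is only a placeholder
  messages: [{ role: "user", content: "ping" }],
});
console.log(completion.choices[0].message.content);
```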