@effect/ai-openai schema incorrect for some OpenRouter models
What version of Effect is running?
3.14.6
What steps can reproduce the bug?
// Set apiUrl to 'https://openrouter.ai/api/v1'
yield* openAI.client.createChatCompletion({
  model: 'deepseek/deepseek-r1-distill-llama-70b:free', // model from OpenRouter
  messages: [
    {
      role: 'user',
      content: "Why isn't this working?",
    },
  ],
});
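For reference, a self-contained version of the reproduction might look roughly like the sketch below. This is a sketch under assumptions, not verbatim from the report: it assumes the client layer accepts an apiUrl option and that the key comes from a hypothetical OPENROUTER_API_KEY environment variable.

import { OpenAiClient } from '@effect/ai-openai';
import { FetchHttpClient } from '@effect/platform';
import { Config, Effect, Layer } from 'effect';

// Point the OpenAI client at OpenRouter's OpenAI-compatible endpoint
// (assumes layerConfig accepts an apiUrl option).
const OpenRouterClientLayer = OpenAiClient.layerConfig({
  apiKey: Config.redacted('OPENROUTER_API_KEY'), // hypothetical env variable
  apiUrl: Config.succeed('https://openrouter.ai/api/v1'),
}).pipe(Layer.provide(FetchHttpClient.layer));

const program = Effect.gen(function* () {
  const openAI = yield* OpenAiClient.OpenAiClient;
  // The raw HTTP response is decoded against the generated OpenAI
  // schema; that decode step is where the failure below is raised.
  return yield* openAI.client.createChatCompletion({
    model: 'deepseek/deepseek-r1-distill-llama-70b:free',
    messages: [{ role: 'user', content: "Why isn't this working?" }],
  });
});

Effect.runPromise(program.pipe(Effect.provide(OpenRouterClientLayer)));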
What is the expected behavior?
The effect should succeed with a chat completion response.
What do you see instead?
Struct (Encoded side) <-> Struct (Type side)
└─ Encoded side transformation failure
   └─ Struct (Encoded side)
      └─ ["choices"]
         └─ ReadonlyArray<{
               readonly finish_reason: "stop" | "length" | "tool_calls" | "content_filter" | "function_call";
               readonly index: Int;
               readonly message: (Struct (Encoded side) <-> Struct (Type side));
               readonly logprobs: {
                  readonly content: ReadonlyArray<{
                     readonly token: string;
                     readonly logprob: number;
                     readonly bytes: ReadonlyArray<Int> | null;
                     readonly top_logprobs: ReadonlyArray<{
                        readonly token: string;
                        readonly logprob: number;
                        readonly bytes: ReadonlyArray<Int> | null;
                     }>
                  }> | null;
                  readonly refusal: ReadonlyArray<{
                     readonly token: string;
                     readonly logprob: number;
                     readonly bytes: ReadonlyArray<Int> | null;
                     readonly top_logprobs: ReadonlyArray<{
                        readonly token: string;
                        readonly logprob: number;
                        readonly bytes: ReadonlyArray<Int> | null;
                     }>
                  }> | null;
               } | null;
            }>
            └─ [0]
               └─ {
                     readonly finish_reason: "stop" | "length" | "tool_calls" | "content_filter" | "function_call";
                     readonly index: Int;
                     readonly message: (Struct (Encoded side) <-> Struct (Type side));
                     readonly logprobs: {
                        readonly content: ReadonlyArray<{
                           readonly token: string;
                           readonly logprob: number;
                           readonly bytes: ReadonlyArray<Int> | null;
                           readonly top_logprobs: ReadonlyArray<{
                              readonly token: string;
                              readonly logprob: number;
                              readonly bytes: ReadonlyArray<Int> | null;
                           }>
                        }> | null;
                        readonly refusal: ReadonlyArray<{
                           readonly token: string;
                           readonly logprob: number;
                           readonly bytes: ReadonlyArray<Int> | null;
                           readonly top_logprobs: ReadonlyArray<{
                              readonly token: string;
                              readonly logprob: number;
                              readonly bytes: ReadonlyArray<Int> | null;
                           }>
                        }> | null;
                     } | null;
                  }
                  └─ ["logprobs"]
                     └─ {
                           readonly content: ReadonlyArray<{
                              readonly token: string;
                              readonly logprob: number;
                              readonly bytes: ReadonlyArray<Int> | null;
                              readonly top_logprobs: ReadonlyArray<{
                                 readonly token: string;
                                 readonly logprob: number;
                                 readonly bytes: ReadonlyArray<Int> | null;
                              }>
                           }> | null;
                           readonly refusal: ReadonlyArray<{
                              readonly token: string;
                              readonly logprob: number;
                              readonly bytes: ReadonlyArray<Int> | null;
                              readonly top_logprobs: ReadonlyArray<{
                                 readonly token: string;
                                 readonly logprob: number;
                                 readonly bytes: ReadonlyArray<Int> | null;
                              }>
                           }> | null;
                        } | null
                        ├─ Expected { readonly content: ...; readonly refusal: ...; }, actual -0.0000017881393
                        └─ Expected null, actual -0.0000017881393
Additional information
For some OpenRouter models, logprobs is returned as a number instead of an object.
Also, some models respond with no id, which still matches the Response schema in the OpenRouter docs (https://openrouter.ai/docs/api-reference/chat-completion).
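To illustrate the mismatch (payloads reconstructed from the error above, not captured verbatim):

// What some OpenRouter models appear to send back for a choice:
const openRouterChoice = {
  finish_reason: 'stop',
  index: 0,
  message: { role: 'assistant', content: '...' },
  logprobs: -0.0000017881393, // a bare number
};

// What the OpenAI response schema expects for the same field:
const openAiChoice = {
  finish_reason: 'stop',
  index: 0,
  message: { role: 'assistant', content: '...' },
  logprobs: null, // or an object with content / refusal arrays
};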
Does OpenRouter claim to have compatibility with the OpenAI API? Using the OpenAI provider integration implies that the shape of the response conforms to the OpenAPI specification for the OpenAI API.
I'm just assuming, but the openai package probably works because it does not perform any validation on the shape of the response like we do.
We would probably need to create a separate provider integration for OpenRouter if they do not have OpenAI compatibility.
For example, in the OpenAI OpenAPI specification, logprobs is defined as a nullable object (never a bare number), and id is a required property.
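Roughly, the generated schema constrains the field like this (a simplified sketch for illustration, not the actual generated code, which also covers top_logprobs and more):

import { Schema } from 'effect';

// Simplified sketch of the logprobs constraint in the generated schema.
const TokenLogprob = Schema.Struct({
  token: Schema.String,
  logprob: Schema.Number,
  bytes: Schema.NullOr(Schema.Array(Schema.Int)),
});

const ChoiceLogprobs = Schema.NullOr(
  Schema.Struct({
    content: Schema.NullOr(Schema.Array(TokenLogprob)),
    refusal: Schema.NullOr(Schema.Array(TokenLogprob)),
  })
);

// A bare number such as -0.0000017881393 matches neither branch of
// the union, which produces the decode error shown above.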
Yes, the openai package probably does not perform any validation. I tried a DeepSeek model (not through OpenRouter) and got the same error when using it with @effect/ai-openai.
One possible solution would be the ability to turn schema validation on or off, or a separate integration for OpenRouter.
Turning off schema validation would not be a solution; it would simply make the client unsafe to use, since the static type would differ from the runtime value.
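Concretely, if decoding were skipped, the static types would no longer describe the runtime value. A hypothetical example:

// The static type would still claim an object (or null)...
declare const choice: {
  logprobs: { content: unknown } | null;
};

// ...while the runtime value could be -0.0000017881393. This
// type-checks, but at runtime it reads a property off a number
// and silently yields undefined:
if (choice.logprobs !== null) {
  choice.logprobs.content;
}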
This is a problem with OpenRouter. They had the same issue with Gemini models and I pointed it out, and they fixed it.
They should be 100% OpenAI compatible.
Go to their Discord and complain.
Closing, as we now have an OpenRouter provider package.