
@effect/ai-openai schema incorrect for some OpenRouter models

Open denishsharma opened this issue 8 months ago • 4 comments

What version of Effect is running?

3.14.6

What steps can reproduce the bug?

// Set apiUrl to 'https://openrouter.ai/api/v1'

yield* openAI.client.createChatCompletion({
  model: 'deepseek/deepseek-r1-distill-llama-70b:free', // model from OpenRouter
  messages: [
    {
      role: 'user',
      content: "Why isn't this working?",
    },
  ],
});

What is the expected behavior?

The effect should succeed with a chat completion response.

What do you see instead?

Struct (Encoded side) <-> Struct (Type side)
└─ Encoded side transformation failure
   └─ Struct (Encoded side)
      └─ ["choices"]
         └─ ReadonlyArray<{
              readonly finish_reason: "stop" | "length" | "tool_calls" | "content_filter" | "function_call";
              readonly index: Int;
              readonly message: (Struct (Encoded side) <-> Struct (Type side));
              readonly logprobs: {
                readonly content: ReadonlyArray<{
                  readonly token: string;
                  readonly logprob: number;
                  readonly bytes: ReadonlyArray<Int> | null;
                  readonly top_logprobs: ReadonlyArray<{
                    readonly token: string;
                    readonly logprob: number;
                    readonly bytes: ReadonlyArray<Int> | null;
                  }>
                }> | null;
                readonly refusal: ReadonlyArray<{
                  readonly token: string;
                  readonly logprob: number;
                  readonly bytes: ReadonlyArray<Int> | null;
                  readonly top_logprobs: ReadonlyArray<{
                    readonly token: string;
                    readonly logprob: number;
                    readonly bytes: ReadonlyArray<Int> | null;
                  }>
                }> | null;
              } | null;
            }>
            └─ [0]
               └─ {
                    readonly finish_reason: "stop" | "length" | "tool_calls" | "content_filter" | "function_call";
                    readonly index: Int;
                    readonly message: (Struct (Encoded side) <-> Struct (Type side));
                    readonly logprobs: {
                      readonly content: ReadonlyArray<{
                        readonly token: string;
                        readonly logprob: number;
                        readonly bytes: ReadonlyArray<Int> | null;
                        readonly top_logprobs: ReadonlyArray<{
                          readonly token: string;
                          readonly logprob: number;
                          readonly bytes: ReadonlyArray<Int> | null;
                        }>
                      }> | null;
                      readonly refusal: ReadonlyArray<{
                        readonly token: string;
                        readonly logprob: number;
                        readonly bytes: ReadonlyArray<Int> | null;
                        readonly top_logprobs: ReadonlyArray<{
                          readonly token: string;
                          readonly logprob: number;
                          readonly bytes: ReadonlyArray<Int> | null;
                        }>
                      }> | null;
                    } | null;
                  }
                  └─ ["logprobs"]
                     └─ {
                          readonly content: ReadonlyArray<{
                            readonly token: string;
                            readonly logprob: number;
                            readonly bytes: ReadonlyArray<Int> | null;
                            readonly top_logprobs: ReadonlyArray<{
                              readonly token: string;
                              readonly logprob: number;
                              readonly bytes: ReadonlyArray<Int> | null;
                            }>
                          }> | null;
                          readonly refusal: ReadonlyArray<{
                            readonly token: string;
                            readonly logprob: number;
                            readonly bytes: ReadonlyArray<Int> | null;
                            readonly top_logprobs: ReadonlyArray<{
                              readonly token: string;
                              readonly logprob: number;
                              readonly bytes: ReadonlyArray<Int> | null;
                            }>
                          }> | null;
                        } | null
                        ├─ Expected {
                            readonly content: ...;
                            readonly refusal: ...;
                          }, actual -0.0000017881393
                        └─ Expected null, actual -0.0000017881393

Additional information

For some OpenRouter models, `logprobs` is returned as a bare number rather than an object or null.

Also, some models respond with no `id` field, which still matches the Response schema from the OpenRouter docs (https://openrouter.ai/docs/api-reference/chat-completion).
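
A minimal sketch of the mismatch, using hypothetical types (`Choice`, `normalizeChoice` are illustrative, not part of @effect/ai-openai): some OpenRouter responses put a bare number where the OpenAI spec expects a `logprobs` object or null, and a client-side normalizer could coerce the deviant value before decoding.

```typescript
// Shapes loosely modeled on the OpenAI chat completion response.
type Logprobs = { content: unknown[] | null; refusal: unknown[] | null };

interface Choice {
  finish_reason: string;
  index: number;
  // `number` is the OpenRouter deviation seen in the error above.
  logprobs: Logprobs | number | null;
}

// Coerce a non-conforming bare number to null so an OpenAI-shaped
// decoder can proceed. This is a workaround sketch, not a fix.
function normalizeChoice(choice: Choice): Choice {
  if (typeof choice.logprobs === "number") {
    return { ...choice, logprobs: null };
  }
  return choice;
}

// The value from the reported failure: -0.0000017881393 where an
// object or null was expected.
const bad: Choice = {
  finish_reason: "stop",
  index: 0,
  logprobs: -0.0000017881393,
};
console.log(normalizeChoice(bad).logprobs); // null
```

Note this silently discards the (non-spec) logprob value; a proper fix belongs in OpenRouter or in a dedicated provider integration.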


denishsharma avatar Apr 04 '25 18:04 denishsharma

Does OpenRouter claim to have compatibility with the OpenAI API? Using the OpenAI provider integration implies that the shape of the response conforms to the OpenAPI specification for the OpenAI API.

I'm just assuming, but the openai package probably works because it does not perform any validation on the shape of the response like we do.

We would probably need to create a separate provider integration for OpenRouter if they do not have OpenAI compatibility.

IMax153 avatar Apr 04 '25 22:04 IMax153

For example, from the OpenAI OpenAPI specification:

logprobs:

[screenshot of the `logprobs` schema from the OpenAI OpenAPI specification]

and id is a required property.

IMax153 avatar Apr 04 '25 22:04 IMax153

> Does OpenRouter claim to have compatibility with the OpenAI API? Using the OpenAI provider integration implies that the shape of the response conforms to the OpenAPI specification for the OpenAI API.
>
> I'm just assuming, but the openai package probably works because it does not perform any validation on the shape of the response like we do.
>
> We would probably need to create a separate provider integration for OpenRouter if they do not have OpenAI compatibility.

Yes, the openai package probably does not perform any validation. I tried a DeepSeek model (not via OpenRouter) and got the same error when using @effect/ai-openai.

One possible solution would be the ability to toggle schema validation on/off, or a separate integration for OpenRouter.

denishsharma avatar Apr 05 '25 12:04 denishsharma

Turning off schema validation would not be a solution; it would simply make the client unsafe to use, since the static type would differ from the runtime value.
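
The unsoundness can be sketched in plain TypeScript (the names here are illustrative): if validation is skipped and the raw value is just cast to the declared type, the compiler is satisfied but the runtime value is still a number, so downstream code silently misbehaves instead of failing at the decode boundary.

```typescript
// Shape the static type promises, loosely following the OpenAI spec.
interface TokenLogprob { token: string; logprob: number }
interface Logprobs { content: TokenLogprob[] | null }

// Pretend validation was turned off: cast the raw runtime value
// (the bare number from the reported error) to the typed shape.
const raw: unknown = -0.0000017881393;
const logprobs = raw as Logprobs | null; // lies to the compiler

// This type-checks, but at runtime `content` is undefined on a
// number, so we silently fall back to [] and lose the data.
const tokens = logprobs === null ? [] : logprobs.content ?? [];
console.log(Array.isArray(tokens)); // true, but only by accident
```

This is exactly the failure mode schema validation exists to surface early.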

mikearnaldi avatar Apr 08 '25 08:04 mikearnaldi

This is a problem with OpenRouter. They had the same issue with Gemini models and I pointed it out, and they fixed it.

They should be 100% OpenAI compatible.

Go to their Discord and complain.

afonsomatos avatar Aug 06 '25 05:08 afonsomatos

Closing as we now have an openrouter provider package.

IMax153 avatar Nov 26 '25 14:11 IMax153