
Improve Workers AI types

Open Cherry opened this issue 1 year ago • 7 comments

Today, the Workers AI types rely heavily on function overloads to specify arguments for different models. This unfortunately results in types that are very difficult to debug, and poor DX.

As an example with a more simplified non-AI class: https://tsplay.dev/Nan9ym

During development, you get very little assistance from auto-complete, and the errors you get back are extremely complex and hard to parse:

No overload matches this call.
  Overload 1 of 3, '(fruit: "apple" | "orange", color: "apple" | "orange", options: { foo: boolean; bar: boolean; }): Promise<void>', gave the following error.
    Argument of type '"lemon"' is not assignable to parameter of type '"apple" | "orange"'.
  Overload 2 of 3, '(fruit: "tomato" | "pear", color: "tomato" | "pear", options: { foo: string; bar?: boolean | undefined; }): Promise<void>', gave the following error.
    Argument of type '"lemon"' is not assignable to parameter of type '"tomato" | "pear"'.
  Overload 3 of 3, '(fruit: "lemon" | "lime", color: "lemon" | "lime", options: { foo?: string | undefined; bar: number; somethingElse: boolean; }): Promise<void>', gave the following error.
    Type 'null' is not assignable to type 'number'.
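A minimal sketch of how this kind of error arises (a hypothetical reconstruction, not the actual workers-types code): several overloads keyed on unions of literal names, so any mismatch makes the compiler report a failure against every overload.

```typescript
// Hypothetical reconstruction: overloads keyed by literal-union "fruit"
// names, mirroring how AI.run is keyed by model-name unions.
function pick(fruit: "apple" | "orange", options: { foo: boolean; bar: boolean }): string;
function pick(fruit: "tomato" | "pear", options: { foo: string; bar?: boolean }): string;
function pick(fruit: "lemon" | "lime", options: { foo?: string; bar: number }): string;
function pick(fruit: string, options: Record<string, unknown>): string {
  return `${fruit}:${JSON.stringify(options)}`;
}

// A valid call resolves against exactly one overload:
const ok = pick("lemon", { bar: 1 });

// pick("lemon", { bar: null });
// ^ fails with "No overload matches this call" and reports an error
//   for *every* overload, not just the one the caller intended.
```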

Let's use some real AI examples. These can all be found in https://tsplay.dev/wQB41N - scroll down to line 255 for examples.

Invalid model names

AI.run('@hf/thebloke/neural-chat-7b-v3-awq', {
    messages: [
        {
            role: "system",
            content: `[very large system prompt]`
        },
        {
            role: "user",
            content: `How many pounds of food does a person eat in a year?`
        }
    ],
    stream: false
});

This errors with:

No overload matches this call.
  The last overload gave the following error.
    Argument of type '"@hf/thebloke/neural-chat-7b-v3-awq"' is not assignable to parameter of type '"@cf/unum/uform-gen2-qwen-500m" | "@cf/llava-hf/llava-1.5-7b-hf"'.

This is because I typo'd the model name and it should be @hf/thebloke/neural-chat-7b-v3-1-awq.

Bad inputs

AI.run('@hf/thebloke/neural-chat-7b-v3-1-awq', {
    message: [
        {
            role: "system",
            content: `[very large system prompt]`
        },
        {
            role: "user",
            content: `How many pounds of food does a person eat in a year?`
        }
    ],
    stream: false
});

Here I've typo'd messages to message, but the error is still just:

No overload matches this call.
  The last overload gave the following error.
    Argument of type '"@hf/thebloke/neural-chat-7b-v3-1-awq"' is not assignable to parameter of type '"@cf/unum/uform-gen2-qwen-500m" | "@cf/llava-hf/llava-1.5-7b-hf"'.

Bad options

AI.run('@hf/thebloke/neural-chat-7b-v3-1-awq', {
    messages: [
        {
            role: "system",
            content: `[very large system prompt]`
        },
        {
            role: "user",
            content: `How many pounds of food does a person eat in a year?`
        }
    ],
    stream: false
}, {
    extraHeaders: true
});

Here I mis-typed extraHeaders to be a boolean instead of an object, and my error is still:

No overload matches this call.
  The last overload gave the following error.
    Argument of type '"@hf/thebloke/neural-chat-7b-v3-1-awq"' is not assignable to parameter of type '"@cf/unum/uform-gen2-qwen-500m" | "@cf/llava-hf/llava-1.5-7b-hf"'.

Essentially, due to the overload complexity here, the errors these types surface to users just aren't helpful. People are regularly confused by this in the Discord.

Cherry avatar May 29 '24 01:05 Cherry

I'm also having the same issue:


	const answer = await c.env.AI.run("@cf/meta/llama-3-8b-instruct", {
		messages: [{ role: "user", content }],
	});

shows the following error

No overload matches this call.
  The last overload gave the following error.
    Argument of type '"@cf/meta/llama-3-8b-instruct"' is not assignable to parameter of type 'BaseAiImageToTextModels'

RayyanNafees avatar Jul 09 '24 14:07 RayyanNafees

What's a good way to fix this? Without completely changing the function overloading that's going on?

Dhravya avatar Jul 24 '24 16:07 Dhravya

Function overloading is always a pain, but something like:

env.AI.run<"text">(...)

and then pinning the arguments to "text" models, "image" models, etc. might make the types a little easier to pin down, and remain backwards compatible if the generic is optional. @DaniFoldi might have some other ideas - he's more of a TS wizard than me.
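A rough sketch of that idea (all type and model names here are illustrative, not the actual workers-types API): an optional category generic scopes both the model name and the input shape to one family, so errors and completions only mention that family.

```typescript
// Hypothetical sketch: scope model names and inputs by category via an
// optional generic, instead of one overload per model family.
type ModelsByCategory = {
  text: "@cf/meta/llama-3-8b-instruct" | "@hf/thebloke/neural-chat-7b-v3-1-awq";
  image: "@cf/unum/uform-gen2-qwen-500m" | "@cf/llava-hf/llava-1.5-7b-hf";
};

type InputsByCategory = {
  text: { messages: { role: string; content: string }[]; stream?: boolean };
  image: { image: number[]; prompt?: string };
};

function run<C extends keyof ModelsByCategory = keyof ModelsByCategory>(
  model: ModelsByCategory[C],
  inputs: InputsByCategory[C],
): string {
  return model; // stand-in for the real binding call
}

// With the generic pinned, a typo'd model or a bad `messages` key now
// produces an error scoped to text models only:
const used = run<"text">("@cf/meta/llama-3-8b-instruct", {
  messages: [{ role: "user", content: "hi" }],
});
```

Leaving the generic off defaults `C` to the union of all categories, which keeps existing untyped calls compiling.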

Cherry avatar Jul 24 '24 16:07 Cherry

Hmm, my first guess would be giving NoInfer a go, but unfortunately it is relatively new (5.4) and I'm pretty sure workers-types aims to support 4.8+, and it is declared as type NoInfer<T> = intrinsic in lib.es5, so it's out.
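For reference, the NoInfer approach would look roughly like this (requires TypeScript 5.4+; the model names and input shapes below are illustrative): the model parameter is the only inference site, so a malformed inputs object can't redirect which signature the compiler picks.

```typescript
// Hypothetical sketch of the NoInfer idea: infer M only from the model
// argument, so errors point at the inputs object rather than the model name.
type Model = "@cf/meta/llama-3-8b-instruct" | "@cf/unum/uform-gen2-qwen-500m";

function runModel<M extends Model>(
  model: M,
  inputs: NoInfer<
    M extends "@cf/meta/llama-3-8b-instruct"
      ? { messages: { role: string; content: string }[] }
      : { image: number[] }
  >,
): M {
  return model; // stand-in for the real binding call
}

const m = runModel("@cf/meta/llama-3-8b-instruct", {
  messages: [{ role: "user", content: "hi" }],
});
```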

I'll try to come up with an example, but our past discussion with James was about the const generic parameter he mentioned above, so models are categorized by goals/options.

DaniFoldi avatar Jul 25 '24 17:07 DaniFoldi

we used generics to fix this, I'll quickly test and send a PR soon!

Dhravya avatar Jul 25 '24 17:07 Dhravya

@Dhravya is this on a branch you could point us to? curious what your solution looks like!

brettimus avatar Aug 01 '24 15:08 brettimus

Any updates on this issue? If the fix looks like it's still a ways out, could be good to update docs somehow

brettimus avatar Aug 28 '24 06:08 brettimus

The simplest way I got around this issue was to give my messages a type.

I had something like

  const messages = [
    {
      role: 'system',
      content: `...`,
    },
   ...
  ];
  const model = '@cf/meta/llama-3-8b-instruct';
  const aiResponse = await c.env.AI.run(model, { messages });

Which resulted in this error.

By changing to const messages: RoleScopedChatInput[] = [... the error went away. Is it a perfect solution? No, but it does work in my case.

shawncarr avatar Dec 11 '24 16:12 shawncarr

> By changing to const messages: RoleScopedChatInput[] = [... the error went away.

Likewise if you type the input params object, e.g. as BaseAiTextGeneration["inputs"].
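Concretely, that workaround looks something like this (using a local stand-in interface, since the real BaseAiTextGeneration ships with @cloudflare/workers-types):

```typescript
// Stand-in for the workers-types shape; in a real Worker you'd get
// BaseAiTextGeneration from @cloudflare/workers-types instead.
interface BaseAiTextGeneration {
  inputs: {
    messages?: { role: string; content: string }[];
    prompt?: string;
  };
}

// Annotating the object up front means overload resolution sees the
// intended text-generation input type, instead of a freshly inferred
// object literal that matches no overload cleanly.
const inputs: BaseAiTextGeneration["inputs"] = {
  messages: [{ role: "system", content: "You are terse." }],
};

// const result = await env.AI.run("@cf/meta/llama-3-8b-instruct", inputs);
```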

brettimus avatar Dec 11 '24 18:12 brettimus

Related,

const stream = await this.env.AI.run("@cf/mistralai/mistral-small-3.1-24b-instruct",
  { prompt: text, stream: true, },
);

makes the return type ReadableStream<any>, when at runtime it's a ReadableStream<Uint8Array>.
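Until the types are fixed, one workaround is to assert the element type at the call site and consume the stream as bytes (a sketch; the assertion is only safe because the runtime does yield Uint8Array chunks):

```typescript
// Drain a byte stream into a string; usable with the asserted
// ReadableStream<Uint8Array> from the AI binding.
async function readAll(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let out = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    out += decoder.decode(value, { stream: true });
  }
  return out + decoder.decode();
}

// const stream = (await env.AI.run(model, { prompt, stream: true })) as ReadableStream<Uint8Array>;
// const text = await readAll(stream);
```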

tv42 avatar Jul 09 '25 21:07 tv42