Right now all the functions are dumped into one big object. This can get a bit messy.
## Proposal

Use `ai-actions` to store the tools.
## Benefits

- No huge `functions` object in `render`. Split up tools like you would split up components.
- Only define types once. In the current code, we redefine types for all the components. With `ai-actions` you can use the `ActionInputs` type to infer the input of the action.
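The "define types once" point can be illustrated in plain TypeScript, without assuming any particular library API: a generic factory captures the concrete type of `parameters`, so the tool's input type is written once and inferred everywhere else. All names below (`ToolDef`, `defineTool`, `getEvents`) are hypothetical.

```typescript
// Minimal sketch (no library assumed): a generic factory preserves the
// concrete type of `parameters`, so a tool's input type is declared once
// and inferred at every use site.
type ToolDef<P> = {
  description: string
  parameters: P
  run: (input: P) => string
}

// Identity function whose only job is to capture `P` for inference.
const defineTool = <P>(def: ToolDef<P>): ToolDef<P> => def

const getEvents = defineTool({
  description: 'List events between two dates',
  parameters: { startDate: '', endDate: '' },
  // `input` is inferred as { startDate: string; endDate: string }
  run: input => `events from ${input.startDate} to ${input.endDate}`
})

const result = getEvents.run({ startDate: '2024-01-01', endDate: '2024-01-31' })
console.log(result)
```

The compiler would reject `getEvents.run({ start: '…' })`, which is the kind of checking that is lost when types are redeclared by hand in each file.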
## More Benefits

Although I didn't include it in this PR to keep the code somewhat similar, there are a bunch more benefits that come with `ai-actions`:

- Type-safe few-shot examples
- Custom metadata
- Compatible with `createStreamableUI` with Context
- Output schemas

And a lot more!
## Full Example

I made a full example here showing more of the benefits of `ai-actions`, including using `@` to bring in tools and a basic code interpreter built with them.
https://github.com/IdoPesok/next-ai-rsc-actions
## Conclusion
I totally understand if this doesn't get merged. I do strongly believe that this library adds missing pieces to the AI SDK and does not conflict in any way with how it currently stands. I hope others find value in this like we have.
Hi, full disclosure: I created `ai-actions` for this exact use case. Happy to discuss what a refactor would look like without it.

I quickly wrote some TS code:
```tsx
import { z } from 'zod'
import { nanoid } from 'nanoid'
import { render, getMutableAIState } from 'ai/rsc'
// Local components/helpers (import paths assumed from the template)
import { BotCard, Events, EventsSkeleton } from '@/components/stocks'
import { sleep } from '@/lib/utils'

type TFunctions = Parameters<typeof render>[0]['functions']

const createTool = (aiState: ReturnType<typeof getMutableAIState>) =>
  ({
    description:
      'List funny imaginary events between user highlighted dates that describe stock activity.',
    parameters: z.object({
      events: z.array(
        z.object({
          date: z
            .string()
            .describe('The date of the event, in ISO-8601 format'),
          headline: z.string().describe('The headline of the event'),
          description: z.string().describe('The description of the event')
        })
      )
    }),
    render: async function* ({ events }) {
      // Show a skeleton while the final UI is being prepared
      yield (
        <BotCard>
          <EventsSkeleton />
        </BotCard>
      )

      await sleep(1000)

      // Record the function result in the AI state
      aiState.done({
        ...aiState.get(),
        messages: [
          ...aiState.get().messages,
          {
            id: nanoid(),
            role: 'function',
            name: 'getEvents',
            content: JSON.stringify(events)
          }
        ]
      })

      return (
        <BotCard>
          <Events props={events} />
        </BotCard>
      )
    }
  }) as const satisfies NonNullable<TFunctions>[string]

// also tried
// }) as NonNullable<TFunctions>[string]
```
To move the tools to other files, they would need to be returned from factory functions so we can pass the `aiState` into the `render` callback's scope. We would then need to cast each tool to the value type of the `functions` argument. This casting is pretty janky ATM.

The problem is that we now lose type safety for the parameters (which is the whole benefit of `render` in the first place). I have tried many things to preserve the type safety, but I could not yet figure out how to keep it with the current types of the `render` param in the Vercel AI SDK. Not saying it's impossible; maybe I haven't given it enough thought yet.
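To make the loss of type safety concrete, here is a minimal sketch using assumed type shapes (these are illustrative only, not the actual AI SDK types): once a tool is cast to a generic "any tool" shape, the compiler forgets its specific parameter type.

```typescript
// Assumed shapes for illustration only; not the real AI SDK types.
// A tool cast to the generic "any tool" shape no longer knows its own
// parameter names, so the compiler stops checking call sites.
type AnyTool = {
  description: string
  render: (args: Record<string, unknown>) => string
}

const getEvents = {
  description: 'List events',
  render: (args: { events: string[] }) => args.events.join(', ')
  // The double cast compiles, but erases the `{ events: string[] }` type.
} as unknown as AnyTool

// This happens to work at runtime, but the compiler would also have
// accepted a typo like `{ evnts: [...] }` without complaint.
const output = getEvents.render({ events: ['earnings call', 'stock split'] })
console.log(output)
```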
We would also have to write `createTool(aiState)`, `createTool(aiState)`, etc., which IMO is not that clean.
The benefit of `ai-actions` is that we keep the type safety with the refactor.
> Hi, full disclosure: I created `ai-actions` for this exact use case. Happy to discuss what a refactor would look like without it. […] The benefit of `ai-actions` is that we keep the type safety with the refactor.
I understand what you mean, but you likely have a better chance getting the library merged as part of the Vercel AI SDK than into this template.
> Hi, full disclosure: I created `ai-actions` for this exact use case. […] The benefit of `ai-actions` is that we keep the type safety with the refactor.
https://github.com/vercel/ai-chatbot/pull/313
Hey there! Thank you for the suggestion. While we aren't going to merge this, we appreciate you opening the PR and providing all of this context 😄