Lars Grammel

105 comments by Lars Grammel

@camwest Do you have a GitHub repository with a reproduction that you could share?

Currently, this is not supported. I'm exploring options for adding core tool call and core tool result support to `useChat`.

In-progress PR: https://github.com/vercel/ai/pull/1514. This will be a larger change; I expect it to take a few days.

Thanks! You could try using the legacy providers, but then you'd need to refactor once this lands.

@d-ivashchuk Yes, this should be possible. You could define tools without an execute method, handle the tool calls on the client to add information to an array (or alternatively handle...
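A minimal sketch of the client-side pattern described above. The tool-call shape mirrors the AI SDK's `{ toolCallId, toolName, args }` structure, but the `addToList` tool name, the `handleToolCall` function, and the `collectedItems` array are all hypothetical names for illustration:

```typescript
// Assumed shape of a tool call streamed to the client
// (mirrors the AI SDK's { toolCallId, toolName, args } structure).
interface ToolCall {
  toolCallId: string;
  toolName: string;
  args: Record<string, unknown>;
}

// Client-side store that collects information from tool calls.
const collectedItems: Array<Record<string, unknown>> = [];

// Handler for calls to tools defined WITHOUT an execute method;
// such calls are forwarded to the client instead of running on the server.
function handleToolCall(toolCall: ToolCall): void {
  if (toolCall.toolName === 'addToList') {
    // Append the tool's arguments to the client-side array.
    collectedItems.push(toolCall.args);
  }
}
```

In a `useChat`-based app, a handler like this would be wired up where the client receives tool calls (e.g. an `onToolCall`-style callback), rather than being invoked manually.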

@d-ivashchuk Yes, I think this is a great use case for tools. Do you need to call the LLM with the tool results, or are they just needed to display...

@d-ivashchuk I have implemented a slightly different version that focuses on user interactions as client-side tools. I have the sense that you might need something different, e.g. stream data, for...

Are you using ES modules or CommonJS? Would you mind sharing your `tsconfig.json`?

You could implement a custom provider (see https://sdk.vercel.ai/providers/community-providers/custom-providers) that directly calls WebLLM. The llamacpp community provider does something similar (in-process communication), I think.
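A rough sketch of the in-process idea, under heavy assumptions: the `SimpleLanguageModel` interface below is a simplified stand-in for the AI SDK's language model interface (not the real one), and `runLocalEngine` is a hypothetical placeholder for the actual WebLLM call:

```typescript
// Simplified stand-in for the AI SDK's language model interface;
// a real custom provider would implement the SDK's actual interface.
interface SimpleLanguageModel {
  doGenerate(options: { prompt: string }): Promise<{ text: string }>;
}

// Hypothetical in-process engine call; a real provider would invoke
// WebLLM here instead of an HTTP API.
async function runLocalEngine(prompt: string): Promise<string> {
  return `echo: ${prompt}`; // placeholder for the actual model output
}

const webLlmModel: SimpleLanguageModel = {
  async doGenerate({ prompt }) {
    // In-process call, no server round trip -- similar in spirit to
    // the llamacpp community provider.
    const text = await runLocalEngine(prompt);
    return { text };
  },
};
```

The key point is that the provider abstraction doesn't care whether generation happens over the network or in the same process.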

AI SDK Core could also be used on the client side. That said, you are right, `useChat` etc. require a server connection. What you could do is use e.g. AI...