
Support client-only use of chat library

Open jeloi opened this issue 2 years ago • 5 comments

Some applications may not want to proxy all LLM calls through a backend server, but the library currently requires this.

Specifically, the useChat hook in React assumes a fetch call to a server endpoint that returns a specific response type.

One option we've attempted is passing the remote URL for the model provider's API directly into the useChat config. However, this falls short because the decoding step is still skipped (it is assumed to happen on the server).

It would be very nice if the same abstractions around models could be used on the client, e.g. passing in a config to produce a given stream (a hypothetical sketch follows).
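For illustration only, here is a rough sketch of what that could look like; streamProvider and createChatStream are invented names for this sketch and are not part of the library:

const { messages, input, handleInputChange, handleSubmit } = useChat({
    // Hypothetical option: produce the stream on the client instead of
    // fetching it from a server route.
    streamProvider: async (messages) => {
      // e.g. call a provider API or an in-browser model and return a ReadableStream
      return createChatStream({ model: 'gpt-3.5-turbo', messages })
    },
  })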

Thank you for building this!

jeloi avatar Jun 23 '23 02:06 jeloi

It's against the idea of the project, and in general you need a backend so you don't leak API keys. It would also add an extra layer of complexity, as we would need local storage/login capabilities.

rozenstein94 avatar Jun 25 '23 21:06 rozenstein94

Thanks for the request @jeloi, but for the reasons @rozenstein94 shared I'm going to close this as out-of-scope.

MaxLeiter avatar Jun 26 '23 23:06 MaxLeiter

Would love this as well! Perhaps this would only require some sort of middleware that lets us parse the response.

Got this far using:

const { messages, input, handleInputChange, handleSubmit } = useChat({
    headers: {
      'Content-Type': 'application/json',
      // Note: NEXT_PUBLIC_ env vars are inlined into the client bundle,
      // so this exposes the API key to anyone using the app.
      Authorization: `Bearer ${process.env.NEXT_PUBLIC_OPENAI_API_KEY}`,
    },
    // Point the hook directly at OpenAI instead of a backend route.
    api: 'https://api.openai.com/v1/chat/completions',
    body: {
      model: 'gpt-3.5-turbo',
      stream: true,
    },
  })
[Screenshot attached: 2023-07-05, 11:02 AM]
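For context on why this only gets partway there: OpenAI's endpoint streams Server-Sent Events frames ("data: {...}"), while useChat expects an already-decoded token stream. A rough sketch of the parsing step that is missing (assuming the byte stream has already been piped through a TextDecoderStream, and that frames arrive whole; a robust version would buffer partial frames):

// Transforms OpenAI SSE frames into plain text tokens.
const sseToText = new TransformStream<string, string>({
  transform(chunk, controller) {
    for (const line of chunk.split('\n')) {
      // Skip non-data lines and the terminating [DONE] sentinel.
      if (!line.startsWith('data: ') || line === 'data: [DONE]') continue
      const delta = JSON.parse(line.slice('data: '.length)).choices[0]?.delta?.content
      if (delta) controller.enqueue(delta)
    }
  },
})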

atgctg avatar Jul 05 '23 09:07 atgctg

I understand it's not the intended use of the library, but wouldn't client-side-only functionality be a good fit for use with a local ollama instance? How do you recommend leveraging that with the library now?

marjoweb avatar Jul 27 '24 07:07 marjoweb

There are use cases for being able to use this without a server fetch, e.g. client-side usage with models that run in the browser, or usage in RSCs (React Server Components).

For now, you can pass a custom fetch function to replace the default fetch behavior with a mock or client-only implementation, as sketched below.
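A rough sketch of that workaround; generateClientSideStream is a hypothetical stand-in for whatever produces the stream on the client, and the plain-text Content-Type assumes a hook version that decodes a plain text stream (the expected stream format varies by SDK version):

const { messages, input, handleInputChange, handleSubmit } = useChat({
    // Replace the network call entirely: build the Response on the client.
    fetch: async (_input, init) => {
      // useChat sends the conversation as a JSON body.
      const { messages } = JSON.parse(init!.body as string)
      const stream = await generateClientSideStream(messages) // hypothetical
      return new Response(stream, {
        headers: { 'Content-Type': 'text/plain; charset=utf-8' },
      })
    },
  })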

lgrammel avatar Jul 28 '24 07:07 lgrammel