
Utilise Dalai API for Local Llama/Alpaca AI

matteoveglia opened this issue on Mar 19, 2023 · 2 comments

Hi, seeing as this is a self-hosted project anyway, it would be great if it could also connect to the dalai service created by this project: https://github.com/cocktailpeanut/dalai

The project is designed specifically for interacting with the Llama and Alpaca models, in all sizes. It has its own web interface, but that's more for testing than for actual production use the way ai-chat-app is.

matteoveglia · Mar 19 '23

Thanks for sharing the tool; I didn't know about it.

We could handle multiple AI APIs; I need to think of the best way to do it, though 🤔.

Maybe a configuration view, where you can specify which AI APIs you want to use and the model parameters for each query. However, the app itself should not host the model, since it's a UI for interfacing with AI APIs. So you would run the Llama model in a separate service and query it via the app.
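
For illustration, a provider entry in such a configuration view might look roughly like this. Every name here is hypothetical, just a sketch of the idea rather than any existing fuseai config:

```typescript
// All names here are hypothetical; this is a sketch of the idea,
// not an existing fuseai/ai-chat-app configuration format.
interface AiProviderConfig {
  name: string;                    // e.g. "openai", "dalai"
  endpoint: string;                // URL of the externally hosted service
  transport: "http" | "socketio";  // how the app talks to it
  model: string;                   // e.g. "gpt-3.5-turbo", "alpaca.7B"
  params?: {                       // per-query model parameters
    temperature?: number;
    maxTokens?: number;
  };
}

// The app would only store these entries and route queries to them;
// the models themselves run in their own services.
const providers: AiProviderConfig[] = [
  {
    name: "openai",
    endpoint: "https://api.openai.com/v1",
    transport: "http",
    model: "gpt-3.5-turbo",
    params: { temperature: 0.7 },
  },
  {
    name: "dalai",
    endpoint: "http://localhost:3000",
    transport: "socketio",
    model: "alpaca.7B",
    params: { temperature: 0.8 },
  },
];
```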

What do you think?

jimzer · Mar 21 '23

Yeah, absolutely; both this project and Dalai would remain separate. I think treating your app as a 'model-agnostic' portal to AI models would be the best approach.

The dalai tool allows interfacing through a socket.io connection, and it looks like you can do quite a bit with that, so I think that's the best route.
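
For reference, a minimal connection sketch in TypeScript using socket.io-client. The port and the "request"/"result" event names are assumptions based on how dalai's bundled web UI appears to work, so they'd need verifying against the dalai docs:

```typescript
import { io } from "socket.io-client";

// Connect to a locally running `dalai serve` instance. The port and
// the "request"/"result" event names below are assumptions; check
// the dalai docs for the exact protocol before building on this.
const socket = io("http://localhost:3000");

socket.on("connect", () => {
  socket.emit("request", {
    model: "alpaca.7B",
    prompt: "Summarise what self-hosting means in one sentence.",
  });
});

// If dalai streams tokens back, this handler would fire repeatedly
// with partial output.
socket.on("result", (data: { response: string }) => {
  process.stdout.write(data.response);
});
```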

matteoveglia · Mar 21 '23