
Add Ollama block

Lanhild opened this issue 10 months ago • 2 comments

APIs like Ollama are fully compatible with the OpenAI spec, so we should be able to add a custom endpoint in the OpenAI block credentials.

Lanhild avatar Apr 02 '24 16:04 Lanhild

Ollama is local. I don't think that really makes sense for a tool like Typebot, which is 100% online?

What would be the use case?

baptisteArno avatar Apr 04 '24 06:04 baptisteArno

Ollama is also deployed on servers, and making this block compatible with other OpenAI-based APIs would allow them to be used in Typebot.

You would simply need to add an option to specify the endpoint in the block credentials.
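To illustrate the idea, here is a minimal sketch of what a configurable endpoint in the block credentials amounts to: every OpenAI-compatible backend exposes the same paths under a different base URL, so only the base needs to change. The default OpenAI base URL and Ollama's documented compatibility endpoint (`http://localhost:11434/v1`, adjust host/port for a server deployment) are real; the helper function itself is hypothetical.

```python
def chat_completions_url(base_url: str = "https://api.openai.com/v1") -> str:
    """Build the chat-completions URL from a configurable base endpoint.

    Hypothetical helper: this is how an 'endpoint' field in the OpenAI
    block credentials could be applied -- swap the base, keep the path.
    """
    return base_url.rstrip("/") + "/chat/completions"

# Default credentials hit OpenAI:
print(chat_completions_url())
# → https://api.openai.com/v1/chat/completions

# The same block, pointed at a local or self-hosted Ollama
# (11434 is Ollama's default port):
print(chat_completions_url("http://localhost:11434/v1"))
# → http://localhost:11434/v1/chat/completions
```

Because the request and response shapes are the same across OpenAI-compatible servers, no other part of the block would need to change.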

Lanhild avatar Apr 04 '24 14:04 Lanhild

@baptisteArno Hi

Local LLMs are really important and a growing trend.

Please also add xinference, which supports more models than Ollama and has a more comprehensive feature set.

Official documentation: https://inference.readthedocs.io/en/latest/index.html

ZimaBlueee avatar Jul 28 '24 06:07 ZimaBlueee

I don't think it is a good idea to offer the Ollama block, since most people would want to run it locally. Typebot is 100% online by default, so it won't work with local Ollama instances. Closing it for now.

baptisteArno avatar Aug 19 '24 07:08 baptisteArno

Hi, has this been deployed, or is it still not planned?

A use case: an organisation can use local AI to let (sales) staff query service and staff documentation that is complex and updated frequently, while keeping the data internal only.

I can't find another way to do this without using extensive cloud resources (and yes, I am aware of Botpress, langflow, localAI, lobechat, anythingLLM, LMStudio, ActivePieces/n8n, AsktheDoc, Docuchat, ChatDoc, Documind and PrivateGPT etc...)

easaw avatar Aug 30 '24 00:08 easaw

@Lanhild I did think of one way to do this, not the best, but:

Change the hosts file on the machine running Typebot so that api.openai.com (or whatever) resolves to the IP of your Ollama instance instead.

Assuming all users are local, they could chat and Typebot would (should) communicate with Ollama over the local IP.
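The hosts-file trick can be sketched as follows. This demo writes to a temporary file so it runs without root; on a real host you would append the same line to /etc/hosts (Linux/macOS) or C:\Windows\System32\drivers\etc\hosts (Windows), which requires admin rights. The IP address here is a hypothetical LAN address for the Ollama server.

```python
import os
import tempfile

OLLAMA_IP = "192.168.1.50"  # hypothetical LAN address of the Ollama server

# The override line: make the OpenAI hostname resolve to the Ollama box.
entry = f"{OLLAMA_IP} api.openai.com\n"

# Demo on a temp file; in reality you'd append `entry` to /etc/hosts.
with tempfile.NamedTemporaryFile("w", suffix="_hosts", delete=False) as f:
    f.write("127.0.0.1 localhost\n")  # typical existing content
    f.write(entry)                     # the override line
    path = f.name

with open(path) as f:
    contents = f.read()
os.remove(path)
print(contents)
```

One caveat worth noting: the OpenAI API is served over HTTPS on port 443, while Ollama listens on plain HTTP (port 11434 by default), so this redirect would likely only work if something at that IP terminates TLS for api.openai.com, e.g. a reverse proxy with a locally trusted certificate.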

easaw avatar Aug 30 '24 00:08 easaw