
Support for Ollama API

Open imbev opened this issue 1 year ago • 3 comments

Ollama is an open-source application that makes it very easy to run LLMs locally via a CLI or an HTTP API. I suggest adding support for Ollama's API.

https://github.com/jmorganca/ollama/blob/main/docs/api.md
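For reference, a request against the generate endpoint is just a single JSON POST. Below is a rough, untested sketch of what a call could look like from the Python standard library, assuming the default local server on port 11434 and an already-pulled `llama2` model (endpoint and field names are taken from the linked docs; the helper name is just for illustration):

```python
# Minimal sketch (not tested) of a non-streaming call to Ollama's generate API.
# Assumes a local Ollama server on the default port 11434.
import json
import urllib.request

OLLAMA_GENERATE_URL = "http://localhost:11434/api/generate"

def ollama_generate(prompt: str, model: str = "llama2") -> str:
    """Send one non-streaming generate request and return the reply text."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON object instead of a chunk stream
    }).encode("utf-8")
    request = urllib.request.Request(
        OLLAMA_GENERATE_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        body = json.loads(response.read().decode("utf-8"))
    return body["response"]  # completed text, per the generate API docs

if __name__ == "__main__":
    print(ollama_generate("Why is the sky blue?"))
```

Streaming (the API's default behaviour) would instead return one JSON object per line, which might map nicely onto tinychat's existing streamed-response handling.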

imbev avatar Jan 14 '24 03:01 imbev

Hi there @imbev !

Thanks for your input!! Yes, support for local models is definitely something that I would like to implement in future releases, right after chat history.

Regarding the Ollama API: is that still WSL-only under Windows? What system do you use it on?

Cheers!!

pymike00 avatar Jan 14 '24 09:01 pymike00

Hello @pymike00

I currently use Ollama on Debian Linux.

It can be compiled on Windows, but the Windows version is definitely not ready yet. https://github.com/jmorganca/ollama/blob/main/docs/development.md

imbev avatar Jan 15 '24 01:01 imbev

I think in that case it's probably better to wait for proper Windows support before adding code for it.

After all, the main value of the project lies in its simplicity, I think.

Thank you for your feedback, it's much appreciated.

Feel free to post any other suggestions you may have.

Happy Coding!

pymike00 avatar Jan 15 '24 17:01 pymike00