Timmo

Results 25 comments of Timmo

> Instead of running oobabooga to serve your local LLM, you can use LocalAI instead. It gives you an OpenAI compatible API based on whatever model you choose to run....

Start oobabooga with `call python server.py --auto-devices --chat --wbits 4 --groupsize 128 --api --listen --extension openai`, then enter the API URL `http://127.0.0.1:5001/v1` as the model endpoint in the UI. However a...
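A minimal sketch of what a client would send to that endpoint, assuming the `openai` extension exposes the standard OpenAI-style `POST /v1/chat/completions` route; the model name `"local-model"` is a placeholder assumption, not something the original comment specifies:

```python
import json

# Build the JSON body for an OpenAI-compatible chat completion request.
# Assumed target: POST http://127.0.0.1:5001/v1/chat/completions
def build_chat_request(prompt: str, model: str = "local-model") -> str:
    payload = {
        "model": model,  # placeholder; local backends often ignore or echo this
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return json.dumps(payload)

print(build_chat_request("Hello!"))
```

Any HTTP client can then POST this body with `Content-Type: application/json` to the URL above.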

> > However, LLMs might have different keywords like "### Instruction:" or "User:"
>
> Where do I find these keywords? Any template suggestions for some known LLMs? 🙏🏽

This might...
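To illustrate what those differing keywords look like in practice, here is a hypothetical sketch of wrapping a user message in two commonly seen prompt styles (the exact template strings are assumptions; the authoritative source is each model's own model card):

```python
# Two example prompt templates; keywords differ per model family.
TEMPLATES = {
    "alpaca": "### Instruction:\n{msg}\n\n### Response:\n",
    "vicuna": "User: {msg}\nAssistant: ",
}

def format_prompt(msg: str, style: str) -> str:
    # Look up the template for the given style and insert the user message.
    return TEMPLATES[style].format(msg=msg)

print(format_prompt("Hello", "alpaca"))
```

Using the wrong template usually still produces output, but the model tends to ramble or answer as the wrong persona, which is why matching the keywords matters.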

Meanwhile I built my own using Go and the `pack.ag/tftp` library. Basically a one-liner, and it works on every OS.

I had the same issue with Ollama. Then I entered a valid default model and it works. ![image](https://github.com/mattermost/mattermost-plugin-ai/assets/12451336/28647efc-2abd-4f72-bc3e-369018b0c035) I think in LocalAI they only return the model you requested but not...