
[Feature Request] Enable using other AI sources as a first class citizen

Open jtslear opened this issue 1 year ago • 1 comments

Problem Statement

It generally appears that butterfish's options are tailored to its use of OpenAI. This is evident from the hard-coded models and token settings: https://github.com/bakks/butterfish/blob/2a69b7d06c737a7ab17b82035f2b851ddd4426bb/butterfish/common.go

Should a user want to avoid OpenAI, they run into a few issues, such as being unable to apply equivalent token settings to another backend. It would be great if butterfish were friendlier to people who want to run models locally, whether they are building their own model or simply prefer not to use OpenAI for any reason.

To be clear, I'm not asking that butterfish run a model, rather, be more flexible in its config when reaching out to another locally run service which processes the requests.

jtslear avatar May 16 '24 18:05 jtslear


I second this; I would like to run local LLM models with Ollama for butterfish.

januxnet avatar Oct 26 '24 23:10 januxnet