silverbullet-ai
Better support and docs for local models
Local models are already supported as long as they expose an OpenAI-compatible API. It'd be good to have more documentation covering this, along with some examples of installing and configuring them. This is pretty important for #34
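For reference, here's roughly what "OpenAI-compatible" means in practice, a minimal sketch of the chat-completions call (not the plug's actual client code), assuming Ollama's default port and a model that has already been pulled:

```ts
// Sketch of an OpenAI-style chat completion request against a local backend.
// Base URL assumes Ollama's default port (11434); LocalAI and others differ.
const baseUrl = "http://localhost:11434/v1";

const res = await fetch(`${baseUrl}/chat/completions`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    // Local backends typically ignore the key, but many OpenAI clients insist on sending one.
    "Authorization": "Bearer not-needed",
  },
  body: JSON.stringify({
    model: "llama3", // whichever model you've pulled locally
    messages: [{ role: "user", content: "Summarize this note in one sentence." }],
  }),
});

const data = await res.json();
console.log(data.choices[0].message.content);
```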
Two local solutions I'd focus on right now are:
- https://github.com/ollama/ollama
- https://github.com/mudler/LocalAI
litellm doesn't run models itself, but it would be good to document it as a proxy for accessing other 3rd-party APIs.
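Something like the following could go in the docs to show that only the base URL changes when going through a litellm proxy; the port and model alias below are just assumptions for illustration:

```ts
// Same request shape, different base URL: litellm proxies the call on to whichever
// 3rd-party API its config maps the model alias to.
// Port 4000 is a common default for the litellm proxy; adjust to match your setup.
const proxyBaseUrl = "http://localhost:4000/v1";

const res = await fetch(`${proxyBaseUrl}/chat/completions`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": "Bearer sk-anything", // litellm can enforce its own keys, or pass requests through
  },
  body: JSON.stringify({
    model: "my-proxied-model", // hypothetical alias defined in the litellm proxy config
    messages: [{ role: "user", content: "Hello from silverbullet-ai" }],
  }),
});

console.log((await res.json()).choices[0].message.content);
```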
LocalAI seems to be gaining a lot of new features, but Ollama should be pretty simple for anyone to run locally (assuming they also run SilverBullet locally).
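A docs snippet along these lines could also help people confirm their Ollama install is reachable before wiring it into the plug (assuming the default address):

```ts
// Quick check that a local Ollama install is running and has at least one model pulled.
// Assumes the default `ollama serve` address; /api/tags is Ollama's list-models endpoint.
const res = await fetch("http://localhost:11434/api/tags");
const { models } = await res.json();

if (!models?.length) {
  console.log("Ollama is up, but no models are pulled yet (try `ollama pull llama3`).");
} else {
  console.log("Local models:", models.map((m: { name: string }) => m.name).join(", "));
}
```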