rawdog
Update readme
To use local models with Ollama, a sample config.yaml looks like this:
```yaml
llm_api_key: "no need"
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mistral
```
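As a rough sketch of the workflow (assuming the config lives at ~/.rawdog/config.yaml, rawdog's default location, and that the model named in llm_model has been pulled locally), the commands might look like:

```sh
# Start the local Ollama server (listens on localhost:11434 by default)
ollama serve

# Pull the model referenced by llm_model in config.yaml
ollama pull mistral

# Run rawdog with a prompt; requests are routed to the local Ollama endpoint
rawdog "list the largest files in this directory"
```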
Very cool project!