potpie
Add support for Ollama
Overview
Add support for Ollama so that users can run open-source models locally. This feature is needed by privacy-focused users who do not want to share their code with cloud LLM providers.
Requirements
- Integration with the Ollama API via LangChain
- Support for multiple open source models available through Ollama
- Seamless switching between cloud and local models using the providers API
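The switching requirement above could be sketched as a small provider registry keyed by provider name. The config shape, provider names, and `resolve_provider` helper here are hypothetical illustrations, not potpie's actual providers API; the Ollama base URL uses Ollama's default local port, 11434.

```python
# Minimal sketch of switching between a cloud provider and a local
# Ollama backend. All names and defaults here are illustrative.
from dataclasses import dataclass

@dataclass
class ProviderConfig:
    name: str      # e.g. "openai" or "ollama"
    base_url: str  # API endpoint for the provider
    model: str     # model identifier understood by that provider

# Hypothetical registry mapping provider names to configurations.
PROVIDERS = {
    "openai": ProviderConfig("openai", "https://api.openai.com/v1", "gpt-4o"),
    "ollama": ProviderConfig("ollama", "http://localhost:11434", "llama3"),
}

def resolve_provider(name: str) -> ProviderConfig:
    """Look up a provider configuration by name."""
    if name not in PROVIDERS:
        raise ValueError(f"unknown provider: {name}")
    return PROVIDERS[name]
```

With a registry like this, switching from cloud to local is a one-line configuration change rather than a code change.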
Technical Details
- Implement Ollama API client
- Add configuration options for Ollama endpoint and model selection
- Ensure compatibility with existing application interfaces
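One way the API client above might look, sketched with only the standard library: the endpoint path and payload fields follow Ollama's documented REST API (`POST /api/generate`), while the endpoint URL stands in for the configuration option mentioned above. This is an assumption-laden sketch, not potpie's implementation.

```python
# Sketch of a thin Ollama client: build a request for the
# /api/generate endpoint without any third-party dependencies.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON response, not a stream
    }).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it (requires a running Ollama server with the model pulled):
# with urllib.request.urlopen(build_generate_request("llama3", "hi")) as r:
#     print(json.loads(r.read())["response"])
```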
Success Criteria
- Users can run local models through Ollama
- Successful knowledge graph creation and agent execution
@dhirenmathur I would like to work on this, can you assign it to me?
@waveywaves are you working on this or can I assign it to someone else?
@dhirenmathur hey Dhiren! Continuing work on this; I'll post an update within the next day.