
Add support for Ollama

Open dhirenmathur opened this issue 1 year ago • 3 comments

Overview

Add support for Ollama to enable users to run open-source models locally. This feature is required by privacy-focused users who do not want to share their code with cloud LLM providers.

Requirements

  • Integration with the Ollama API using langchain (see the sketch after this list)
  • Support for multiple open source models available through Ollama
  • Seamless switching between cloud and local models using the providers API
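
A minimal sketch of what the langchain side of this could look like, assuming the `langchain-ollama` package and an Ollama server running on the default local port; the model name and prompt are placeholders, not part of the existing codebase:

```python
# Sketch: chatting with a locally served Ollama model through langchain.
# Assumes `pip install langchain-ollama` and an Ollama server running locally.
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="llama3.1",                   # any model pulled with `ollama pull`
    base_url="http://localhost:11434",  # default Ollama endpoint
    temperature=0,
)

response = llm.invoke("Summarize what this repository does.")
print(response.content)
```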

Technical Details

  • Implement Ollama API client
  • Add configuration options for the Ollama endpoint and model selection (see the configuration sketch after this list)
  • Ensure compatibility with existing application interfaces
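
One way the configuration could be wired, reading the endpoint and model from environment variables so users can switch between cloud and local models; the variable names and the `get_llm` helper are hypothetical illustrations, not the project's actual providers API:

```python
# Hypothetical sketch: choose a chat model provider from environment variables.
import os

from langchain_ollama import ChatOllama
from langchain_openai import ChatOpenAI


def get_llm():
    """Return a chat model based on LLM_PROVIDER; all names are illustrative."""
    provider = os.getenv("LLM_PROVIDER", "openai")
    if provider == "ollama":
        return ChatOllama(
            model=os.getenv("OLLAMA_MODEL", "llama3.1"),
            base_url=os.getenv("OLLAMA_ENDPOINT", "http://localhost:11434"),
        )
    return ChatOpenAI(model=os.getenv("OPENAI_MODEL", "gpt-4o"))
```

With a scheme like this, switching to a local model would come down to setting `LLM_PROVIDER=ollama` before starting the application.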

Success Criteria

  • Users can run local models through Ollama
  • Successful knowledge graph creation and agent execution

dhirenmathur · Nov 22 '24 14:11

@dhirenmathur I would like to work on this. Can you assign it to me?

waveywaves · Nov 29 '24 18:11

@waveywaves are you working on this, or can I assign it to someone else?

dhirenmathur · Dec 24 '24 06:12

@dhirenmathur hey Dhiren! I'm continuing work on this and will post an update within the next day.

waveywaves · Dec 26 '24 12:12