Feature Request: Support Ollama Models
Summary
Currently, ADK Go only supports Gemini models through the `model/gemini` package. This issue proposes adding support for Ollama, an open-source platform that allows running large language models locally.
Motivation
Supporting Ollama would provide several benefits:
- Local Development: Enable developers to run agents locally without requiring API keys or internet connectivity
- Cost Efficiency: Use open-source models without API costs
- Privacy: Process sensitive data locally without sending it to external services
- Model Variety: Access a wide range of open-source models (Llama, Mistral, CodeLlama, etc.) through Ollama's unified API
- Model-Agnostic Design: Aligns with ADK's model-agnostic philosophy mentioned in the README
Current State
- ADK Go currently implements the `model.LLM` interface only for Gemini models
- The `model/gemini` package provides a reference implementation
- Models are created via `gemini.NewModel()` and passed to agents
I would like to solve this issue
Hey, this might be a silly question, but have you tried integrating Ollama with the ADK via the LiteLLM example? Don't get me wrong, I'd love to have native support from the framework. However, I'm wondering whether you've run into any issues, such as response latency, in case you have already tried it.
Check out this: https://github.com/byebyebruce/adk-go-openai