semantic-kernel
Python: Add Support for Ollama SDK Connector
Add support for Ollama using the Python Ollama SDK. This is separate from Ollama support using the OpenAI Connector.
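For context, a minimal sketch of the kind of call such a connector would wrap, assuming the official `ollama` Python package (the model name and prompt here are placeholders, not part of Semantic Kernel):

```python
# Sketch only: a direct chat call through the Ollama Python SDK, which the
# requested connector would wrap behind the SK chat completion interface.
import ollama

response = ollama.chat(
    model="llama3",  # placeholder model name
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
    options={"temperature": 0.7},  # Ollama takes sampling params via "options"
)
print(response["message"]["content"])
```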
If you redo connectors.AI.ollama, please also think about providing some basic OpenAIPromptExecutionSettings compatibility in OllamaPromptExecutionSettings. Currently, for example, "temperature" in OpenAIPromptExecutionSettings becomes "options"."temperature" in OllamaPromptExecutionSettings. This breaks, among other things, the SequentialPlanning settings from python/semantic_kernel/planners/sequential_planner/Plugins/SequentialPlanning/config.json. Yes, I know that Ollama uses "options", but exposing this difference to SK plugins is bad for the most common parameters, such as:
- temperature, top_p, stop, seed (these just need to be copied into "options")
- max_tokens, which should be translated into num_predict, and frequency_penalty/presence_penalty, which should be translated into repeat_last_n/repeat_penalty (see the sketch after this list)
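A rough, untested sketch of the suggested mapping; the helper name and mapping tables are hypothetical, and the frequency/presence_penalty translation is only an approximation of Ollama's repeat_* options:

```python
# Hypothetical helper: build an Ollama "options" dict from flat
# OpenAI-style execution settings. Not part of Semantic Kernel.

_COPY_AS_IS = ("temperature", "top_p", "stop", "seed")

_RENAMED = {
    "max_tokens": "num_predict",          # Ollama's output token limit
    "frequency_penalty": "repeat_penalty",  # approximate mapping
    "presence_penalty": "repeat_last_n",    # approximate mapping
}

def openai_settings_to_ollama_options(settings: dict) -> dict:
    """Translate common OpenAI-style parameters into Ollama 'options'."""
    options: dict = {}
    for key in _COPY_AS_IS:
        if settings.get(key) is not None:
            options[key] = settings[key]
    for openai_key, ollama_key in _RENAMED.items():
        if settings.get(openai_key) is not None:
            options[ollama_key] = settings[openai_key]
    return options
```

With something like this in place, a plugin config such as the SequentialPlanning config.json above could keep specifying plain "temperature", "top_p", etc., and the connector would fold them into "options" internally.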