pydantic-ai
Ollama vs. OpenAI: Shouldn't we just have OpenAI support (as Ollama supports OpenAI)?
Hey all,
first, thanks for the great lib! :-)
Looking at https://ai.pydantic.dev/api/models/openai/ and https://ai.pydantic.dev/api/models/ollama/, I'm wondering: why not have just the OpenAI one and provide e.g. a base URL property?
Since Ollama implements the OpenAI API, this seems like a streamlined way to do it. Then we could use OpenAI() for both the real OpenAI and local/remote inference engines like Ollama.
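To illustrate the point: Ollama exposes an OpenAI-compatible endpoint under `/v1`, so an OpenAI-style request is byte-for-byte the same shape for both backends; only the base URL and model name change. A minimal stdlib-only sketch (the function name `chat_completion_request` is made up for illustration, not part of either library):

```python
import json


def chat_completion_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Build the (url, body) pair for an OpenAI-style /chat/completions call.

    The request format is identical for OpenAI and Ollama's
    OpenAI-compatible endpoint; only base_url and model differ.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, body


# Real OpenAI:
openai_url, _ = chat_completion_request("https://api.openai.com/v1", "gpt-4o", "hi")
# Local Ollama (its documented OpenAI-compatible endpoint):
ollama_url, _ = chat_completion_request("http://localhost:11434/v1", "llama3.2", "hi")
```

Everything except the `base_url` argument is shared, which is why a single OpenAI model class with a configurable base URL would cover both cases.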
Thoughts?
Yes, that's what I was talking about.
I think it's worth having a dedicated model, even if it's just setting the base url.
Also, it looks like we'll need more specific support for Ollama; see #242.
Then we should at least have a base URL setting in the OpenAI model. OK @samuelcolvin?