Jack Collins
@Lawouach I've published a prerelease to test having a `.usage` attribute on `AssistantMessage`. Could you test it out and let me know if it works for your use case, please?...
@Lawouach This is released now in https://github.com/jackmpcollins/magentic/releases/tag/v0.26.0 Please let me know how it works for you.
Published as https://github.com/jackmpcollins/magentic/releases/tag/v0.25.0a0 for testing
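A minimal sketch of reading the new `.usage` attribute, assuming magentic >= 0.26.0 and that `ChatModel.complete` returns an `AssistantMessage` as in the magentic docs (the model name and prompt here are illustrative):

```python
from magentic import OpenaiChatModel, UserMessage

# Call the chat model directly so we get the AssistantMessage back,
# rather than just the parsed output of a @prompt-decorated function.
chat_model = OpenaiChatModel("gpt-4o")
message = chat_model.complete(messages=[UserMessage("Say hello!")])

# As of v0.26.0 the message should carry token usage from the API response.
print(message.usage)  # e.g. Usage(input_tokens=..., output_tokens=...)
```

Note this requires a valid `OPENAI_API_KEY` in the environment, since the usage numbers come from the live API response.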
OpenTelemetry is now supported with the release of https://github.com/jackmpcollins/magentic/releases/tag/v0.28.0 . See the docs page https://magentic.dev/logging-and-tracing/ @patcher9 Thanks for discussing the implementation of this with me previously 🙏 I hope with the OTEL compatibility...
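A minimal tracing setup, assuming the Pydantic Logfire-based approach described on the magentic.dev/logging-and-tracing page (Logfire emits standard OpenTelemetry spans, so any OTEL backend can consume them):

```python
import logfire

# Configure the span exporter (Logfire by default; an alternative
# OpenTelemetry exporter can be configured instead).
logfire.configure()

# Instrument the OpenAI client so the API calls magentic makes
# show up as spans with request/response details.
logfire.instrument_openai()
```

After this, calls made through magentic's `OpenaiChatModel` should appear as traced spans without further changes to prompt-function code.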
Gemini docs on openai compatibility https://ai.google.dev/gemini-api/docs/openai
Tools are now supported by Ollama, and it has an OpenAI-compatible API, so it should be possible to use Ollama via `OpenaiChatModel` by setting the `base_url`. https://ollama.com/blog/tool-support EDIT: Need to...
Relevant Ollama GitHub issues - https://github.com/ollama/ollama/issues/5796 - https://github.com/ollama/ollama/issues/5989 - https://github.com/ollama/ollama/issues/5993
Hi @igor17400 , the issue is that Ollama does not currently parse the tool calls from the streamed response. So the model output in that error log ``` '{"name": "return_list_of_subquestion",...
@benwhalley @igor17400 Ollama now works with magentic via `OpenaiChatModel`, in the just-released https://github.com/jackmpcollins/magentic/releases/tag/v0.33.0 ! Depending on the model you choose, it might have trouble adhering to the function schema, so I recommend...
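A sketch of pointing `OpenaiChatModel` at a local Ollama server, as the comments above describe. The model name, port, and prompt are assumptions; any tool-capable Ollama model served on the OpenAI-compatible endpoint should work:

```python
from magentic import OpenaiChatModel, prompt

# Ollama's OpenAI-compatible endpoint defaults to localhost:11434/v1.
# The api_key is required by the OpenAI client but ignored by Ollama.
model = OpenaiChatModel(
    "llama3.1",  # assumed model name; use whatever you have pulled locally
    base_url="http://localhost:11434/v1",
    api_key="ollama",
)

@prompt("Name three facts about {topic}.", model=model)
def list_facts(topic: str) -> list[str]: ...
```

Structured outputs like `list[str]` rely on the model following the function schema, which is where smaller local models can struggle.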
@piiq Should do! Using `SystemMessage`. https://magentic.dev/chat-prompting/
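A sketch of setting a system prompt with `SystemMessage` via the `@chatprompt` decorator, following the pattern shown on the magentic.dev/chat-prompting docs page (the messages themselves are illustrative):

```python
from magentic import chatprompt, SystemMessage, UserMessage

# The SystemMessage sets persistent behavior; template fields in the
# UserMessage are filled from the function's arguments.
@chatprompt(
    SystemMessage("You are a concise assistant. Answer in one sentence."),
    UserMessage("What is the capital of {country}?"),
)
def ask_capital(country: str) -> str: ...
```

Calling `ask_capital("France")` sends both messages to the model and returns the assistant's reply as a string.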