karthink
> gptel backend: Magistral by mistral.ai

How did you configure this?
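For anyone following along, an OpenAI-compatible backend like this is typically registered with `gptel-make-openai`. A minimal sketch against Mistral's public endpoint; the model name `magistral-medium-latest` and the key setup are assumptions, not the reporter's confirmed configuration:

```emacs-lisp
;; Sketch: register Mistral as an OpenAI-compatible gptel backend.
;; Host and endpoint follow Mistral's public API; the model name is
;; an assumption.
(gptel-make-openai "Mistral"
  :host "api.mistral.ai"
  :endpoint "/v1/chat/completions"
  :stream t
  :key #'gptel-api-key-from-auth-source  ; or a key string
  :models '(magistral-medium-latest))
```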
@zauster Could you also provide a log of the response with streaming turned on?
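To capture such a log, gptel can record raw request and response traffic, including streamed chunks, in its `*gptel-log*` buffer. A minimal sketch:

```emacs-lisp
;; Log full payloads, including streaming chunks, to *gptel-log*.
;; Use 'info to log only request/response bodies without headers.
(setq gptel-log-level 'debug)
```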
> I can, but on next Monday at the earliest...

Okay. From what I can see, it looks like Magistral doesn't follow the OpenAI-compatible API (to the extent that there's...
gptel supports reasoning content from Ollama now (PR #1120).

> But one thing that's missing is that both for [OpenAI](https://platform.openai.com/docs/guides/reasoning/advice-on-prompting#keeping-reasoning-items-in-context) and [Ollama](https://docs.ollama.com/capabilities/tool-calling#tool-calling-with-streaming) it's recommended to send back the reasoning with...
Looking into this some more, the doc is about OpenAI's Responses API, which gptel does not support.
> I think this is API agnostic.

The idea may be API-agnostic, but the implementation is not. From what I've tried, reasoning does not work at all with OpenAI's...
I'm not able to reproduce this; adding and removing tools works fine here. Can you give me any other details to help reproduce it? You can also record a short screencast...
Should be fixed, please update and test with a tool that has no `:category` value.
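A throwaway tool for this test can be defined with `gptel-make-tool`, simply leaving out the `:category` keyword; the tool itself is illustrative:

```emacs-lisp
;; Hypothetical tool with no :category, for testing the fix.
(gptel-make-tool
 :name "echo"
 :function (lambda (text) text)
 :description "Return TEXT unchanged."
 :args '((:name "text"
          :type string
          :description "The text to echo back")))
```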
Closing as completed, please reopen if the bug persists.
If possible, I would rather find a way to get diff-mode to apply the hunks to buffers. I'll look into it, but if you've studied this already, you can let...