Jack Collins
> When using tools, Claude will often show its “chain of thought”, i.e. the step-by-step reasoning it uses to break down the problem and decide which tools to use. The...
I'm open to something like that. I would make it unaware of the order/presence of the sections so that OpenAI models which return _either_ text or tool call also work...
Hi @ashwin153 That makes sense! The existing `RetryChatModel` is focussed on handling errors from the LLM output / schema validation rather than service-level issues like network connectivity errors, invalid API...
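To illustrate the distinction, service-level retries can be handled outside the model wrapper entirely. Below is a minimal stdlib-only sketch of retrying on network errors with exponential backoff; `retry_on_connection_error` and `flaky_completion` are hypothetical names for illustration, not part of magentic's API.

```python
import time
from functools import wraps

def retry_on_connection_error(max_attempts=3, base_delay=1.0):
    """Retry a function on ConnectionError with exponential backoff (sketch)."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except ConnectionError:
                    if attempt == max_attempts:
                        raise  # out of retries: surface the error
                    time.sleep(base_delay * 2 ** (attempt - 1))
        return wrapper
    return decorator

calls = {"n": 0}

@retry_on_connection_error(max_attempts=3, base_delay=0.0)
def flaky_completion():
    """Simulated LLM call that fails twice before succeeding."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("simulated network failure")
    return "ok"

print(flaky_completion())  # succeeds on the third attempt
```

This keeps transport concerns (timeouts, connectivity) separate from `RetryChatModel`, which retries based on the content of the LLM response.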
Hi @ashinwz Currently there isn't a way to do this in magentic. What do you put in the extra headers? I would accept a PR to add this as an...
@CiANSfi I haven't used the OpenAI Assistants API yet or put much thought into how magentic might use it. If there's a specific place you think magentic could help I'd...
Borrowing from @mnicstruwig's example from here https://github.com/jackmpcollins/magentic/issues/151#issuecomment-2014826557 Using litellm `1.33.8` It looks like part of the problem is inconsistency in Claude's responses, and part is litellm's parsing of...
With https://github.com/jackmpcollins/magentic/releases/tag/v0.24.0 or earlier this should be working as expected, because the response is not streamed so it can be viewed in full when parsing. But with the new streaming approach...
Should be resolved by `StreamedResponse` in https://github.com/jackmpcollins/magentic/releases/tag/v0.34.0 Please let me know if you find any issues with this! --- Example from there ```python from magentic import prompt, FunctionCall, StreamedResponse, StreamedStr...
We should absolutely enable turning on strict mode for function calling. I think it should follow the OpenAI default of off, and allow turning it on by setting `strict =...
On closer look, `strict` is set on each individual function/tool, so it might be better to allow it to be set on individual tools in magentic as well. For functions this...
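For context, `strict` lives inside each entry of the `tools` array in the OpenAI Chat Completions request, which is why a per-tool setting in magentic maps more naturally than a global one. A minimal sketch of the shape of that payload; `make_tool_schema` is a hypothetical helper, not magentic or OpenAI SDK API:

```python
def make_tool_schema(name, description, parameters, strict=False):
    """Build an OpenAI-style function tool definition (hypothetical helper)."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": parameters,
            "strict": strict,  # set per tool, not globally on the request
        },
    }

tool = make_tool_schema(
    "get_weather",
    "Get the current weather for a city",
    {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
        "additionalProperties": False,  # strict mode requires this to be False
    },
    strict=True,
)
print(tool["function"]["strict"])  # True
```

Note that strict mode also constrains the JSON schema itself (e.g. `additionalProperties: false` and all properties required), so per-tool opt-in lets schemas that don't meet those constraints keep working unchanged.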