docs
Document strategies to handle invalid tool input
Describe the issue or suggestion
An LLM can provide invalid input when requesting tool calls. In this discussion with Stephen, several strategies were mentioned to deal with that.
The documentation could be improved with:
- `FunctionInvokingChatClient.IncludeDetailedErrors` to send serialization (and other) exceptions back to the LLM.
- `FunctionInvokingChatClient.FunctionInvoker` to, for example, catch serialization exceptions. It could also be interesting to document that it can be used to implement a retry mechanism, like throwing a `MyRetryException` from a tool (because of incoherent input or something), catching it in the `FunctionInvoker`, and returning a message that makes the LLM retry ("invalid something, please retry"). Sharing good prompts that make some models retry a tool call would be a plus. A sketch of this pattern follows the list.
- The `strictJsonSchema` special property for OpenAI (see the second sketch below).
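
To make the first two points concrete, here is a minimal sketch, assuming the `UseFunctionInvocation(configure:)` overload on `ChatClientBuilder` and a `FunctionInvoker` delegate that receives a `FunctionInvocationContext` and returns the tool result (as in recent Microsoft.Extensions.AI versions; the exact delegate shape should be confirmed). `innerClient` is a placeholder for the provider's underlying `IChatClient`, and `MyRetryException` is a hypothetical application exception:

```csharp
using Microsoft.Extensions.AI;

public static class ToolErrorHandlingExample
{
    // Builds a chat client that surfaces tool-invocation errors to the LLM and
    // turns MyRetryException into a "please retry" tool result.
    public static IChatClient Create(IChatClient innerClient) =>
        new ChatClientBuilder(innerClient)
            .UseFunctionInvocation(configure: client =>
            {
                // Send serialization (and other) exception details back to the LLM
                // instead of a generic error result.
                client.IncludeDetailedErrors = true;

                // Wrap the default invocation so a MyRetryException thrown by a tool
                // becomes a message nudging the model to retry with corrected input.
                client.FunctionInvoker = async (context, cancellationToken) =>
                {
                    try
                    {
                        return await context.Function.InvokeAsync(context.Arguments, cancellationToken);
                    }
                    catch (MyRetryException ex)
                    {
                        return $"Invalid input: {ex.Message}. Please correct the arguments and call the tool again.";
                    }
                };
            })
            .Build();
}

// Hypothetical exception a tool throws when the model's input is unusable.
public sealed class MyRetryException : Exception
{
    public MyRetryException(string message) : base(message) { }
}
```

The string returned from the catch block becomes the function-result message the model sees, which is what gives it the chance to retry with corrected arguments.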
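
For the `strictJsonSchema` point, a sketch under the assumption that the OpenAI-backed client honors a `"strictJsonSchema"` entry in `AdditionalProperties` to request OpenAI's strict JSON schema mode for tool definitions (the exact key and where it is read should be confirmed against Microsoft.Extensions.AI.OpenAI):

```csharp
using Microsoft.Extensions.AI;

// A trivial tool used for illustration.
AIFunction getWeather = AIFunctionFactory.Create(
    (string city) => $"Sunny in {city}",
    name: "get_weather",
    description: "Gets the weather for a city.");

var options = new ChatOptions
{
    Tools = [getWeather],
    // Assumption: the OpenAI client reads "strictJsonSchema" and sends the tool's
    // JSON schema with strict mode enabled, so argument JSON must match it exactly.
    AdditionalProperties = new() { ["strictJsonSchema"] = true },
};
```

With strict schemas the model is less likely to produce arguments that fail deserialization in the first place, which complements the error-handling approaches above.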