sigoden
I will research the performance issues later. We do not currently plan to support infinite depth, so we are closing this issue.
The text generated by function calls is beyond the capabilities of the LLM. Including it in the session conversation will only cause interference. Note: the function is bound to role other...
 AIChat now fully supports the function calling feature. It can retrieve information from functions and pass it back to the LLM for further processing. @tkanarsky
AIChat saves the question and the final result in the session. The intermediate data returned by the function call has an uncertain format and variable size, and is not as valuable as you...
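That compaction might be sketched as follows. This is a hypothetical illustration of dropping intermediate tool-call messages; the message structure and helper name are not AIChat's actual code:

```python
# Hypothetical sketch: keep only the user question and the final
# assistant answer in the session, dropping tool-call traffic.
def compact_session(messages):
    """Filter out tool messages and assistant messages that only issue tool calls."""
    return [
        m for m in messages
        if m["role"] in ("user", "assistant") and "tool_calls" not in m
    ]

history = [
    {"role": "user", "content": "Weather in London?"},
    {"role": "assistant", "tool_calls": [{"name": "get_current_weather"}]},
    {"role": "tool", "content": '{"temp_c": 15}'},
    {"role": "assistant", "content": "It is 15°C in London."},
]
print(len(compact_session(history)))  # 2: the question and the final answer
```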
If the LLM doesn't support parallel function calling, it will cause an infinite loop:
```
Call get_current_weather --location 'London, England'
Call get_current_weather --location 'Paris, France'
Call get_current_weather --location 'London, UK'
...
```
AIChat can now reuse tool-call results, so LLMs that don't support parallel function calls also work.
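One simple way to reuse tool-call results is to cache them by call signature, so a model that re-issues an identical call gets the cached result instead of triggering another round trip. This is only a minimal sketch of the idea; the cache key, function names, and tool stub are illustrative, not AIChat's actual implementation:

```python
# Hypothetical sketch: cache tool-call results keyed by (name, args),
# so repeated identical calls are served from the cache.
import json

def call_tool(name, args):
    # Stand-in for a real tool; returns fake weather data.
    return {"location": args["location"], "temp_c": 15}

def run_with_cache(tool_calls, cache=None):
    """Execute tool calls, reusing cached results for repeated ones."""
    cache = {} if cache is None else cache
    results = []
    for name, args in tool_calls:
        key = (name, json.dumps(args, sort_keys=True))
        if key not in cache:  # only execute calls not seen before
            cache[key] = call_tool(name, args)
        results.append(cache[key])
    return results, cache

calls = [
    ("get_current_weather", {"location": "London, England"}),
    ("get_current_weather", {"location": "Paris, France"}),
    ("get_current_weather", {"location": "London, England"}),  # repeated
]
results, cache = run_with_cache(calls)
print(len(cache))  # 2: only two distinct calls were executed
```

Note that exact-match caching only helps when the model repeats a call verbatim; near-duplicates like `'London, England'` vs. `'London, UK'` would still execute separately.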
Function calling is for developers, not end users. Currently, the only promising way of extending model functionality is GPTs. Technically, it is possible to implement a command line version of...
#514 implements this feature; you can try it. Feedback and suggestions are welcome.
You're welcome to submit a PR, @gilcu3.
There are two function types:

1. **Dispatch Functions:** Execute a script.
2. **Retrieve Functions:** Generate JSON data for further LLM processing.

Now the question is how to distinguish them? *...