
feat: support function calling

sigoden opened this issue 9 months ago • 8 comments

Why use LLM function calling?

LLM function calling is a powerful technique that allows Large Language Models (LLMs) to interact with external tools and data sources. This significantly extends their capabilities: instead of being limited to their training data, they can fetch up-to-date information and trigger actions.

AIChat LLM Functions

AIChat implements function calling by running external tools.

I created a new repo, https://github.com/sigoden/llm-functions, and wrote some tools that work with function calling.

You can configure the function calling feature by following its README.
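
For reference, here is a minimal configuration sketch for enabling the feature (the `function_calling` option name is an assumption; see the llm-functions README for the authoritative setup steps):

```yaml
# config.yaml (sketch; the option name is an assumption, check the README)
function_calling: true   # allow AIChat to send function declarations to the LLM
```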

The client types that currently support function calling are: `openai`, `claude`, `gemini`, and `cohere`.

Function calling in AIChat must be used with a role or a session.

A `function_filter` field is added to the role/session. It controls which functions are used.

AIChat provides a built-in `%functions%` role whose `function_filter` is `.*`, matching all functions.

In AIChat, roles are similar to GPTs. For example:

```yaml
- name: myfuncs
  prompt: ''
  function_filter: func_foo|func_bar|func_.*
```
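
For comparison, the built-in `%functions%` role behaves as if it were defined like this (a sketch only; the role is built in and the empty prompt is an assumption):

```yaml
- name: '%functions%'
  prompt: ''
  function_filter: '.*'   # matches every function
```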

Function Types

Functions have two types:

  • Retrieve: Returns JSON data for further processing by the LLM.
  • Execute: Runs a command. The output is not sent back to the LLM.

AIChat classifies all functions whose names start with `execute_` as the "execute" type and all others as the "retrieve" type. The user must confirm a prompt before AIChat runs an "execute"-type function.
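
As an illustration, here is a sketch of two function declarations and how they would be classified by name. The field layout follows the common OpenAI-style tools schema; the exact declaration format produced by llm-functions may differ, and both function names are hypothetical.

```yaml
# Hypothetical declarations; the real schema produced by llm-functions may differ.
- name: get_weather         # no execute_ prefix -> "retrieve": output is sent back to the LLM
  description: Get the current weather for a city
  parameters:
    type: object
    properties:
      city:
        type: string
        description: City name, for example Berlin
    required: [city]
- name: execute_command     # execute_ prefix -> "execute": runs only after user confirmation, output is not sent back
  description: Run a shell command on the local machine
  parameters:
    type: object
    properties:
      command:
        type: string
        description: The command line to run
    required: [command]
```

With the example role above, neither of these functions would be selected, since neither name matches `func_foo|func_bar|func_.*`; the built-in `%functions%` role would expose both.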

Conclusion

close #507

sigoden • May 16 '24 02:05