Chatbook
Selecting LLM through custom functions
Dear developer,
A strength of the Wolfram Language is that it executes tasks efficiently through encapsulated functions, so I think custom LLMs could be implemented the same way:
a unified Wolfram Language function: LLM["system", "prompt", "message", "image", "model"]
No matter what type of LLM a user works with, it could simply be packaged in this form and connected directly to the Wolfram Notebook.
For example, with ChatGPT[......], the text we enter in the notebook would correspond to "prompt", and "model" could be set to "gpt-4" or "gpt-3.5".
This approach is similar to the built-in LLMSynthesize function, but it would offer far more flexibility, including broad compatibility with local LLMs.
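As a rough illustration of the idea, here is a minimal sketch of such a wrapper built on top of the existing LLMSynthesize and LLMConfiguration machinery. The name LLM and its argument order are the proposal's own, not an existing built-in; the "image" slot is omitted here for simplicity, and the example assumes an LLM service connection is already configured:

```wolfram
(* Hypothetical unified wrapper; "LLM" is the proposed name, not a built-in. *)
LLM[system_String, prompt_String, message_String, model_String] :=
  LLMSynthesize[
    {prompt, message},
    LLMEvaluator -> LLMConfiguration[<|
      "Prompts" -> {system},  (* system-level instructions *)
      "Model" -> model        (* e.g. "gpt-4", or a local model name *)
    |>]
  ]

(* Example call:
   LLM["You are a helpful assistant.", "Summarize the following text:",
       "Wolfram Language encapsulates tasks in functions.", "gpt-4"] *)
```

Because LLMConfiguration already abstracts over the service and model, a wrapper like this could in principle route to hosted or local models by changing only the "Model" setting (or adding a "Service" entry), which is the wide compatibility the proposal asks for.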