
Feed called function results back into OpenAI

prykris opened this issue 1 year ago · 7 comments

Since the conversation history is built using Message instances, it doesn't let me feed the called function's result back into the conversation.

The error I receive: Invalid parameter: messages with role 'tool' must be a response to a preceding message with 'tool_calls'.
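For background, the Chat Completions API only accepts a role `'tool'` message immediately after an assistant message whose `tool_calls` entry it answers. A minimal sketch of the expected ordering, with made-up ids:

```php
<?php
// Illustrative message sequence the API expects; the id value is made up.
$messages = [
    ['role' => 'user', 'content' => 'What is my username?'],
    // The assistant message that *requested* the tool call must come first...
    ['role' => 'assistant', 'content' => null, 'tool_calls' => [[
        'id'       => 'call_abc123',
        'type'     => 'function',
        'function' => ['name' => 'getUserName', 'arguments' => '{}'],
    ]]],
    // ...and the tool result must reference the same tool_call_id.
    ['role' => 'tool', 'tool_call_id' => 'call_abc123', 'content' => 'prykris'],
];
```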

Route::get('/tools-test', function () {
    $config = new OpenAIConfig;
    $config->apiKey = config('services.openai.api_key');

    $chatSession = new ChatSession(
        new OpenAIChat($config),
        []
    );

    $chatSession->getChat()->addTool(FunctionBuilder::buildFunctionInfo(new class
    {
        /**
         * Returns current user's username
         */
        public function getUserName(): string
        {
            return auth()->user()->name;
        }
    }, 'getUserName'));

    dd($chatSession('What is my username?'));
});

Here is how I handle the tool call and attempt to feed the result back:

public function generateResult(string $prompt): Result
{
    $isFirstMessage = empty($this->results);
    $history = $this->buildChatHistory($prompt);

    $start = microtime(true);
    $responseOrFunction = $this->chat->generateChatOrReturnFunctionCalled($history);
    $end = microtime(true);

    $responseObject = $this->chat->getLastResponse();

    if ($responseOrFunction instanceof FunctionInfo) {
        // The tool result is appended without a preceding assistant message
        // carrying 'tool_calls', which is what triggers the
        // "messages with role 'tool'" error above.
        $history[] = Message::toolResult(
            FunctionRunner::run($responseOrFunction)
        );

        $responseText = $this->chat->generateChat($history);
    } else {
        $responseText = $responseOrFunction;
    }

    $llmResult = new Result(
        new Generation($prompt, $responseText),
        new Metadata(
            [
                'prompt_tokens' => $responseObject->usage->promptTokens,
                'completion_tokens' => $responseObject->usage->completionTokens,
                'total_tokens' => $responseObject->usage->totalTokens,
            ],
            $responseObject->choices[0]->finishReason,
            $this->chat->model,
            $start,
            $end,
        ),
        $isFirstMessage
    );

    $this->results[] = $llmResult;

    return $llmResult;
}
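Expressed over plain arrays rather than LLPhant's Message objects, the repair the history needs before the follow-up generateChat call looks like this (field names follow the OpenAI wire format; the helper itself is illustrative):

```php
<?php
/**
 * Append a tool call and its result to a chat history in the order the
 * OpenAI API requires: assistant 'tool_calls' first, then the 'tool'
 * message answering it. A sketch; adapt to LLPhant's Message objects.
 */
function appendToolExchange(array $history, string $callId, string $fnName, string $args, string $result): array
{
    $history[] = [
        'role' => 'assistant',
        'content' => null,
        'tool_calls' => [[
            'id' => $callId,
            'type' => 'function',
            'function' => ['name' => $fnName, 'arguments' => $args],
        ]],
    ];
    $history[] = [
        'role' => 'tool',
        'tool_call_id' => $callId,
        'content' => $result,
    ];
    return $history;
}
```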

And if I attempt to use functionCall instead, I get the following error: Missing parameter 'name': messages with role 'function' must have a 'name'.

I can't seem to make it "work" without hacking the implementation of OpenAIChat, which I DO NOT want to do. But I might have to extend it and override the method in order to get it working.

It seems functions are deprecated and replaced by tools, but there does not yet seem to be a way to properly return tool call data.
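For reference, the deprecated functions request shape and its tools replacement differ mainly in nesting and in how results are returned (illustrative fragments, not LLPhant code):

```php
<?php
// Deprecated shape: top-level 'functions' plus 'function_call',
// with results sent back as role 'function' messages carrying a 'name'.
$legacy = [
    'functions'     => [['name' => 'getUserName',
        'parameters' => ['type' => 'object', 'properties' => new stdClass()]]],
    'function_call' => 'auto',
];

// Current shape: each tool is wrapped and typed, and results are sent
// back as role 'tool' messages carrying a 'tool_call_id' instead.
$current = [
    'tools' => [[
        'type'     => 'function',
        'function' => ['name' => 'getUserName',
            'parameters' => ['type' => 'object', 'properties' => new stdClass()]],
    ]],
    'tool_choice' => 'auto',
];
```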

prykris · Sep 16 '24 11:09

https://github.com/theodo-group/LLPhant/pull/194/files

prykris · Sep 16 '24 11:09

#221

prykris · Sep 23 '24 21:09

What do you think about using the same approach that has been implemented for Anthropic? https://github.com/theodo-group/LLPhant/blob/9ac8a5325605da1f6beca6fcb02929a756f8a3a9/src/Chat/AnthropicChat.php#L70 @MaximeThoonsen what is your opinion?

f-lombardo · Sep 29 '24 14:09

What do you think about using the same approach that has been implemented for Anthropic?

https://github.com/theodo-group/LLPhant/blob/9ac8a5325605da1f6beca6fcb02929a756f8a3a9/src/Chat/AnthropicChat.php#L70

@MaximeThoonsen what is your opinion?

While it does work, it comes with the great drawback of being stateless: we are unable to continue the conversation with the "correct" history.

Would you be open to creating a stateful chat-history solution while remaining stateless by default?

https://github.com/theodo-group/LLPhant/blob/9ac8a5325605da1f6beca6fcb02929a756f8a3a9/src/Chat/AnthropicChat.php#L99

While I understand that this responsibility falls more on the user side, the current approach limits what kinds of messages users can collect. I don't see why users couldn't attach their own MessageBag (or whatever object is meant for collection). The extensibility is lacking here.

prykris · Oct 03 '24 05:10

@prykris @f-lombardo My two main goals are:

  • simplicity
  • avoid too much magic so that people understand what is happening under the hood

After that I'm quite open on a lot of stuff :).

I feel we are on an important topic as agents and "chat with function being called" will be mainstream really soon.

We can create a new chat method that is stateful, and we should indeed be more flexible to handle all the use cases. @prykris, for your message bag, what did you have in mind to put in it? The functions called?

For the stateful chat, what do we need?

  • history of messages
  • a good mechanism to inject the return values of called functions into the system prompt
  • a history of the results of called functions?
  • if we are RAG-opinionated, the docs retrieved to answer the questions
  • prompt usage?

Do you see anything else?
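That list could be sketched as an interface; every name below is hypothetical and not part of LLPhant:

```php
<?php
// Hypothetical interface covering the requirements listed above;
// none of these names exist in LLPhant.
interface StatefulChatSession
{
    /** @return array full ordered history of messages */
    public function getHistory(): array;

    /** Record a function/tool result so it can be injected into the system prompt. */
    public function addToolResult(string $toolName, string $result): void;

    /** @return array<string,string> results of the functions called so far */
    public function getToolResults(): array;

    /** @return array documents retrieved to answer the question, if RAG is used */
    public function getRetrievedDocuments(): array;

    /** @return array token usage accumulated over the session */
    public function getTokenUsage(): array;
}
```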

MaximeThoonsen · Oct 03 '24 17:10

Simplicity remains achievable as long as users can choose between the core implementations for chat generation. My suggestion is to apply composition over inheritance and build on that foundation.

class PersistentChat {
    protected ChatInterface $chat;
}
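Fleshed out slightly, the wrapper might delegate generation and record each exchange (ChatInterface is stubbed here for illustration; LLPhant's real interface has more methods):

```php
<?php
// Minimal stand-in for LLPhant's chat contract, for illustration only.
interface ChatInterface
{
    public function generateText(string $prompt): string;
}

// Composition over inheritance: PersistentChat wraps any ChatInterface
// and records every exchange instead of subclassing a provider.
class PersistentChat
{
    /** @var array<array{role: string, content: string}> */
    private array $messages = [];

    public function __construct(protected ChatInterface $chat)
    {
    }

    public function send(string $prompt): string
    {
        $this->messages[] = ['role' => 'user', 'content' => $prompt];
        $answer = $this->chat->generateText($prompt);
        $this->messages[] = ['role' => 'assistant', 'content' => $answer];
        return $answer;
    }

    public function getMessages(): array
    {
        return $this->messages;
    }
}
```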

My key requirements are:

  • Storing messages
  • Handling tool calls and preserving them along with their respective responses
  • I haven't implemented or experimented with RAG yet, so I can't provide feedback on that
  • Token usage and additional metadata must be associated with each message

For storing messages, the LinkedList data structure is ideal, with each node pairing a message and its metadata. This structure simplifies implementing a sliding window or querying conversation history based on token usage. It also allows for applying rules to the MessageBag, adjusting for different AI providers, and handling invalid states efficiently.
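A minimal sketch of that structure, with illustrative names:

```php
<?php
// Sketch of a doubly linked list node pairing a message with its
// metadata (token usage, timestamps, ...). Names are illustrative.
final class MessageNode
{
    public ?MessageNode $prev = null;
    public ?MessageNode $next = null;

    public function __construct(
        public readonly array $message,   // e.g. ['role' => 'user', 'content' => '...']
        public readonly array $metadata,  // e.g. ['total_tokens' => 42]
    ) {
    }
}

// Sliding window: walk backwards from the tail until the token budget
// is exhausted, then return the kept messages in conversation order.
function slidingWindow(MessageNode $tail, int $maxTokens): array
{
    $kept = [];
    $budget = $maxTokens;
    for ($node = $tail; $node !== null; $node = $node->prev) {
        $cost = $node->metadata['total_tokens'] ?? 0;
        if ($cost > $budget) {
            break;
        }
        $budget -= $cost;
        array_unshift($kept, $node->message);
    }
    return $kept;
}
```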

prykris · Oct 04 '24 06:10

The error I receive: Invalid parameter: messages with role 'tool' must be a response to a preceding message with 'tool_calls'.

I hit this error while switching from Ollama to OpenAI during development of https://github.com/theodo-group/LLPhant/pull/276, so I fixed it in that PR. Hope it helps.

fballiano · Dec 04 '24 23:12