Cost calculation for the whole chat session
Hi,
Am I correct that, at the moment, there is no way to know the cost of a whole CustomGroupChat() session, and that cost is only reported for individual calls made with OpenAIWrapper().create()?
Also, are there ways to mitigate OpenAI costs during a group chat session? I have trouble understanding how group chat sessions work: do agents receive the whole conversation history when they are called? Is there a way to restrict the context an agent receives to the last message in the chat? I suppose that would greatly help limit the number of tokens sent to OpenAI.
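For reference, this is the per-call pattern I mean; a minimal sketch, assuming a recent pyautogen version where the response returned by OpenAIWrapper().create() carries a cost field (the config values below are placeholders):

```python
import autogen

# Placeholder config; substitute your own model and key.
config_list = [{"model": "gpt-4", "api_key": "sk-..."}]

client = autogen.OpenAIWrapper(config_list=config_list)
response = client.create(messages=[{"role": "user", "content": "What is 2 + 2?"}])

# The wrapper reports the cost of this single call, but nothing aggregates
# these costs across a whole group chat session.
print(response.cost)
```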
@kevin666aa
That's basically correct, yes. We are working on better instrumentation this week to track costs more accurately. Group chat is tricky because many messages are duplicated (and thus should only be counted once), and because there's a hidden OpenAI call for orchestration that does not show up as a message.
At present the entire history is passed to every agent, and is also sent to OpenAI when the manager selects the next speaker. Costs are super-linear: long chats are much more expensive than short ones. You can avoid the orchestration call by using the "round_robin" speaker-selection method, but then orchestration is just a simple loop with no LLM calls at all (see the sketch below). The cost of passing the full history to each agent, however, can't be mitigated short of rewriting GroupChat.
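Roughly, the round-robin setup looks like this; a minimal sketch, assuming a pyautogen version whose GroupChat accepts speaker_selection_method (the agents and config are placeholders):

```python
import autogen

config_list = [{"model": "gpt-4", "api_key": "sk-..."}]  # placeholder
llm_config = {"config_list": config_list}

coder = autogen.AssistantAgent("coder", llm_config=llm_config)
reviewer = autogen.AssistantAgent("reviewer", llm_config=llm_config)
user_proxy = autogen.UserProxyAgent(
    "user_proxy", human_input_mode="NEVER", code_execution_config=False
)

groupchat = autogen.GroupChat(
    agents=[user_proxy, coder, reviewer],
    messages=[],
    max_round=6,  # capping rounds also caps how long (and costly) the chat gets
    # "round_robin" picks the next speaker in fixed order, so the manager
    # makes no extra LLM call for orchestration; only the agents hit the API.
    speaker_selection_method="round_robin",
)
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=llm_config)
user_proxy.initiate_chat(manager, message="Write and review a quicksort function.")
```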
Hi, is there any update on when this feature will be made live?
Some work is under way on client customization that should make addressing this issue simpler.
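In the meantime, a rough interim approximation is to sum the usage each agent's client has accumulated; a sketch, assuming a pyautogen version where each agent's OpenAIWrapper keeps a total_usage_summary dict with a "total_cost" entry (the agent names are the hypothetical ones from the sketch above):

```python
def approximate_session_cost(agents):
    """Sum accumulated cost across each agent's OpenAIWrapper client.

    Because group chat duplicates messages and the manager makes its own
    orchestration calls, treat the result as an approximation only.
    """
    total = 0.0
    for agent in agents:
        client = getattr(agent, "client", None)
        summary = getattr(client, "total_usage_summary", None) if client else None
        if summary:
            total += summary.get("total_cost", 0.0)
    return total

# Usage (hypothetical agents from the group chat above, plus the manager):
# print(approximate_session_cost([user_proxy, coder, reviewer, manager]))
```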