
Enhancement: Show cost of conversation

Keridos opened this issue 1 year ago • 12 comments

Contact Details

No response

What features would you like to see added?

A display showing how much money the API calls of the currently open conversation have already cost.

More details

Some UIs show how much money the API calls of the current conversation have already cost. This project has that feature: https://github.com/Niek/chatgpt-web

Which components are impacted by your request?

No response

Pictures

No response

Code of Conduct

  • [X] I agree to follow this project's Code of Conduct

Keridos avatar Nov 24 '23 17:11 Keridos

Hi @Keridos thanks for the issue, this is planned!

danny-avila avatar Nov 26 '23 23:11 danny-avila

@danny-avila thank you! Would it be possible to make it look similar to https://github.com/Niek/chatgpt-web? That's the UI I previously used extensively, but I switched to LibreChat because I wanted synchronization between my devices rather than only localStorage. Seeing the token usage is definitely one of the things I miss from chatgpt-web.

Basically:

  1. Per-message token usage
  • When the model replies, it shows how many tokens were used to complete that answer, including the context.
  2. Token usage for the complete conversation
  • Below the chat input you can see the total cost of the whole chat, which is the sum of the token usage of every assistant response.
  3. Token usage per model
  • What I also really like is the split between models, so you can see which model (when you switch models mid-conversation) used how much in the current conversation; this also includes image models.

Example: [screenshot showing per-message and per-model token usage in chatgpt-web]

I used GPT4-1106-preview for the first prompt, and then gpt-3.5-turbo for the second.
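
As a rough illustration of how those three views could be derived, here is a minimal TypeScript sketch that rolls per-message token counts up into a per-model split and a whole-conversation total. The message shape and the per-1K pricing table are hypothetical placeholders, not LibreChat's actual types or current provider rates:

  // A minimal sketch (not LibreChat's actual code). The message shape and the
  // per-1K pricing table below are hypothetical placeholders.
  type UsageMessage = {
    model: string;
    promptTokens: number;
    completionTokens: number;
  };

  // Hypothetical USD prices per 1K tokens; real values depend on the provider.
  const PRICE_PER_1K: Record<string, { prompt: number; completion: number }> = {
    'gpt-4-1106-preview': { prompt: 0.01, completion: 0.03 },
    'gpt-3.5-turbo': { prompt: 0.0005, completion: 0.0015 },
  };

  // Cost of a single message, based on its model and token counts.
  const costOfMessage = (msg: UsageMessage): number => {
    const price = PRICE_PER_1K[msg.model];
    if (!price) return 0;
    return (
      (msg.promptTokens / 1000) * price.prompt +
      (msg.completionTokens / 1000) * price.completion
    );
  };

  // Per-model split and overall total for the whole conversation.
  const conversationCost = (messages: UsageMessage[]) => {
    const perModel: Record<string, number> = {};
    let total = 0;
    for (const msg of messages) {
      const cost = costOfMessage(msg);
      perModel[msg.model] = (perModel[msg.model] ?? 0) + cost;
      total += cost;
    }
    return { perModel, total };
  };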

jonas-w avatar Nov 27 '23 17:11 jonas-w

it's very possible @jonas-w

This is not really a priority for me, but I can try to fit it into my tasks for this week, as there are many things in place already that would make this relatively easy.

danny-avila avatar Nov 27 '23 22:11 danny-avila

+1, I would love to see this feature!

XHyperDEVX avatar Dec 12 '23 09:12 XHyperDEVX

+1

marlonka avatar Jan 30 '24 19:01 marlonka

+1

infused-kim avatar Apr 11 '24 05:04 infused-kim

Hey! Any update on this? If I wanted to add it in myself, how would you recommend going about it, i.e. which files to edit/create?

clearpathai avatar Apr 18 '24 11:04 clearpathai

I'd like to know how much the response cost, but also very importantly, how much all the current context will cost me so I can decide if I should try summarizing or starting a new conversation.

avimar avatar May 20 '24 13:05 avimar

@danny-avila how easy is it to implement the feature?

XHyperDEVX avatar May 20 '24 13:05 XHyperDEVX

Via Austin, here's a Tampermonkey script to count the previous tokens and input tokens to get an estimated cost of just the input.

I don't know how to grab the active model to get a real price, so I made it show gpt-4o and opus.
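
For illustration, here is a rough sketch of the kind of input-cost estimate such a userscript might compute. This is not the actual script; the characters-per-token heuristic and the per-1K prices are assumptions and should be checked against current provider pricing:

  // Rough sketch of the kind of estimate such a userscript might compute
  // (not the actual script). The chars-per-token heuristic and the per-1K
  // input prices are assumptions; check current provider pricing.
  const estimateInputCost = (conversationText: string): Record<string, number> => {
    // Very rough heuristic: roughly 4 characters per token for English text.
    const estimatedTokens = Math.ceil(conversationText.length / 4);

    // Hypothetical USD prices per 1K input tokens.
    const pricesPer1K: Record<string, number> = {
      'gpt-4o': 0.005,
      'claude-3-opus': 0.015,
    };

    const costs: Record<string, number> = {};
    for (const [model, price] of Object.entries(pricesPer1K)) {
      costs[model] = (estimatedTokens / 1000) * price;
    }
    return costs;
  };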

avimar avatar Jun 02 '24 16:06 avimar

I tried to get the total token count for the conversation in MessagesView.tsx, but it only adds up the tokens for messages where the user is the sender. When the sender is the "Assistant", the tokenCount property is not set.

My attempt:

  // Recursively walk the message tree and sum each node's tokenCount.
  // Falls back to 0 when tokenCount is missing (e.g. on assistant messages).
  const calculateTotalTokenCount = (children: TMessage[]): number => {
    let totalTokenCount = 0;

    children.forEach((child) => {
      totalTokenCount += child.tokenCount ?? 0;
      if (child.children) {
        totalTokenCount += calculateTotalTokenCount(child.children);
      }
    });

    return totalTokenCount;
  };

  if (_messagesTree) {
    const totalTokenCount = calculateTotalTokenCount(_messagesTree);
    console.log('Total token count:', totalTokenCount);
  }
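
One possible workaround, sketched below, would be to fall back to a rough client-side estimate whenever tokenCount is missing on an assistant message. This assumes the message object exposes its body as child.text; it is an illustration under that assumption, not LibreChat's actual behavior:

  // A possible workaround sketch (not LibreChat's implementation): when a
  // message has no tokenCount (e.g. assistant replies), fall back to a rough
  // estimate from the message text instead of counting it as 0.
  const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

  const calculateTotalTokenCountWithFallback = (children: TMessage[]): number => {
    let total = 0;
    children.forEach((child) => {
      // `child.text` is assumed here to hold the message body.
      total += child.tokenCount ?? estimateTokens(child.text ?? '');
      if (child.children) {
        total += calculateTotalTokenCountWithFallback(child.children);
      }
    });
    return total;
  };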

@danny-avila Can you recommend how best to go about this?

david02871 avatar Jul 28 '24 13:07 david02871

@david02871 this is a long requested feature, it's on my list for this month

danny-avila avatar Aug 05 '24 04:08 danny-avila