[ENHANCEMENT] Add ability to configure LLM input and output token costs and add a cost column
While it is great to be able to view both input and output token counts, it would be very helpful to also be able to configure the input and output token costs (e.g. cents per 1,000 tokens) and have a column in the UI that calculates the cost of each LLM call.
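For illustration, the calculation itself would be simple; a minimal sketch, assuming per-1,000-token rates supplied by the user (the function and parameter names here are hypothetical, not an existing schema):

```python
# Minimal sketch of the proposed cost column, assuming user-configured
# per-1,000-token rates. Names below are hypothetical, not the actual schema.

def llm_call_cost(
    input_tokens: int,
    output_tokens: int,
    input_cost_per_1k: float,   # e.g. cents (or dollars) per 1,000 input tokens
    output_cost_per_1k: float,  # e.g. cents (or dollars) per 1,000 output tokens
) -> float:
    """Return the cost of a single LLM call in the same unit as the rates."""
    return (input_tokens / 1000) * input_cost_per_1k + (output_tokens / 1000) * output_cost_per_1k

# Example: 1,200 input tokens and 300 output tokens at 0.50 / 1.50 per 1k tokens
print(llm_call_cost(1200, 300, 0.50, 1.50))  # -> 1.05
```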
🤖
That sounds like a great enhancement! If you're interested in contributing to the repository, feel free to create a pull request with your proposed changes. Your contribution would be much appreciated!
@klangst-ETR good point. Will do some research. Do you know of a good registry of model costs by chance?
I don't know of a good registry. My initial thought was to let the user configure a model name along with its input and output token costs (and support configuring multiple models). That way the user could set it up based on just the models they use; see the sketch below.
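A purely illustrative shape for such a user-supplied configuration (not an existing config format, and the prices are placeholders, not current provider rates):

```python
# Hypothetical user-supplied cost configuration keyed by model name.
# Values are placeholders in cents per 1,000 tokens; check current provider pricing.
MODEL_COSTS = {
    # model name: (input cost, output cost) in cents per 1,000 tokens
    "gpt-4o":         (0.25, 1.00),
    "gpt-4o-mini":    (0.015, 0.06),
    "my-local-model": (0.0, 0.0),
}

def cost_for(model: str, input_tokens: int, output_tokens: int) -> float:
    """Look up the user-configured rates and compute the call cost in cents."""
    input_per_1k, output_per_1k = MODEL_COSTS[model]
    return (input_tokens / 1000) * input_per_1k + (output_tokens / 1000) * output_per_1k
```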
We could use tokencost as the default and let the user override it or provide their own costs.
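Roughly something like this sketch, which assumes tokencost exposes a `TOKEN_COSTS` table with per-token cost fields (worth verifying against the library); the override dict is hypothetical:

```python
# Sketch: use tokencost as the default price source, with user overrides on top.
# Assumes tokencost exposes TOKEN_COSTS with "input_cost_per_token" /
# "output_cost_per_token" fields; verify against the library's current API.
from tokencost import TOKEN_COSTS

# Hypothetical user-provided overrides, expressed as cost per single token.
USER_OVERRIDES = {
    "my-finetuned-model": {"input_cost_per_token": 5e-6, "output_cost_per_token": 1.5e-5},
}

def call_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Compute call cost, preferring user overrides over tokencost defaults."""
    rates = USER_OVERRIDES.get(model) or TOKEN_COSTS.get(model)
    if rates is None:
        raise ValueError(f"No cost data for model {model!r}; please configure it.")
    return (
        input_tokens * rates["input_cost_per_token"]
        + output_tokens * rates["output_cost_per_token"]
    )
```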
@tvpavan thanks so much for the link!