aws-genai-llm-chatbot
Show token consumption and estimated cost in the chat and multi-chat playgrounds
Currently, there is no indication of the cost of an LLM invocation. It would be useful to see how much each message contributes to the cost of invoking an LLM, and even more useful when comparing models with one another. In this picture, the cost could be displayed at the bottom, but it could also be part of the metadata section.
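As a rough sketch of how such a cost estimate could be computed, the playground could map a model ID to per-token pricing and multiply by the token usage returned with each response. Everything below is an assumption for illustration: the `ModelPricing` shape, the model ID, and the dollar figures are hypothetical placeholders, not actual Bedrock pricing.

```typescript
// Hypothetical per-1K-token pricing table.
// Real prices vary by model, region, and over time, so in practice this
// would likely come from configuration rather than being hard-coded.
interface ModelPricing {
  inputPer1kTokens: number; // USD per 1,000 input tokens (assumed figure)
  outputPer1kTokens: number; // USD per 1,000 output tokens (assumed figure)
}

const PRICING: Record<string, ModelPricing> = {
  "example.model-v1": { inputPer1kTokens: 0.008, outputPer1kTokens: 0.024 },
};

// Estimate the cost of a single message exchange from its token usage.
// Returns undefined when the model has no known pricing, so the UI can
// hide the cost badge instead of showing a misleading number.
function estimateMessageCost(
  modelId: string,
  inputTokens: number,
  outputTokens: number
): number | undefined {
  const pricing = PRICING[modelId];
  if (!pricing) return undefined;
  return (
    (inputTokens / 1000) * pricing.inputPer1kTokens +
    (outputTokens / 1000) * pricing.outputPer1kTokens
  );
}

// The overall conversation cost is then just the sum of per-message costs.
function estimateConversationCost(
  messages: { modelId: string; inputTokens: number; outputTokens: number }[]
): number {
  return messages.reduce(
    (total, m) =>
      total + (estimateMessageCost(m.modelId, m.inputTokens, m.outputTokens) ?? 0),
    0
  );
}
```

Keeping the pricing table separate from the UI would also make it easy to show side-by-side estimates when comparing models in the multi-chat playground.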
Great idea, @brnaba-aws!
- Do you think it would be useful to toggle this "cost insights" feature off and on in the settings?
- Do you think it would be useful to show both the individual message cost (like in your awesome mockup) and the overall conversation cost?
- Do you have any favorite frameworks/libraries that you'd recommend for implementing components of this feature?
Please feel free to contribute a draft pull request if you'd like to work on this -- all contributions welcome! And I'm ready to support you in that process.
This issue is stale because it has been open for 60 days with no activity.
This issue was closed because it has been inactive for 30 days since being marked as stale.