[Bug]: How to calculate the cost of LLM API tokens used in headless mode
Is there an existing issue for the same bug?
- [x] I have checked the existing issues.
Describe the bug and reproduction steps
I am using Docker headless mode and setting the cost-related environment variables like this: `-e LLM_INPUT_COST_PER_TOKEN=0.015 -e LLM_OUTPUT_COST_PER_TOKEN=0.020`. If I do not set them, I see: `llm.py:670 - Error getting cost from litellm: This model isn't mapped yet. model=gpt-4, custom_llm_provider=litellm_proxy. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json`. If I set the environment variables above, the error goes away, but I still do not see the cost incurred during the operation. Where can I find it?
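For context, when custom per-token prices are supplied, the cost is simply tokens multiplied by the configured price per token. A minimal sketch of that arithmetic, using the environment variable names from the report (the token counts and the exact way litellm reads these variables are assumptions for illustration):

```python
import os

# Per-token prices, as passed via -e in the docker command.
# Values mirror the ones from the report.
os.environ.setdefault("LLM_INPUT_COST_PER_TOKEN", "0.015")
os.environ.setdefault("LLM_OUTPUT_COST_PER_TOKEN", "0.020")

def estimated_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Sketch of the per-token pricing arithmetic:
    tokens in each direction times the configured price per token."""
    input_price = float(os.environ["LLM_INPUT_COST_PER_TOKEN"])
    output_price = float(os.environ["LLM_OUTPUT_COST_PER_TOKEN"])
    return prompt_tokens * input_price + completion_tokens * output_price

# 1000 * 0.015 + 500 * 0.020 = 25.0
print(estimated_cost(1000, 500))
```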
OpenHands Installation
Docker command in README
OpenHands Version
No response
Operating System
None
Logs, Errors, Screenshots, and Additional Context
No response
I am able to see the cost in the log once I enable the DEBUG environment variable.
Does that resolve your question @krishgcek ?
Can I write that information to a file?
So you want the cost output that appears on the command line to be written to a file? It's likely possible, but I don't know whether OpenHands supports that out of the box.
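Lacking built-in support, one workaround is to capture the container's output (e.g. `docker run ... 2>&1 | tee openhands.log`) and filter the cost lines afterwards. A small sketch of the filtering step; the log line format below is illustrative, not OpenHands' exact format:

```python
import re

# Simulated captured log; in practice this would come from
# `docker run ... 2>&1 | tee openhands.log`. Format is hypothetical.
log_lines = [
    "12:00:01 INFO  llm.py - Completion received",
    "12:00:01 DEBUG llm.py - Cost: 0.42 USD",
    "12:00:05 DEBUG llm.py - Cost: 0.13 USD",
]

def extract_costs(lines):
    """Pull the numeric cost out of every line mentioning 'Cost:'."""
    pattern = re.compile(r"Cost:\s*([0-9.]+)")
    return [float(m.group(1)) for line in lines
            if (m := pattern.search(line))]

costs = extract_costs(log_lines)
print(costs)        # per-call costs found in the log
print(sum(costs))   # running total for the session
```

Writing `costs` out with `open("costs.txt", "w")` or similar would then give a persistent record per run.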
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.
This issue was closed because it has been stalled for over 30 days with no activity.