[Feature Request] During streaming, return usage
Required prerequisites
- [x] I have searched the Issue Tracker and Discussions that this hasn't already been reported. (+1 or comment there if it has.)
- [x] Consider asking first in a Discussion.
Motivation
If we stream a response, the usage is not returned.
Even after enabling usage via
model_config_dict={
    "stream": True,
    "stream_options": {"include_usage": True},
},
the input and output token counts are not returned. Testing against the Gemini API directly, input and output token counts are supported.
Only a total token count is returned, which is not enough for billing purposes, since the input token price and output token price are not the same.
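For reference, in the OpenAI-style streaming protocol that `stream_options={"include_usage": True}` enables, intermediate chunks carry `usage: null` and a final chunk carries a `usage` object with separate `prompt_tokens` and `completion_tokens` fields. A minimal sketch of extracting both counts from such a stream (the chunk dicts below are illustrative, not real API output):

```python
from typing import Iterable, Optional, Tuple


def extract_stream_usage(chunks: Iterable[dict]) -> Optional[Tuple[int, int]]:
    """Return (prompt_tokens, completion_tokens) from a stream of
    OpenAI-style chunks, or None if no usage chunk was emitted.

    With stream_options={"include_usage": True}, only the final chunk
    carries a non-null "usage" field; earlier chunks have usage=None.
    """
    usage = None
    for chunk in chunks:
        if chunk.get("usage"):
            usage = chunk["usage"]
    if usage is None:
        return None
    return usage["prompt_tokens"], usage["completion_tokens"]


# Illustrative chunk sequence (shapes are assumptions for the sketch).
chunks = [
    {"choices": [{"delta": {"content": "Hel"}}], "usage": None},
    {"choices": [{"delta": {"content": "lo"}}], "usage": None},
    {"choices": [], "usage": {"prompt_tokens": 12,
                              "completion_tokens": 5,
                              "total_tokens": 17}},
]
print(extract_stream_usage(chunks))  # → (12, 5)
```

The point of the separate counts is that billing can then apply different per-token prices to the prompt and completion sides, which a single `total_tokens` figure cannot support.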
Solution
Support returning separate input and output token counts for streamed responses.
Alternatives
There is no real alternative; this just needs to be supported.
Additional context
No response
Thanks @hzzhyj for reporting this issue. It's because the structure Gemini returns is different from OpenAI's; we will adapt it.
Will be fixed in https://github.com/camel-ai/camel/pull/3326