Usage Stats in Intermediate Steps
Hello, I saw that the runtimeStatsText() function may be deprecated soon and that the usage metadata can now be requested with stream_options: { include_usage: true } in the streaming request. However, I read that this metadata is only available in the final chunk, rather than at any point during generation as with runtimeStatsText().
I was wondering if it is possible to get this metadata in the intermediate steps when streaming, in other words, to read the usage metadata while the output chunks are still being streamed (a rough sketch of the current behavior is below).
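For context, here is a minimal sketch of the streaming setup in question, assuming the OpenAI-compatible API from the web-llm README (CreateMLCEngine, engine.chat.completions.create); the model id and message contents are just placeholders. With include_usage enabled, chunk.usage only shows up on the final chunk:

```ts
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function streamWithUsage() {
  // Placeholder model id from the prebuilt model list.
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC");

  const chunks = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello!" }],
    stream: true,
    // Usage metadata is attached only to the last chunk of the stream.
    stream_options: { include_usage: true },
  });

  for await (const chunk of chunks) {
    // Intermediate chunks: delta text only; chunk.usage is undefined here.
    const delta = chunk.choices[0]?.delta?.content ?? "";
    console.log(delta);

    if (chunk.usage) {
      // Final chunk: token counts and related stats.
      console.log(chunk.usage);
    }
  }
}
```

Ideally, something like chunk.usage (or an equivalent per-chunk stat) would be populated on the intermediate chunks as well, the way runtimeStatsText() could be called at any time.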
Any assistance with this will be greatly appreciated. Thank you!