openllmetry
🚀 Feature: Support Prompt Caching
Which component is this feature for?
Anthropic Instrumentation
🔖 Feature description
Support logging of prompt caching, including cached token usage:
https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching#how-can-i-track-the-effectiveness-of-my-caching-strategy
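For reference, a minimal sketch of where the cache-related usage fields appear in an Anthropic response. The request shape and the `cache_creation_input_tokens` / `cache_read_input_tokens` fields follow Anthropic's prompt caching docs; the span attribute names in the trailing comments are illustrative assumptions, not confirmed openllmetry conventions.

```python
import anthropic

client = anthropic.Anthropic()

# Placeholder for a large, reusable prompt; it must exceed the model's
# minimum cacheable length for caching to take effect.
LONG_SYSTEM_PROMPT = "Reference material for the assistant. " * 300

# Mark the system prompt as cacheable with an ephemeral cache_control block.
# (Older SDK versions may require the prompt-caching beta header instead.)
response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    system=[
        {
            "type": "text",
            "text": LONG_SYSTEM_PROMPT,
            "cache_control": {"type": "ephemeral"},
        }
    ],
    messages=[{"role": "user", "content": "Summarize the reference material."}],
)

# Cache effectiveness is reported on the response's usage object.
usage = response.usage
print(usage.cache_creation_input_tokens)  # tokens written to the cache on this call
print(usage.cache_read_input_tokens)      # tokens served from the cache on this call

# The instrumentation could surface these as span attributes, e.g.
# (attribute names are hypothetical, not established conventions):
# span.set_attribute("gen_ai.usage.cache_creation_input_tokens", usage.cache_creation_input_tokens)
# span.set_attribute("gen_ai.usage.cache_read_input_tokens", usage.cache_read_input_tokens)
```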
🎤 Why is this feature needed?
No response
✌️ How do you aim to achieve this?
No response
🔄️ Additional Information
No response
👀 Have you spent some time to check if this feature request has been raised before?
- [X] I checked and didn't find similar issue
Are you willing to submit PR?
None
Hi @nirga, can I work on this task?
Yes! Thank you :) @samsmithspace
Hi @samsmithspace, are you still working on it?
Hi @nirga, I have added support for prompt caching. Could you please check?
Is this still open for contribution?
@dinmukhamedm this was fixed by #2175, right?