🚀 Feature: Callback Hooks for LLM Instrumentation Libraries
Which component is this feature for?
Bedrock Instrumentation
🔖 Feature description
Ability to pass a callback hook to the instrumentors which takes in the span and additional attributes like the request and response. The callback function can then update the span with new additional attributes.
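A minimal sketch of what the callback could look like (the hook name, signature, and attribute names here are assumptions, not an existing API):

```python
from opentelemetry.trace import Span

# Hypothetical callback shape: the instrumentor would call this with the span
# it created and the raw Bedrock request/response payloads it already has.
def my_response_hook(span: Span, request: dict, response: dict) -> None:
    # Custom logic can attach any attribute the core library does not emit.
    span.set_attribute("my_org.org_id", "org-1234")
```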
🎤 Why is this feature needed?
It will allow us to record more metrics and add attributes which are not already implemented. This lets us extend the functionality of the instrumentor with custom logic without waiting for the functionality to be implemented in the core library.
E.g., I want the below two features ASAP:
- Add a metric that directly tracks the cost of the LLM
- The Bedrock Converse API instrumentor does not support prompt caching tracking yet
I would love to contribute and implement these features directly, but having a callback hook would let us avoid waiting for them (or for any other custom attributes which do not make sense in the core library, e.g. org_id) to be part of the core library (a sketch follows below).
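Assuming a hypothetical `response_hook` parameter on `BedrockInstrumentor` and the Converse API's token-usage fields, the hook could cover both items above roughly like this (the per-token prices are placeholders):

```python
from opentelemetry.instrumentation.bedrock import BedrockInstrumentor

# Placeholder per-1K-token prices; real values depend on the model.
INPUT_PRICE_PER_1K = 0.003
OUTPUT_PRICE_PER_1K = 0.015

def response_hook(span, request, response):
    usage = response.get("usage", {})
    input_tokens = usage.get("inputTokens", 0)
    output_tokens = usage.get("outputTokens", 0)

    # 1. Cost attribute the core library does not emit today.
    cost = (input_tokens / 1000) * INPUT_PRICE_PER_1K \
        + (output_tokens / 1000) * OUTPUT_PRICE_PER_1K
    span.set_attribute("my_org.llm.cost_usd", cost)

    # 2. Prompt-caching counters from the Converse API usage block, if present.
    span.set_attribute("my_org.llm.cache_read_input_tokens",
                       usage.get("cacheReadInputTokens", 0))
    span.set_attribute("my_org.llm.cache_write_input_tokens",
                       usage.get("cacheWriteInputTokens", 0))

# Hypothetical kwarg: this is the feature being requested, not an existing API.
BedrockInstrumentor().instrument(response_hook=response_hook)
```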
✌️ How do you aim to achieve this?
Not fully sure yet. Maybe something like how the Django instrumentor does it, with request and response hooks.
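For reference, the hooks on the Django instrumentation look roughly like this; a Bedrock version would presumably follow the same shape:

```python
from opentelemetry.instrumentation.django import DjangoInstrumentor

def request_hook(span, request):
    # Called when the span is started for an incoming request.
    span.set_attribute("my_app.request_path", request.path)

def response_hook(span, request, response):
    # Called before the span is ended, with the outgoing response.
    span.set_attribute("my_app.response_status", response.status_code)

DjangoInstrumentor().instrument(
    request_hook=request_hook,
    response_hook=response_hook,
)
```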
🔄️ Additional Information
No response
👀 Have you spent some time to check if this feature request has been raised before?
- [x] I checked and didn't find similar issue
Are you willing to submit PR?
Yes I am willing to submit a PR!
@kujjwal02 sounds good - I would just add it directly in the instrumentation :)
Hi @nirga, do you mean the cost tracking and prompt caching features or the hooks?
Ah I mean just implement it directly without the hooks - we're happy to support it quickly :)
Awesome, thanks.
Any thoughts on the hooks, though? Happy to work on a PR with a v0 implementation for the Bedrock instrumentor if you believe it will add value.
I'm fine with adding hooks but would love an upstream contribution for the prompt caching + cost tracking features you mentioned :)
I can take up both the prompt caching + cost tracking and the hooks and submit PRs ASAP (separately), prioritizing the prompt caching + cost tracking.
Let me know if that works