openinference
[BUG] LLM spans missing when using llama-index-llms-anthropic with LlamaIndex
Describe the bug
When instrumenting LlamaIndex and calling Claude via BedrockConverse, all expected spans are captured, but after switching to llama-index-llms-anthropic the LLM spans go missing.
To Reproduce Steps to reproduce the behavior:
- Instrument a LlamaIndex app that uses Claude via BedrockConverse
- Switch the LLM class to llama-index-llms-anthropic and repeat the same call
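The steps above can be sketched as a minimal repro. The package and class names below match the libraries named in this issue; the model ids, the prompt, and the console exporter are placeholder assumptions (real AWS/Anthropic credentials would be required to run this):

```python
# Hypothetical minimal repro sketch for this issue.
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter

from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
from llama_index.llms.anthropic import Anthropic
from llama_index.llms.bedrock_converse import BedrockConverse

# Export spans to the console so both code paths can be compared directly.
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)

# Case 1: LLM spans are emitted for this call (model id is a placeholder).
bedrock_llm = BedrockConverse(model="anthropic.claude-3-sonnet-20240229-v1:0")
print(bedrock_llm.complete("Hello"))

# Case 2: the equivalent call here produces no LLM span (the reported bug).
anthropic_llm = Anthropic(model="claude-3-sonnet-20240229")
print(anthropic_llm.complete("Hello"))
```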
Expected behavior
LLM spans for Claude are emitted with either class.
Environment (please complete the following information):
- openinference-instrumentation-llama-index version: 3.0.2
Scheduling for next sprint
Removing from Phoenix
@camyoung93 can you provide more information on this, e.g. code snippets?