🚀 Feature: Add LLM_IS_STREAMING to span attr for langchain instrumentation
Which component is this feature for?
Langchain Instrumentation
🔖 Feature description
The attribute LLM_IS_STREAMING is already set by the instrumentations for ollama, openai, and groq, but is not used in the langchain instrumentation. Since LangChain also has extensive streaming scenarios when calling LLM models, we should add it to the span attributes there as well.
🎤 Why is this feature needed ?
As stated above.
✌️ How do you aim to achieve this?
Set this attribute in on_llm_new_token(), since that callback is only triggered for streaming requests.
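A rough sketch of the idea (illustrative only: the `Span` stand-in, `CallbackHandlerSketch`, and the run-id bookkeeping are hypothetical simplifications, not openllmetry internals; the real change would call `span.set_attribute(SpanAttributes.LLM_IS_STREAMING, True)` on the span tracked for the current run):

```python
# Illustrative sketch: Span stands in for a real OpenTelemetry span, and the
# attribute key below is assumed to mirror SpanAttributes.LLM_IS_STREAMING.

LLM_IS_STREAMING = "llm.is_streaming"  # assumed key, for illustration


class Span:
    """Minimal stand-in for an OpenTelemetry span."""

    def __init__(self):
        self.attributes = {}

    def set_attribute(self, key, value):
        self.attributes[key] = value


class CallbackHandlerSketch:
    """Sketch of the LangChain callback-handler side of the change."""

    def __init__(self):
        self.spans = {}  # run_id -> Span

    def on_llm_start(self, run_id):
        self.spans[run_id] = Span()

    def on_llm_new_token(self, token, run_id):
        # on_llm_new_token only fires for streaming requests, so receiving
        # any token is enough evidence to mark the span as streaming.
        span = self.spans.get(run_id)
        if span is not None and LLM_IS_STREAMING not in span.attributes:
            span.set_attribute(LLM_IS_STREAMING, True)
```

With this approach, non-streaming requests never enter `on_llm_new_token`, so their spans simply never get the attribute.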
🔄️ Additional Information
No response
👀 Have you spent some time to check if this feature request has been raised before?
- [x] I checked and didn't find similar issue
Are you willing to submit PR?
Yes I am willing to submit a PR!
Can I work on this @minimAluminiumalism ?
I really appreciate the offer to help, but I've already started on this one. Maybe you can try https://github.com/traceloop/openllmetry/issues/3325? Thanks!