
🚀 Feature: Add LLM_IS_STREAMING to span attr for langchain instrumentation

Open minimAluminiumalism opened this issue 2 months ago • 2 comments

Which component is this feature for?

Langchain Instrumentation

🔖 Feature description

The attribute LLM_IS_STREAMING is already used in the instrumentations for ollama, openai, and groq, but is not set in the langchain instrumentation. Since LangChain also has extensive streaming scenarios when calling LLM models, we should add it to the span attributes there as well.

🎤 Why is this feature needed ?

As stated above.

✌️ How do you aim to achieve this?

Set this attribute in on_llm_new_token(), since that callback is only triggered for streaming requests. A rough sketch of the idea is below.
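For illustration only, here is a minimal sketch of the approach, not the actual openllmetry callback handler. The attribute key string, the `StreamingFlagHandler` class name, and the `spans` run-id-to-span map are assumptions for the example; in the real instrumentation the handler already tracks spans per run and the constant comes from the shared semantic conventions (`SpanAttributes.LLM_IS_STREAMING`).

```python
from typing import Any, Optional
from uuid import UUID

from langchain_core.callbacks import BaseCallbackHandler
from opentelemetry import trace

# Assumed attribute key for this sketch; the real constant is defined in
# openllmetry's semantic conventions package.
LLM_IS_STREAMING = "llm.is_streaming"


class StreamingFlagHandler(BaseCallbackHandler):
    """Illustrative handler: mark the run's span as streaming the first
    time a token arrives, since on_llm_new_token fires only for
    streaming requests."""

    def __init__(self) -> None:
        # run_id -> span; the actual instrumentation already keeps such a map.
        self.spans: dict[UUID, trace.Span] = {}

    def on_llm_new_token(
        self,
        token: str,
        *,
        run_id: UUID,
        parent_run_id: Optional[UUID] = None,
        **kwargs: Any,
    ) -> None:
        span = self.spans.get(run_id)
        if span is not None and span.is_recording():
            # Setting the attribute repeatedly is harmless; it just
            # overwrites the same boolean on each token.
            span.set_attribute(LLM_IS_STREAMING, True)
```

Because the attribute is idempotent, setting it on every token is fine, but it could also be guarded so it is only written once per run.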

🔄️ Additional Information

No response

👀 Have you spent some time to check if this feature request has been raised before?

  • [x] I checked and didn't find similar issue

Are you willing to submit PR?

Yes I am willing to submit a PR!

minimAluminiumalism avatar Oct 13 '25 12:10 minimAluminiumalism

Can I work on this @minimAluminiumalism ?

nikhilmantri0902 avatar Oct 14 '25 11:10 nikhilmantri0902

Can I work on this @minimAluminiumalism ?

I really appreciate the offer to help, but I've already started on this one. Maybe you can try https://github.com/traceloop/openllmetry/issues/3325? Thanks!

minimAluminiumalism avatar Oct 14 '25 12:10 minimAluminiumalism