🐛 Bug Report: Metrics aren't sent when using chat models (OpenAI, ChatOpenAI) from LangChain
Which component is this bug for?
Langchain Instrumentation
📜 Description
When I use OpenAI or ChatOpenAI from the langchain_openai package for normal tasks, spans are emitted fine, but no metrics are sent to my Prometheus. However, if I use the OpenAI client directly, everything works without any problems.
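For contrast, here is a minimal sketch of the direct-OpenAI case that does emit metrics for me (the model name and prompt are placeholders; the init call matches the one shown further below):

import os

from traceloop.sdk import Traceloop
from openai import OpenAI

Traceloop.init(
    app_name=os.getenv("OTEL_SERVICE_NAME"),
    api_endpoint=os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT"),
)

client = OpenAI()
# With the plain OpenAI client, token and duration metrics show up in Prometheus.
client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello"}],
)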
👟 Reproduction steps
Some metrics should be emitted after running the code below, but none are.
from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI

# Traceloop.init() is assumed to have been called before this (see the snippet further down).
llm = OpenAI()
prompt = PromptTemplate.from_template("How to say {input} in {output_language}:\n")
chain = prompt | llm
chain.invoke(
    {
        "output_language": "German",
        "input": "I love programming.",
    }
)
👍 Expected behavior
Metrics should be exported as usual.
👎 Actual Behavior with Screenshots
With the code above, I can't see any metrics being sent.
🤖 Python Version
3.11
📃 Provide any additional context for the Bug.
The metrics are sent to Prometheus.
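The export path looks roughly like this (values are placeholders; the real endpoint is an OTLP collector whose metrics Prometheus scrapes):

import os

# Placeholder values for the OTLP export path used by Traceloop.init() below.
os.environ.setdefault("OTEL_SERVICE_NAME", "my-langchain-app")
os.environ.setdefault("OTEL_EXPORTER_OTLP_ENDPOINT", "http://localhost:4318")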
👀 Have you spent some time to check if this bug has been raised before?
- [X] I checked and didn't find a similar issue
Are you willing to submit PR?
Yes I am willing to submit a PR!
Hello @ThanawaratGithub! I'm here to help you with any bugs, questions, or contributions. Let's work together to solve the issue you're facing.
Could you provide more details about your Prometheus setup and how you have configured it to receive metrics from Langchain? Additionally, are there any specific logs or error messages that appear when you run the code?
@dosu I didn't see any error messages when I ran the code. I just call Traceloop.init() as usual, and it works with OpenAI but not with LangChain.
Could you share the configuration details of your Traceloop.init() setup? Additionally, it would be helpful to see how you are integrating Traceloop with Langchain. This information will help in diagnosing why the metrics are not being sent.
@dosu Here it is: Traceloop.init(app_name=os.getenv('OTEL_SERVICE_NAME'), api_endpoint=os.getenv('OTEL_EXPORTER_OTLP_ENDPOINT'))
Could you also share how you are importing and using Traceloop in your code? Additionally, are there any specific configurations or environment variables set for Langchain that might affect the metrics?
@dosu langchain==0.2.14, langchain-community==0.2.12, langchain-openai==0.1.22, langchain-core==0.2.35
Could you provide the code snippet where you are importing and using Traceloop in conjunction with Langchain? Additionally, are there any specific configurations or environment variables set for Langchain that might affect the metrics?
@dosu this one is not working:
import os

from traceloop.sdk import Traceloop

# Traceloop is initialized before any LangChain usage.
Traceloop.init(
    app_name=os.getenv('OTEL_SERVICE_NAME'),
    api_endpoint=os.getenv('OTEL_EXPORTER_OTLP_ENDPOINT'),
)

from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
from langchain_openai import OpenAI

llm = OpenAI(temperature=0.1, max_tokens=1000)
lyrics_prompt = (
    "Here's a misheard lyric: {lyric}. What's the actual lyric, which song does it come from, "
    "which artist performed it, and can you give a funny explanation as to why the misheard "
    "version doesn't make sense? Also, rate the creativity of the lyric on a scale of 1 to 3, "
    "where 3 is good."
)
prompt = PromptTemplate(
    input_variables=["lyric"],
    template=lyrics_prompt,
)
chain = LLMChain(llm=llm, prompt=prompt)
chain.invoke("print lyrics")
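The ChatOpenAI path from the title behaves the same way for me; a minimal sketch (parameters are arbitrary):

from langchain_openai import ChatOpenAI

# Same result as above: spans appear, but no metrics are exported.
chat = ChatOpenAI(temperature=0.1)
chat.invoke("print lyrics")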
Could you confirm if the Traceloop initialization is being executed before any Langchain operations? Additionally, are there any logs or outputs from Traceloop that indicate it has started successfully?
@dosu Yes, I confirm, and there are no errors.
Thanks @ThanawaratGithub, and sorry for the Dosu bot 😅 It's a new tool we've been trying out.
No worries, I'm having fun talking with Dosu.
Could you please point me to the function that is responsible for sending metrics when I use LangChain? I want to contribute.
@ThanawaratGithub sorry for the delay here, I was OOO :) It's a bit complicated, so we can hop on a call if needed. Basically, this is where we log things from LangChain. And this is how metrics are created in the other instrumentations, which we probably need to do here as well. Then, we can record them in the LangChain callback, similar to here. A rough sketch follows below.
Hope that makes sense :)
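Very roughly, and only as a sketch of the direction (the instrument names and attributes below are my assumption based on how the OpenAI instrumentation does it, not the final API), it would look something like this:

from opentelemetry import metrics
from langchain_core.callbacks import BaseCallbackHandler

# Create the instruments once, next to where the instrumentation sets up tracing.
meter = metrics.get_meter("opentelemetry.instrumentation.langchain")
token_histogram = meter.create_histogram(
    name="gen_ai.client.token.usage",
    unit="token",
    description="Number of input and output tokens used",
)

class MetricsCallbackHandler(BaseCallbackHandler):
    """Hypothetical callback that records token metrics when an LLM call ends."""

    def on_llm_end(self, response, **kwargs):
        # OpenAI-backed LangChain LLMs report usage in response.llm_output["token_usage"].
        usage = (response.llm_output or {}).get("token_usage", {})
        token_histogram.record(
            usage.get("prompt_tokens", 0),
            attributes={"gen_ai.token.type": "input"},
        )
        token_histogram.record(
            usage.get("completion_tokens", 0),
            attributes={"gen_ai.token.type": "output"},
        )

In the actual instrumentation this would live in the existing LangChain callback handler rather than a separate class; the class above is only to illustrate where the record() calls would go.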
I'm seeing the same issue with Azure OpenAI + LangChain as well (from langchain_openai import AzureChatOpenAI). No metrics are sent.
Thanks for reporting @nitin302! We're open to contributions on this, and we'll try to prioritize a fix!
Could you please check this: https://github.com/traceloop/openllmetry/pull/2154#issuecomment-2418772694
@nitin302 already opened #2166 :)