litellm
[Bug]: Langfuse HTTP headers never reach Langfuse API
What happened?
A continuation of #7604.
I am trying to send the header langfuse_trace_user_id.
It works fine when going directly to the LiteLLM server:
curl --location --request POST 'http://192.168.1.114:4000/chat/completions' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer sk-xxx' \
--header 'langfuse_trace_user_id: user-id3' \
--data '{
"model": "gpt-4o",
"messages": [
{
"role": "user",
"content": "what llm are you"
}
]
}'
But it doesn't work when going through nginx:
curl --location --request POST 'https://192.168.1.114:443/chat/completions' -k \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer sk-xxx' \
--header 'x_langfuse_trace_user_id: user-id4' \
--data '{
"model": "gpt-4o",
"messages": [
{
"role": "user",
"content": "what llm are you"
}
]
}'
This doesn't work either:
curl --location --request POST 'https://192.168.1.114:443/chat/completions' -k \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer sk-xxx' \
--header 'x-langfuse-trace-user-id: user-id4' \
--data '{
"model": "gpt-4o",
"messages": [
{
"role": "user",
"content": "what llm are you"
}
]
}'
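For context, nginx's default header handling likely explains the failures with underscore names: with `ignore_invalid_headers on` (the default), nginx treats underscores in header names as invalid and silently drops those headers before proxying. A rough sketch of that filtering behavior (the function and regex are illustrative, not nginx source):

```python
import re

# nginx's default notion of a valid header name: letters, digits, hyphens.
# Underscores are rejected unless underscores_in_headers is enabled.
VALID_HEADER = re.compile(r"^[A-Za-z0-9-]+$")

def nginx_default_filter(headers: dict) -> dict:
    """Return only the headers nginx would forward with default settings."""
    return {k: v for k, v in headers.items() if VALID_HEADER.match(k)}

incoming = {
    "Content-Type": "application/json",
    "Authorization": "Bearer sk-xxx",
    "langfuse_trace_user_id": "user-id3",    # dropped: contains underscores
    "x-langfuse-trace-user-id": "user-id4",  # forwarded: hyphens are valid
}
forwarded = nginx_default_filter(incoming)
```

Note this only accounts for the underscore variants: the hyphenated `x-langfuse-trace-user-id` would survive nginx's filter, so its failure is presumably on the LiteLLM side, which appears to match only the underscore spelling.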
I have also tried setting underscores_in_headers on; in the nginx config, without success.
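One thing worth checking: `underscores_in_headers on;` only takes effect in the `http` or `server` block that terminates the client connection. A minimal sketch of where it would go (server/upstream addresses taken from the curl commands above; TLS certificate directives omitted):

```nginx
http {
    underscores_in_headers on;      # accept header names containing "_"
    # ignore_invalid_headers off;   # alternative: forward all headers as-is

    server {
        listen 443 ssl;
        location / {
            proxy_pass http://192.168.1.114:4000;
            proxy_pass_request_headers on;  # the default, shown for clarity
        }
    }
}
```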
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on?
Latest from Docker
Twitter / LinkedIn details
No response
Wait, it's sort of working from LibreChat to LiteLLM. Here's my LibreChat config:
version: 1.2.8
cache: true
endpoints:
  custom:
    - name: "Self-Hosted"
      apiKey: "sk-xxx"
      baseURL: "https://llm.xxx.com/v1"
      models:
        default: ["gpt-4o"]
        fetch: true
      titleConvo: true
      summarize: false
      headers:
        x_langfuse_trace_user_id: "LibreChat {{LIBRECHAT_USER_EMAIL}}"
It logged about five requests but stopped working after I restarted the container. I can't get it working again.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
Hey wait, I want it kept open.