langfuse-python
🪢 Langfuse Python SDK - Instrument your LLM app with decorators or low-level SDK and get detailed tracing/observability. Works with any LLM or framework
Add the `ensure_ascii=False` parameter to all `json.dumps` calls to ensure non-ASCII characters (such as Chinese) are not escaped into `\uXXXX` format. Files to modify:

- `langfuse/_client/attributes.py`: core serialization functions
- `langfuse/_utils/request.py`: API...
Bumps [openai](https://github.com/openai/openai-python) from 2.5.0 to 2.6.1. Release notes Sourced from openai's releases. v2.6.1 2.6.1 (2025-10-24) Full Changelog: v2.6.0...v2.6.1 Bug Fixes api: docs updates (d01a0c9) Chores client: clean up custom translations...
# Context

For our Gemini usage (using LangChain through VertexAI), we learned that costs for cached tokens are not correctly calculated. We traced this back to cached tokens not being...
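The general shape of the problem: cached input tokens are billed at a discounted rate, so if they are not reported separately from regular input tokens, the cost comes out wrong. A minimal sketch (function name and all prices are hypothetical, not Langfuse's actual pricing logic):

```python
def gemini_cost(usage, input_price, cached_price, output_price):
    """Split cached tokens out of the input tokens and bill them
    at the discounted cached rate. If `cached_tokens` is missing
    from the usage report, every input token is billed at full price."""
    cached = usage.get("cached_tokens", 0)
    uncached_input = usage["input_tokens"] - cached
    return (
        uncached_input * input_price
        + cached * cached_price
        + usage["output_tokens"] * output_price
    )

# e.g. 1000 input tokens, 600 of them served from cache (prices made up)
cost = gemini_cost(
    {"input_tokens": 1000, "cached_tokens": 600, "output_tokens": 200},
    input_price=3e-6, cached_price=0.75e-6, output_price=12e-6,
)
```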
> [!IMPORTANT]
> Fixes URL encoding for prompt names in `update_prompt()` in `client.py` to ensure compatibility with `httpx` 0.28.0+.
>
> - **Behavior**:
>   - Fixes URL encoding for prompt...
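The underlying technique can be sketched with the standard library (the helper name is illustrative; the exact encoding the SDK applies may differ):

```python
from urllib.parse import quote

def encode_prompt_name(name):
    # Percent-encode the prompt name for use as a single URL path segment.
    # safe="" also encodes "/", so a name like "folder/prompt" does not get
    # split into two path segments by the server.
    return quote(name, safe="")

print(encode_prompt_name("my prompt/v2"))  # → my%20prompt%2Fv2
```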
When using anyio cancel scopes inside an observed async generator, we were getting `RuntimeError: Attempted to exit cancel scope in a different task than it was entered in`. Because the...
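anyio cancel scopes are task-affine: they must be entered and exited in the same task. A wrapper that drives the underlying generator from within its own `async for` loop keeps everything in one task. A minimal sketch of that pattern using stdlib asyncio (the `observe` name is illustrative, not the SDK's decorator):

```python
import asyncio

async def numbers():
    # stand-in for an observed async generator; in the real case its body
    # may open an anyio cancel scope around each yield
    for i in range(3):
        yield i

def observe(gen_fn):
    # Re-yield items inside the same task instead of pulling them with
    # anext() from a helper task; any cancel scope opened inside the
    # generator then enters and exits in one task.
    def decorator(*args, **kwargs):
        async def wrapper():
            async for item in gen_fn(*args, **kwargs):
                yield item
        return wrapper()
    return decorator

async def main():
    return [x async for x in observe(numbers)()]

print(asyncio.run(main()))  # → [0, 1, 2]
```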
Bumps [ruff](https://github.com/astral-sh/ruff) from 0.12.12 to 0.14.2. Release notes Sourced from ruff's releases. 0.14.2 Release Notes Released on 2025-10-23. Preview features [flake8-gettext] Resolve qualified names and built-in bindings (INT001, INT002, INT003)...
### Problem

When a prompt label is removed in the Langfuse UI and the cached entry expires, `get_prompt` kept returning the stale prompt. The refresh worker logged a `404 NotFoundError`...
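The direction of the fix can be sketched as cache invalidation on a 404 from the background refresh (class and method names are hypothetical, not the SDK's actual cache API):

```python
class PromptCache:
    """Minimal sketch: evict a cached prompt when the background refresh
    reports that the server no longer has it, so get_prompt stops serving
    the stale entry."""

    def __init__(self):
        self._cache = {}

    def set(self, key, prompt):
        self._cache[key] = prompt

    def get(self, key):
        return self._cache.get(key)

    def on_refresh_error(self, key, status_code):
        # 404 means the prompt/label was deleted server-side: drop the
        # entry. Transient errors (5xx etc.) keep the cached value.
        if status_code == 404:
            self._cache.pop(key, None)

cache = PromptCache()
cache.set("greeting:production", {"prompt": "Hi"})
cache.on_refresh_error("greeting:production", 404)
print(cache.get("greeting:production"))  # → None
```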
When Langfuse is enabled, the object passed to OpenAI's stream manager as `raw_stream` is the adapter `LangfuseResponseGeneratorAsync`. The OpenAI manager sets `self._response = raw_stream.response` and then calls `await self._response.aclose()`. `LangfuseResponseGeneratorAsync` previously...
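The contract the adapter must satisfy can be sketched as a wrapper that exposes the inner stream's `response` and delegates `aclose()` (class names and the fake inner stream are illustrative, not the SDK's actual implementation):

```python
import asyncio

class ResponseGeneratorAsync:
    """Sketch of a stream adapter: OpenAI's stream manager reads
    `.response` off the raw stream and later awaits `.aclose()` on it,
    so the wrapper must expose both and delegate to the wrapped stream."""

    def __init__(self, inner):
        self._inner = inner

    @property
    def response(self):
        return self._inner.response

    async def aclose(self):
        # Delegate so the underlying HTTP response is actually closed.
        await self._inner.aclose()

class _FakeInner:
    # stand-in for the real raw stream
    response = "http-response"
    closed = False
    async def aclose(self):
        self.closed = True

inner = _FakeInner()
wrapper = ResponseGeneratorAsync(inner)
asyncio.run(wrapper.aclose())
print(wrapper.response, inner.closed)  # → http-response True
```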
Fixes a bug where Langfuse trace attributes were not passed to the Langfuse UI when the call was inside an LLM chain. https://github.com/langfuse/langfuse/issues/10380

----

> [!IMPORTANT]
> Fix bug in `CallbackHandler.py`...
When using the OpenAI Agents SDK, an `Omit` sentinel can be passed as metadata, raising the "metadata must be a dictionary" error.

----

> [!IMPORTANT]
> Adds handling for `Omit` type...
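The shape of the guard can be sketched as a coercion step that drops non-dict sentinels instead of raising (the helper name and the stand-in `Omit` class are hypothetical; the real sentinel comes from the OpenAI SDK):

```python
def coerce_metadata(metadata):
    """Drop any non-dict value (e.g. an Omit sentinel) instead of
    raising 'metadata must be a dictionary'."""
    if not isinstance(metadata, dict):
        return None
    return metadata

class Omit:
    # stand-in for the SDK's sentinel type, used here only for illustration
    pass

print(coerce_metadata(Omit()))    # → None
print(coerce_metadata({"a": 1}))  # → {'a': 1}
```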