# Feature Request: Add Langfuse Tracing

## Summary
Add tracing/observability to OpenCode using Langfuse.
## Why
- Currently no way to trace execution or monitor performance
- Difficult to debug issues in production
- Missing insights on usage patterns and bottlenecks
## Proposal
Integrate Langfuse (https://langfuse.com/) for tracing, which is already supported by Goose (https://block.github.io/goose/) per their documentation.
## Benefits
- Simple debugging
- Performance monitoring
- Usage analytics
- Low implementation effort
Langfuse supports OTel, so preferably we would instrument with OTel and have Langfuse as one of N possible OTel span processors.
https://langfuse.com/docs/opentelemetry/get-started
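A sketch of that shape, assuming the current `@opentelemetry/sdk-node` API (which accepts a `spanProcessors` array) and a second OTLP backend alongside Langfuse — package names and options here are a sketch to double-check, not a verified setup:

```typescript
import { NodeSDK } from "@opentelemetry/sdk-node"
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base"
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http"
import { LangfuseSpanProcessor } from "@langfuse/otel"

// Langfuse is just one of N span processors; the SDK fans every
// finished span out to all of them.
const sdk = new NodeSDK({
  spanProcessors: [
    new LangfuseSpanProcessor(),
    new BatchSpanProcessor(new OTLPTraceExporter()), // e.g. Jaeger, Tempo, ...
  ],
})
sdk.start()
```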
👍🏼 to this. Would love to be able to opt in to emitting native instrumentation for as much of the stack as possible simply by adding the OTEL exporter env vars and leveraging the GenAI semantic conventions. I'd be happy to assist on this as well.
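For reference, the env-var-only opt-in could look like the following (endpoint path and Basic-auth format taken from Langfuse's OTLP docs; treat the exact values as an assumption to verify):

```shell
# Point any OTLP exporter at Langfuse's OTel endpoint (Langfuse Cloud EU
# shown; self-hosted deployments would use their own host). Auth is HTTP
# Basic with the public/secret key pair, base64-encoded.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://cloud.langfuse.com/api/public/otel"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Basic $(echo -n "$LANGFUSE_PUBLIC_KEY:$LANGFUSE_SECRET_KEY" | base64)"
```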
+1 to having tracing/observability, especially with the release of the SDK
Following https://github.com/sst/opencode/pull/4978, Langfuse tracing is now possible with a minimal plugin that can look something like:

```ts
/**
 * Langfuse OpenTelemetry Plugin for OpenCode
 *
 * Requires: bun add @langfuse/otel @opentelemetry/sdk-node
 * Env vars: LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGFUSE_HOST
 */
import { NodeSDK } from "@opentelemetry/sdk-node"
import { LangfuseSpanProcessor } from "@langfuse/otel"

let initialized = false

export const LangfusePlugin = async () => ({
  async config(config) {
    if (!config.open_telemetry || initialized) return
    initialized = true
    const processor = new LangfuseSpanProcessor()
    new NodeSDK({ spanProcessors: [processor] }).start()
    // // Flush frequently - required in headless mode to make sure flushes happen before process exit
    // setInterval(() => processor.forceFlush(), 500)
  },
})
```

(Probably worth finding a better solution for headless mode.)
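One option for headless mode, instead of polling with `setInterval`, is to flush once when the event loop drains. A sketch using Node's `beforeExit` hook — the `Flushable` type and `flushOnExit` helper below are hypothetical names; in the plugin the processor would be the `LangfuseSpanProcessor` instance:

```typescript
// Sketch: flush pending spans once before the process exits, rather than
// polling. Caveat: "beforeExit" fires when the event loop empties; it does
// NOT fire on process.exit() or fatal signals, so this is a partial fix.
type Flushable = { forceFlush(): Promise<void> }

function flushOnExit(processor: Flushable): void {
  let flushed = false
  process.on("beforeExit", async () => {
    if (flushed) return // guard: beforeExit can fire more than once
    flushed = true
    await processor.forceFlush()
  })
}

// Stand-in processor so the sketch is self-contained; in the plugin,
// pass the LangfuseSpanProcessor instance instead.
let calls = 0
const fakeProcessor: Flushable = {
  forceFlush: async () => { calls++ },
}
flushOnExit(fakeProcessor)
```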
Is it possible to record telemetry on a per-user level with this plugin? If not, how can user-level tracing be done in Langfuse?
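Langfuse resolves trace-level fields like the user from span attributes. Per Langfuse's OpenTelemetry docs, setting a `langfuse.user.id` attribute on a span should associate the trace with that user (the exact attribute key comes from Langfuse's OTel property mapping — verify against the current docs). With `@opentelemetry/api` that would be `trace.getActiveSpan()?.setAttribute("langfuse.user.id", userId)`. A self-contained sketch of the pattern, using a minimal span stand-in:

```typescript
// Hypothetical helper: tag a span so Langfuse attributes the trace to a
// specific user. The key "langfuse.user.id" is taken from Langfuse's
// OTel attribute mapping (an assumption to verify against current docs).
interface SpanLike {
  setAttribute(key: string, value: string): void
}

function tagLangfuseUser(span: SpanLike, userId: string): void {
  span.setAttribute("langfuse.user.id", userId)
}

// Stand-in span that records attributes, for illustration only; in real
// code this would be the active OTel span.
const attrs: Record<string, string> = {}
const span: SpanLike = {
  setAttribute: (k, v) => { attrs[k] = v },
}
tagLangfuseUser(span, "user-123")
```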