
Feature Request: Add Langfuse Tracing

Open · tito opened this issue 6 months ago · 4 comments

Summary

Add tracing/observability to OpenCode using Langfuse.

Why

  • Currently no way to trace execution or monitor performance
  • Difficult to debug issues in production
  • Missing insights on usage patterns and bottlenecks

Proposal

Integrate Langfuse (https://langfuse.com/) for tracing, which is already supported by Goose (https://block.github.io/goose/) per their documentation.

Benefits

  • Simple debugging
  • Performance monitoring
  • Usage analytics
  • Low implementation effort

tito · Jun 19 '25

Langfuse supports OTel, so preferably we could instrument with OTel and have Langfuse as one of N different OTel span processors.

https://langfuse.com/docs/opentelemetry/get-started

ColeMurray · Jun 21 '25

👍🏼 to this. I would love to be able to opt in to emitting native instrumentation for as much of the stack as possible, simply by adding the OTel exporter env vars and leveraging the GenAI semantic conventions. I'd be happy to assist on this as well.
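For reference, the generic-OTel route described above needs no Langfuse-specific SDK at all. A sketch of the exporter env vars, based on Langfuse's OpenTelemetry docs (the endpoint path and auth scheme are taken from their docs — double-check them, especially for self-hosted instances):

```shell
# Point any standard OTLP exporter at Langfuse's OTel endpoint.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://cloud.langfuse.com/api/public/otel"
# Langfuse authenticates OTLP with Basic auth over "public-key:secret-key".
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Basic $(printf '%s:%s' "$LANGFUSE_PUBLIC_KEY" "$LANGFUSE_SECRET_KEY" | base64)"
# Langfuse's OTLP endpoint speaks http/protobuf, not grpc.
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"
```

With these set, any OTel-instrumented process exports its spans to Langfuse without code changes.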

adrielp · Jul 17 '25

+1 to having tracing/observability, especially with the release of the SDK

mattzhango · Sep 07 '25

Following https://github.com/sst/opencode/pull/4978, Langfuse tracing is now possible with a minimal plugin that can look something like:

```ts
/**
 * Langfuse OpenTelemetry plugin for OpenCode
 *
 * Requires: bun add @langfuse/otel @opentelemetry/sdk-node
 * Env vars: LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGFUSE_HOST
 */
import { NodeSDK } from "@opentelemetry/sdk-node"
import { LangfuseSpanProcessor } from "@langfuse/otel"

let initialized = false

export const LangfusePlugin = async () => ({
  async config(config) {
    if (!config.open_telemetry || initialized) return
    initialized = true

    // Reads the LANGFUSE_* env vars and exports spans to Langfuse.
    const processor = new LangfuseSpanProcessor()
    new NodeSDK({ spanProcessors: [processor] }).start()

    // Flush frequently -- required in headless mode to make sure spans
    // are exported before the process exits:
    // setInterval(() => processor.forceFlush(), 500)
  },
})
```

(Probably worth finding a better solution for headless mode)
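One possible alternative to the polling flush for headless mode is to flush once when the event loop drains, via Node's `beforeExit` hook. This is only a sketch: `Flushable` below is a stand-in interface for anything with a `forceFlush()` method, such as `LangfuseSpanProcessor`:

```typescript
// Sketch: flush pending spans once as the process winds down, rather
// than polling on an interval. Note that "beforeExit" does NOT fire on
// process.exit() or fatal signals, so this is not a complete solution.
type Flushable = { forceFlush(): Promise<void> }

export function flushOnExit(processor: Flushable): void {
  process.once("beforeExit", () => {
    // Fire-and-forget: the returned promise keeps the event loop alive
    // long enough for the export to complete in most cases.
    void processor.forceFlush()
  })
}
```

In the plugin above this would replace the `setInterval` line with `flushOnExit(processor)`.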

noamzbr · Dec 05 '25

> Following #4978, Langfuse tracing is now possible with a minimal plugin that can look something like […]

Is it possible to record telemetry at a per-user level with this plugin? If not, how can user-level tracing be done in Langfuse?

modelnova-ai · Dec 06 '25