
[Bug]: "No Data" in LLM call despite input/output data being recorded

Open LiveOverflow opened this issue 10 months ago • 2 comments

What component(s) are affected?

  • [ ] Python SDK
  • [x] Opik UI
  • [ ] Opik Server
  • [ ] Documentation

Opik version

  • Opik version: 1.5.4-1114

Describe the problem

In the example screenshot I am running a LangGraph agent and recording the trace and LLM calls with Opik.

# `app` is the compiled LangGraph graph; `model` and `source_code` come from the surrounding setup
from opik.integrations.langchain import OpikTracer

tracer = OpikTracer(graph=app.get_graph(xray=True), tags=[model.model], project_name="Looping Hacker")
for s in app.stream({"source_code": source_code}, config={"callbacks": [tracer]}):
    print(s)
    tracer.flush()  # also tested whether flushing makes the data available earlier; it seems to have no effect

While the trace is running, individual LLM calls are recorded by Opik and show up under the "LLM calls" tab, but they cannot be inspected. The data only becomes available after the trace is complete.

I think this is a bug because the table clearly shows that the LLM call data has been recorded and is available from the API. It would be very helpful to be able to inspect the input/output details of LLM calls while a trace is still running, in order to follow the agent execution live.

[Screenshot: the "LLM calls" tab lists recorded calls, but opening one shows "No Data"]

Reproduction steps and code snippets

When clicking on the LLM call details, a request is sent to fetch information about the trace (which is still in progress), /api/v1/private/traces/01955680-3044-7b41-8970-316ccd7415a6, and it fails with 404 Not Found, leading to the "No Data" display.
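
For reference, the failing request can be reproduced outside the UI. This is only a sketch: it assumes a local Opik deployment reachable at http://localhost:5173 with no extra auth headers; adjust for your setup.

import requests

base_url = "http://localhost:5173"  # assumed local Opik deployment; adjust as needed
trace_id = "01955680-3044-7b41-8970-316ccd7415a6"

# While the trace is still in progress, this returns 404 Not Found,
# which is what produces the "No Data" view in the UI.
resp = requests.get(f"{base_url}/api/v1/private/traces/{trace_id}")
print(resp.status_code)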

However, other API calls that fetch data at the project level still include details about the specific LLM span, /api/v1/private/spans/stats?project_id=01955680-357c-7da0-b0e3-e08d40f87710&type=llm, which could be used to fill in the LLM call details (see the sketch after the JSON excerpt below).


{ [...], "content": [
    { [...] },
    {
        "id": "01955684-06ab-7294-b9d0-bc62eae5336b",
        "project_id": "01955680-357c-7da0-b0e3-e08d40f87710",
        "trace_id": "01955680-3044-7b41-8970-316ccd7415a6",
        "parent_span_id": "01955684-06a8-7461-8ccd-d6cb9a468a38",
        "name": "ChatOllama",
        "type": "llm",
        "start_time": "2025-03-02T11:00:47.912196Z",
        "end_time": "2025-03-02T11:00:55.726080Z",
        "input": {
            "prompts": [
                "System: You are a helpful assistant extracting [...]" // shortened
            ]
        },
        "output": {
            "generations": [
                [
                    {
                        "text": "Yes, [...]", // shortened
                        // [...]
                    }
                ]
            ],
            "llm_output": null,
            "run": null,
            "type": "LLMResult"
        },
        "metadata": {
            // [...]
        },
        "created_at": "2025-03-02T11:00:56.021973747Z",
        "last_updated_at": "2025-03-02T11:00:56.021973747Z",
        "created_by": "admin",
        "last_updated_by": "admin",
        "duration": 7813.884
    },
    { [...] },
    { [...] }
] }
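
As a possible client-side stopgap, the same span details could be pulled from this project-level endpoint while the trace is still running. This is only a minimal sketch: the base URL, the absence of auth headers, and the exact response shape are assumptions based on the request and excerpt above.

import requests

base_url = "http://localhost:5173"  # assumed local Opik deployment; adjust as needed
project_id = "01955680-357c-7da0-b0e3-e08d40f87710"

# Project-level request from this report; per the excerpt above it already
# returns the input/output of each LLM span while the parent trace is in progress.
resp = requests.get(
    f"{base_url}/api/v1/private/spans/stats",
    params={"project_id": project_id, "type": "llm"},
)
for span in resp.json().get("content", []):
    print(span["name"], span["trace_id"], span.get("input"), span.get("output"))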

Error logs or stack trace

No response

Healthcheck results

No response

LiveOverflow · Mar 02 '25 12:03

Hi @LiveOverflow! Thank you for reporting this issue in such great detail! We are going to take care of it and get back to you :)

aadereiko · Mar 10 '25 08:03

Related to #1610 - Will track progress there

jverre · Mar 24 '25 09:03

Hi! This issue should be resolved by this ticket - https://github.com/comet-ml/opik/issues/1610

Please don’t hesitate to reopen it if you’re still experiencing the same behavior.

juanferrub · Jun 18 '25 11:06