
[BUG] React Data Error

Open ASCE-D opened this issue 2 months ago • 7 comments

Minified React error #31; visit https://react.dev/errors/31?args[]=object%20with%20keys%20%7Bsuccess%2C%20content%7D for the full message or use the non-minified dev environment for full errors and additional helpful warnings.

I got this error while looking at traces in my local setup.

ASCE-D avatar Oct 31 '25 13:10 ASCE-D

Hi @ASCE-D! I'm Dosu and I’m helping the phoenix team.

React error #31 happens when a plain object is rendered directly as a React child—React expects strings, numbers, or valid React elements, not objects. In your case, it looks like an object with keys {success, content} is being rendered somewhere in the trace UI. This type of bug has come up before in Phoenix, especially when handling trace data or playground UI, and was sometimes related to how API responses or trace objects were passed to components without proper formatting or mapping to elements. You can see a similar issue and discussion here.

To debug, switch to a non-minified development build for more detailed error messages, and use React DevTools to inspect the props and children being passed to the relevant components. Look for any place where an object is being rendered directly, and update the code to either stringify the object or map its properties to elements. If you recently updated how traces are fetched or displayed, double-check those changes for places where an object might be passed as a child.
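
To complement the React DevTools route, it can also help to inspect what Phoenix actually stored for the failing spans from Python, to see whether a structured payload (e.g. one with keys success and content) is sitting in a span attribute. A minimal sketch, assuming the px.Client API, a local Phoenix at http://localhost:6006, and a placeholder project name; exact column names can vary between Phoenix versions:

import phoenix as px

client = px.Client(endpoint="http://localhost:6006")
spans = client.get_spans_dataframe(project_name="default")  # project name is a placeholder

# Look at what is stored for TOOL spans; the raw attribute values (and mime types,
# if present) are useful details to attach to a bug report like this one.
tool_spans = spans[spans["span_kind"] == "TOOL"]
cols = [c for c in ("attributes.output.value", "attributes.output.mime_type") if c in spans.columns]
print(tool_spans[cols].head())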

If you can share a minimal reproduction or more details about the trace data or component, I can help pinpoint the issue further.

To reply, just mention @dosu.



dosubot[bot] avatar Oct 31 '25 13:10 dosubot[bot]

Thank you for the report. We'll investigate.

RogerHYang avatar Oct 31 '25 20:10 RogerHYang

@ASCE-D do you have more information about when and where the error occurs so we can reproduce it? A link to the page it occurs on and the kind of data you are attempting to view would be helpful.

cephalization avatar Oct 31 '25 20:10 cephalization

I'm getting this on the Trace Details page for a span, on the Info tab.

[screenshot attached]

I had a similar setup working yesterday, but today on a new project this is all I get.

alex-dr avatar Nov 05 '25 19:11 alex-dr

@alex-dr can you share how you are instrumenting traces? Which SDK and model?

cephalization avatar Nov 05 '25 21:11 cephalization

@cephalization I was getting this in my local setup when I wanted to check the details of tool calls.

[screenshot attached]

ASCE-D avatar Nov 06 '25 11:11 ASCE-D
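
Since the minified error mentions an object with keys {success, content} and the failure shows up when viewing tool calls, one hedged guess (not a confirmed root cause) is that a tool returns a plain dict of that shape and the structured payload ends up in the span data that the Info tab renders. A minimal sketch of that pattern, plus a workaround worth testing; the tool names and agent model below are hypothetical:

import json

from pydantic_ai import Agent

agent = Agent("openai:gpt-4o-mini")  # placeholder model string

# Tool whose return value matches the {success, content} shape in the error message.
@agent.tool_plain
def search_docs(query: str) -> dict:
    return {"success": True, "content": f"results for {query}"}

# Workaround to test: return a JSON string so the trace only ever carries a string,
# never a raw object.
@agent.tool_plain
def search_docs_as_string(query: str) -> str:
    return json.dumps({"success": True, "content": f"results for {query}"})

Whether this matches the reporters' tools is an assumption. (The code below is alex-dr's actual instrumentation and model setup, shared in the next comment.)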

    def instrument_pydantic(self):
        """Instruments pydantic AI for phoenix tracing server."""
        if not self.enabled:
            return

        from opentelemetry import trace
        from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
        from opentelemetry.sdk.trace import TracerProvider
        from openinference.instrumentation.pydantic_ai import OpenInferenceSpanProcessor
        from opentelemetry.sdk.trace.export import SimpleSpanProcessor
        from pydantic_ai import Agent
        from pydantic_ai.models.instrumented import InstrumentationSettings

        # Set up the tracer provider
        tracer_provider = TracerProvider()
        trace.set_tracer_provider(tracer_provider)

        # Add the OpenInference span processor
        endpoint = self.traces_endpoint.unicode_string()
        exporter = OTLPSpanExporter(endpoint=endpoint)
        tracer_provider.add_span_processor(OpenInferenceSpanProcessor())
        tracer_provider.add_span_processor(SimpleSpanProcessor(exporter))

        Agent.instrument_all(instrument=InstrumentationSettings(version=2))

from pydantic_ai import ModelSettings
from pydantic_ai.models.openai import OpenAIChatModel
from pydantic_ai.providers.openai import OpenAIProvider

from pulse.configuration import cfg


def get_pydantic_ai_model() -> OpenAIChatModel:
    model_cls = OpenAIChatModel
    model = model_cls(
        cfg.llm.name,
        provider=OpenAIProvider(
            base_url=cfg.llm.api_base.unicode_string(),
            api_key=cfg.llm.api_key.get_secret_value(),
        ),
        settings=ModelSettings(extra_body=cfg.llm.extra_body),
    )
    return model

openapi-pydantic==0.5.1
openinference-instrumentation-pydantic-ai==0.1.9
opentelemetry-api==1.38.0
opentelemetry-exporter-otlp==1.38.0
opentelemetry-exporter-otlp-proto-common==1.38.0
opentelemetry-exporter-otlp-proto-grpc==1.38.0
opentelemetry-exporter-otlp-proto-http==1.38.0
opentelemetry-instrumentation==0.59b0
opentelemetry-instrumentation-httpx==0.59b0
opentelemetry-proto==1.38.0
opentelemetry-sdk==1.38.0
opentelemetry-semantic-conventions==0.59b0
opentelemetry-util-http==0.59b0
pydantic==2.12.4
pydantic-ai==1.11.0
pydantic-ai-slim==1.11.0
pydantic-core==2.41.5
pydantic-evals==1.11.0
pydantic-graph==1.11.0
pydantic-settings==2.11.0

@cephalization the above is my instrumentation code for pydantic_ai, my model setup, and the relevant pydantic/opentelemetry package versions.

I'll note that this error isn't 100% consistent: I can select some spans, especially earlier ones, without getting the error. But once I switch to another span that triggers it, the error then appears on all spans.

The LLM is GLM-4.5-Air running on vLLM with an OpenAI-compatible API.

alex-dr avatar Nov 06 '25 15:11 alex-dr
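
For anyone trying to reproduce this, here is a minimal, self-contained sketch based on the instrumentation shared above. It assumes a local Phoenix instance receiving OTLP/HTTP traces at http://localhost:6006/v1/traces; the model string, prompt, and tool are placeholders rather than the reporters' actual setup:

from openinference.instrumentation.pydantic_ai import OpenInferenceSpanProcessor
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from pydantic_ai import Agent
from pydantic_ai.models.instrumented import InstrumentationSettings

# Wire pydantic-ai spans to a local Phoenix instance, mirroring the setup above.
tracer_provider = TracerProvider()
trace.set_tracer_provider(tracer_provider)
tracer_provider.add_span_processor(OpenInferenceSpanProcessor())
tracer_provider.add_span_processor(
    SimpleSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces"))
)
Agent.instrument_all(instrument=InstrumentationSettings(version=2))

agent = Agent("openai:gpt-4o-mini")  # placeholder; the reporters use OpenAI-compatible backends

@agent.tool_plain
def echo(text: str) -> dict:
    # Deliberately returns a dict, to check whether tool spans with object payloads
    # trigger the Info tab error in the Phoenix UI.
    return {"success": True, "content": text}

result = agent.run_sync("Use the echo tool on the word 'hello' and report what it returns.")
print(result.output)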