
[BUG] Prompts are not displayed correctly

Open kripper opened this issue 11 months ago • 6 comments

[screenshot] Is it possible to view the prompts as Markdown instead of escaped JSON strings, which are hard on the eyes?

This is a trace of "vertex_ai/gemini-2.0-flash-exp" via the LiteLLM OTEL callback.

kripper commented Dec 21 '24

Hey @kripper - thanks for your note - it looks like you might be using OpenLLMetry, which follows different LLM semantic conventions. We are participating in the GenAI conventions group, but we currently don't support LiteLLM instrumentation from traceloop.

If you'd like to try our LiteLLM instrumentation, please check out the openinference instrumentation: https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-litellm
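For anyone who wants to try it, here is a minimal client-side sketch - assuming the openinference-instrumentation-litellm and arize-phoenix-otel packages and a Phoenix instance on the default local port; adjust names to your setup:

    import litellm
    from openinference.instrumentation.litellm import LiteLLMInstrumentor
    from phoenix.otel import register

    # Route spans to a locally running Phoenix collector.
    tracer_provider = register(endpoint="http://127.0.0.1:6006/v1/traces")

    # Patch litellm so completions emit OpenInference-convention spans,
    # which Phoenix can render as readable prompts.
    LiteLLMInstrumentor().instrument(tracer_provider=tracer_provider)

    response = litellm.completion(
        model="vertex_ai/gemini-2.0-flash-exp",
        messages=[{"role": "user", "content": "Hello!"}],
    )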

Thanks for your feedback. We're hoping we can land on a good set of conventions that all backends support!

mikeldking commented Dec 27 '24

Thanks @mikeldking. I'm a little confused. Does that mean that we can use this setting for Phoenix instead?

litellm_settings:
  callbacks: ["arize"]

environment_variables:
  ARIZE_SPACE_KEY: "default"
  #ARIZE_API_KEY: ""
  ARIZE_ENDPOINT: "http://127.0.0.1:6006/v1"  # <-- Is this endpoint correct?
  ARIZE_HTTP_ENDPOINT: "http://127.0.0.1:6006/v1"

With these settings, I'm getting this error:

Traceback (most recent call last):
  File "/usr/local/python/3.12.1/lib/python3.12/site-packages/opentelemetry/sdk/trace/export/__init__.py", line 360, in _export_batch
    self.span_exporter.export(self.spans_list[:idx])  # type: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/python/3.12.1/lib/python3.12/site-packages/opentelemetry/exporter/otlp/proto/http/trace_exporter/__init__.py", line 189, in export
    return self._export_serialized_spans(serialized_data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/python/3.12.1/lib/python3.12/site-packages/opentelemetry/exporter/otlp/proto/http/trace_exporter/__init__.py", line 159, in _export_serialized_spans
    resp = self._export(serialized_data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/python/3.12.1/lib/python3.12/site-packages/opentelemetry/exporter/otlp/proto/http/trace_exporter/__init__.py", line 133, in _export
    return self._session.post(
           ^^^^^^^^^^^^^^^^^^^
  File "/home/codespace/.local/lib/python3.12/site-packages/requests/sessions.py", line 637, in post
    return self.request("POST", url, data=data, json=json, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/codespace/.local/lib/python3.12/site-packages/requests/sessions.py", line 575, in request
    prep = self.prepare_request(req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/codespace/.local/lib/python3.12/site-packages/requests/sessions.py", line 484, in prepare_request
    p.prepare(
  File "/home/codespace/.local/lib/python3.12/site-packages/requests/models.py", line 367, in prepare
    self.prepare_url(url, params)
  File "/home/codespace/.local/lib/python3.12/site-packages/requests/models.py", line 438, in prepare_url
    raise MissingSchema(
requests.exceptions.MissingSchema: Invalid URL 'None': No scheme supplied. Perhaps you meant https://None?
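A note on the traceback: requests was handed url=None, meaning the exporter's endpoint was never populated - the ARIZE_* variables above are evidently not being read as the OTLP endpoint. The final error is reproducible in isolation:

    import requests

    # An unset endpoint reaches requests as None and fails URL preparation:
    requests.post(None)
    # requests.exceptions.MissingSchema: Invalid URL 'None': No scheme supplied.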

kripper commented Jan 08 '25

Unfortunately not - that integration was added by the LiteLLM team and it can only export telemetry to https://app.arize.com/, not to Phoenix.

Let me dig into their proxy integration a bit more (https://github.com/BerriAI/liteLLM-proxy/issues/17)

In the meantime you can use any of our integrations found here on the client side (https://docs.arize.com/phoenix/tracing/integrations-tracing) - that might be the easiest way to unblock yourself.
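Note that the client-side integrations export to whatever collector endpoint you configure, so they can point at a self-hosted Phoenix and keep data on your own network - e.g. (a sketch; PHOENIX_COLLECTOR_ENDPOINT is the variable read by the arize-phoenix-otel package, assuming a default local install):

    import os

    # Send spans to your own, locally hosted Phoenix rather than any
    # hosted service; prompt contents never leave your machine.
    os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://127.0.0.1:6006"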

Thank you for your patience.

mikeldking commented Jan 08 '25

In the meantime you can use any of our integrations found here on the client side

I would like to, but I'm afraid we are dealing with sensitive data in the prompts that cannot be exposed online.

kripper commented Jan 09 '25

Interesting. Will do. We'll look into it.

Cc @nate-mar

mikeldking commented Jan 09 '25

Hey @kripper! Just wanted to give you a heads-up that support for using Phoenix with the LiteLLM router should be coming out soon in the next release of LiteLLM. I'll ping here again once it's out. Thanks for your patience on this! 😃

nate-mar commented Feb 25 '25

The router support should be in now. If you face any issues on the router side, definitely let the LiteLLM team know - we can coordinate a fix with them.
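For anyone landing here later, the proxy config should then mirror the earlier attempt but with the Phoenix callback. A sketch only - the callback name and PHOENIX_* variable names below are my best understanding of the LiteLLM integration, so verify them against LiteLLM's current docs:

    litellm_settings:
      callbacks: ["arize_phoenix"]

    environment_variables:
      # Assumed variable name - verify against current LiteLLM docs.
      PHOENIX_COLLECTOR_HTTP_ENDPOINT: "http://127.0.0.1:6006/v1/traces"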

mikeldking commented Jun 13 '25