[BUG] Error when using nginx?
Describe the bug
I'm running llama-index and Phoenix in a container, and I want to use nginx as a reverse proxy: requests arriving on the container's exposed port 6006 should be forwarded to localhost:8080 inside the container. My nginx configuration file is as follows:
server {
    listen 6006;

    location /service1/ {
        # Trailing slash on proxy_pass strips the /service1/ prefix
        # before forwarding to Phoenix on port 8080
        proxy_pass http://localhost:8080/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
The external IP:port mapping for my container's port 6006 is https://u184955-****-********.westc.gpuhub.com:8443/. I used the following code to start Phoenix:
import os

import phoenix as px
from llama_index.core import set_global_handler

# Bind Phoenix to port 8080 (instead of the default 6006) before launching
os.environ["PHOENIX_PORT"] = "8080"
session = px.launch_app()

# Route llama-index traces to Phoenix
set_global_handler("arize_phoenix")
When I opened "http://localhost:8080/", Phoenix loaded successfully. However, when I opened "https://u184955-****-********.westc.gpuhub.com:8443/service1/", the page was completely blank (white).
Environment:
- phoenix version: 3.16.2
- Browser: Edge
Hey @kdy0912, glad to see you here!
It's a bit difficult for me to fully grasp the port-mapping + nginx config, but my instinct is that something is a bit off there.
Can I suggest an alternative? You don't actually need to run Phoenix inside your application process; you can run it as a sidecar container instead, and use OpenInference instrumentation in your app to export the traces to the Phoenix container (a sketch follows below). Here's a working example with llama-index: https://github.com/Arize-ai/openinference/tree/main/python/examples/llama-index
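For instance, a minimal sketch of starting Phoenix as its own sidecar container (the image name and port mapping below are assumptions; adjust them to your deployment):

# Run Phoenix standalone; port 6006 serves both the UI and the trace collector
docker run -p 6006:6006 arizephoenix/phoenix:latest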
The migration looks something like this.
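Here's a rough sketch of the application side, based on the linked llama-index example (the endpoint URL is an assumption for a Phoenix container reachable at localhost:6006; substitute your container's hostname):

from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

from openinference.instrumentation.llama_index import LlamaIndexInstrumentor

# Point the OTLP exporter at the Phoenix container's collector endpoint
# (assumption: Phoenix is reachable at localhost:6006 from the app)
endpoint = "http://localhost:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
trace_api.set_tracer_provider(tracer_provider)

# Instrument llama-index via OpenInference; this replaces
# set_global_handler("arize_phoenix")
LlamaIndexInstrumentor().instrument()

With this in place, you drop px.launch_app() and set_global_handler("arize_phoenix") from the application entirely, so there's nothing to reverse-proxy inside the app container.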
Hope that helps