
Rendering dashboards times out in Kubernetes (but rendering panels works fine)

Open • wkopec-tt opened this issue 4 years ago • 8 comments

What happened: I am trying to obtain an image of a whole dashboard using the image renderer in a Grafana instance set up in Kubernetes (deployed with a Helm chart).

Rendering both panels and dashboards works fine with the provided docker-compose setup on my local machine. In Kubernetes, rendering a single panel works, but attempts to render the whole dashboard (containing a single panel) fail with a timeout.

Locally:
  • works (panel): https://my-url/grafana/render/d-solo/gvjU0GIMz/testdashboardforrenderer?orgId=1&from=1596772156586&to=1596793756586&panelId=2&width=1000&height=500&tz=Europe%2FWarsaw
  • works (dashboard): https://my-url/grafana/render/d/gvjU0GIMz/testdashboardforrenderer?orgId=1&from=1596772156586&to=1596793756586&panelId=2&width=1000&height=500&tz=Europe%2FWarsaw

On Kubernetes:
  • works (panel): https://my-url/grafana/render/d-solo/gvjU0GIMz/testdashboardforrenderer?orgId=1&from=1596772156586&to=1596793756586&panelId=2&width=1000&height=500&tz=Europe%2FWarsaw
  • fails (dashboard): https://my-url/grafana/render/d/gvjU0GIMz/testdashboardforrenderer?orgId=1&from=1596772156586&to=1596793756586&panelId=2&width=1000&height=500&tz=Europe%2FWarsaw

Failure logs:

{"timeout":"60s","level":"debug","message":"Waiting for dashboard/panel to load"}

{"url":"/render?deviceScaleFactor=1.000000&domain=localhost&encoding=&height=500&renderKey=CgaBORr9GuDfERxlEDRE0CBgBkGCzWkT&timeout=60&timezone=Europe%2FWarsaw&url=http%3A%2F%2Flocalhost%3A3000%2Fd%2FgvjU0GIMz%2Ftestdashboardforrenderer%3ForgId%3D1%26from%3D1596772156586%26to%3D1596793756586%26panelId%3D2%26width%3D1000%26height%3D500%26tz%3DEurope%252FWarsaw%26render%3D1&width=1000","stack":"TimeoutError: waiting for function failed: timeout 60000ms exceeded\n    at new WaitTask (/usr/src/app/node_modules/puppeteer/lib/DOMWorld.js:549:28)\n    at DOMWorld.waitForFunction (/usr/src/app/node_modules/puppeteer/lib/DOMWorld.js:454:12)\n    at Frame.waitForFunction (/usr/src/app/node_modules/puppeteer/lib/FrameManager.js:657:28)\n    at Page.waitForFunction (/usr/src/app/node_modules/puppeteer/lib/Page.js:1144:29)\n    at Browser.<anonymous> (/usr/src/app/build/browser/browser.js:174:24)\n    at Generator.next (<anonymous>)\n    at fulfilled (/usr/src/app/build/browser/browser.js:5:58)\n    at runMicrotasks (<anonymous>)\n    at processTicksAndRejections (internal/process/task_queues.js:97:5)","level":"error","message":"Request failed"}

{"url":"http://localhost:3000/d/gvjU0GIMz/testdashboardforrenderer?orgId=1&from=1596772156586&to=1596793756586&panelId=2&width=1000&height=500&tz=Europe%2FWarsaw&render=1","level":"debug","message":"Connection closed"}

{"message":"127.0.0.1 - - [07/Aug/2020:09:52:28 +0000] \"GET /render?deviceScaleFactor=1.000000&domain=localhost&encoding=&height=500&renderKey=JTXWVv7qlBQ530L7PAevNgNDDjf1s0AE&timeout=60&timezone=Europe%2FWarsaw&url=http%3A%2F%2Flocalhost%3A3000%2Fd%2FgvjU0GIMz%2Ftestdashboardforrenderer%3ForgId%3D1%26from%3D1596772156586%26to%3D1596793756586%26panelId%3D2%26width%3D1000%26height%3D500%26tz%3DEurope%252FWarsaw%26render%3D1&width=1000 HTTP/1.1\" - - \"-\" \"Grafana/7.1.3\"\n","level":"debug"}

{"url":"/render?deviceScaleFactor=1.000000&domain=localhost&encoding=&height=500&renderKey=JTXWVv7qlBQ530L7PAevNgNDDjf1s0AE&timeout=60&timezone=Europe%2FWarsaw&url=http%3A%2F%2Flocalhost%3A3000%2Fd%2FgvjU0GIMz%2Ftestdashboardforrenderer%3ForgId%3D1%26from%3D1596772156586%26to%3D1596793756586%26panelId%3D2%26width%3D1000%26height%3D500%26tz%3DEurope%252FWarsaw%26render%3D1&width=1000","stack":"TimeoutError: waiting for function failed: timeout 60000ms exceeded\n    at new WaitTask (/usr/src/app/node_modules/puppeteer/lib/DOMWorld.js:549:28)\n    at DOMWorld.waitForFunction (/usr/src/app/node_modules/puppeteer/lib/DOMWorld.js:454:12)\n    at Frame.waitForFunction (/usr/src/app/node_modules/puppeteer/lib/FrameManager.js:657:28)\n    at Page.waitForFunction (/usr/src/app/node_modules/puppeteer/lib/Page.js:1144:29)\n    at Browser.<anonymous> (/usr/src/app/build/browser/browser.js:174:24)\n    at Generator.next (<anonymous>)\n    at fulfilled (/usr/src/app/build/browser/browser.js:5:58)\n    at runMicrotasks (<anonymous>)\n    at processTicksAndRejections (internal/process/task_queues.js:97:5)","level":"error","message":"Request failed"}

Results are the same in both configurations:

  • image renderer as a separate container
  • custom Grafana image with the image renderer plugin included (built according to https://grafana.com/docs/grafana/latest/installation/docker/#build-with-grafana-image-renderer-plugin-pre-installed)

What you expected to happen: The dashboard image can be rendered in Kubernetes.

How to reproduce it (as minimally and precisely as possible):

  1. Run Grafana using the Helm chart https://github.com/helm/charts/tree/master/stable/grafana with the following values:
image:
  tag: latest

extraContainers: |
  - name: image-render
    image: grafana/grafana-image-renderer:latest
    ports:
      - containerPort: 8081
    env:
      - name: LOG_LEVEL
        value: debug
      - name: HTTP_HOST
        value: localhost
      - name: RENDERING_VERBOSE_LOGGING
        value: "true"

envRenderSecret:
  GF_RENDERING_SERVER_URL: http://localhost:8081/render
  GF_RENDERING_CALLBACK_URL: http://localhost:3000
  GF_LOG_FILTERS: rendering:debug
  2. Create a dashboard with an example chart and save it.
  3. Render the panel's image (panel -> share -> image); it should work.
  4. Change render/d-solo to render/d to render the whole dashboard's image - it fails.

Anything else we need to know?:

Environment:

  • Grafana Image Renderer version: 2.0.0
  • Grafana version: 7.1.3
  • Installed plugin or remote renderer service: remote rendering service (separate container in Docker; behaviour is identical with the custom Grafana image that has the renderer plugin included)
  • OS Grafana Image Renderer is installed on: Kubernetes
  • User OS & Browser: Ubuntu grafana-image-renderer, Firefox 79.0
  • Others:

wkopec-tt avatar Aug 07 '20 13:08 wkopec-tt

I'm trying to run the image renderer in a similar way, as a sidecar to the main Grafana Kubernetes container, and am getting the same timeout error.

{"url":"/render?deviceScaleFactor=1.000000&domain=localhost&encoding=&height=500&renderKey=Vfuz3q6oFS3jkkZvTSfhs1LT3z7kTRl8&timeout=60&timezone=America%2FLos_Angeles&url=http%3A%2F%2Flocalhost%3A3000%2Fdashboard-solo%2Fnew%3ForgId%3D1%26from%3D1600785871283%26to%3D1600807471283%26panelId%3D2%26width%3D1000%26height%3D500%26tz%3DAmerica%252FLos_Angeles%26render%3D1&width=1000","stack":"TimeoutError: waiting for function failed: timeout 60000ms exceeded\n at new WaitTask (/usr/src/app/node_modules/puppeteer/lib/DOMWorld.js:549:28)\n at DOMWorld.waitForFunction (/usr/src/app/node_modules/puppeteer/lib/DOMWorld.js:454:12)\n at Frame.waitForFunction (/usr/src/app/node_modules/puppeteer/lib/FrameManager.js:657:28)\n at Page.waitForFunction (/usr/src/app/node_modules/puppeteer/lib/Page.js:1144:29)\n at Browser.<anonymous> (/usr/src/app/build/browser/browser.js:174:24)\n at Generator.next (<anonymous>)\n at fulfilled (/usr/src/app/build/browser/browser.js:5:58)\n at processTicksAndRejections (internal/process/task_queues.js:97:5)","level":"error","message":"Request failed"}

keithhardaway avatar Sep 22 '20 21:09 keithhardaway

I'm in the same situation. I'm running the image renderer as a Grafana extraContainer in the kube-prometheus Helm chart and I get the same timeout.

scegliau avatar Oct 13 '20 13:10 scegliau

Anybody able to resolve this issue? I'm seeing the same.

sumantmunjalyahoo avatar Nov 03 '20 22:11 sumantmunjalyahoo

Check your callback_url - I'm guessing your Grafana service doesn't run on port 3000, but rather on port 80. I changed GF_RENDERING_CALLBACK_URL to http://grafana:80/ and it started to work. You can also try increasing the GF_ALERTING_NOTIFICATION_TIMEOUT_SECONDS timeout.
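For anyone who wants to try this with the Helm values from the original report, a minimal sketch of what it could look like (the grafana Service name, port 80, and the 120-second value are assumptions about a typical setup, not something confirmed in this thread):

envRenderSecret:
  GF_RENDERING_SERVER_URL: http://localhost:8081/render
  # Callback through the Grafana Service instead of localhost:3000,
  # in case Grafana is exposed on a different port behind the Service
  GF_RENDERING_CALLBACK_URL: http://grafana:80/
  GF_LOG_FILTERS: rendering:debug
  # Optional: give alert notification rendering more time (value in seconds)
  GF_ALERTING_NOTIFICATION_TIMEOUT_SECONDS: "120"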

aberenshtein avatar Nov 08 '20 15:11 aberenshtein

@aberenshtein thank you for the suggestion - unfortunately this does not solve the problem.

wkopec-tt avatar Nov 23 '20 09:11 wkopec-tt

We're experiencing the same problem.

@aberenshtein Setting GF_ALERTING_NOTIFICATION_TIMEOUT_SECONDS does not seem to affect the timeout parameter passed. I've rechecked that the env variable is set on the pod, but the render call still has timeout=60.

nvollmar avatar Jan 22 '21 11:01 nvollmar

After some digging I found that the cause of the failing dashboard renders on Kubernetes was the GF_SERVER_ROOT_URL we had set for external access. Removing it fixed the issue and lets us render complete dashboards.

To still be able to render images, we're using an nginx reverse proxy to serve Grafana under the same subpath as the external URL. For that, GF_SERVER_SERVE_FROM_SUB_PATH has to be enabled.
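A rough sketch of how that part of the Helm values from the reproduction steps above could look (the URLs are placeholders, not our exact config, and the nginx proxy itself is not shown):

envRenderSecret:
  GF_RENDERING_SERVER_URL: http://localhost:8081/render
  GF_RENDERING_CALLBACK_URL: http://localhost:3000
  GF_LOG_FILTERS: rendering:debug
  # Intentionally no GF_SERVER_ROOT_URL pointing at the external URL here;
  # setting it was what broke dashboard rendering for us.
  # Required for the reverse-proxy / subpath setup described above
  GF_SERVER_SERVE_FROM_SUB_PATH: "true"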

nvollmar avatar Jan 25 '21 10:01 nvollmar

Facing the exact same issue. Grafana version 7.4.0, grafana/grafana-image-renderer:2.1.1

vikrant6 avatar Jul 28 '21 23:07 vikrant6