
Datasource proxy headers broken since version 8.3

Open · ThisIsQasim opened this issue 2 years ago • 16 comments

What happened: HTTP Headers are not being passed to the datasource since version 8.3.0

What you expected to happen: HTTP Headers to be passed to the datasource

How to reproduce it (as minimally and precisely as possible): Set up Grafana 8.3.0 or higher behind a reverse proxy that injects some headers (or set dataproxy.send_user_header = true in the config). Add a datasource and dump the headers it receives. Observe that headers are missing. Downgrade to 8.2.7 and the headers appear again.

Anything else we need to know?:

Environment:

  • Grafana version: >= 8.3.0
  • Data source type & version: Prometheus 2.32.1
  • OS Grafana is installed on: Docker
  • User OS & Browser: macOS & Safari
  • Grafana plugins: none

ThisIsQasim avatar Apr 13 '22 19:04 ThisIsQasim

Thanks for creating this issue, @ThisIsQasim. Are you using nginx? If so, how we handle headers changed in 8.3. Here is the thread with the fix and the link to the changelog:

https://github.com/grafana/grafana/issues/45117#issuecomment-1033842787
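
For reference, the fix discussed in that thread essentially makes the reverse proxy pass the original Host header through to Grafana. A minimal nginx sketch (the server name and upstream port are illustrative):

server {
    listen 80;
    server_name monitor.example.com;

    location / {
        # forward the original Host header so Grafana sees the host the browser used
        proxy_set_header Host $http_host;
        proxy_pass http://localhost:3000;
    }
}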

zuchka avatar Apr 13 '22 21:04 zuchka

My setup is behind ingress-nginx, so the Host header is correctly set and everything in Grafana itself works as expected, but the headers aren't being forwarded to the data source when Grafana proxies to it.

ThisIsQasim avatar Apr 13 '22 21:04 ThisIsQasim

Your linked issue appears to be different. I haven't come across the "origin not allowed" error, and this isn't specific to 8.3.5; it affects 8.3.0 and above.

ThisIsQasim avatar Apr 13 '22 22:04 ThisIsQasim

Can you please provide more detailed steps for reproduction? Thank you!

zuchka avatar Apr 14 '22 04:04 zuchka

How to reproduce it (less minimal and precise):

  • Set up Grafana 8.3.0 or higher behind a reverse proxy that injects some headers. Example for ingress-nginx:
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: grafana
  labels:
    app: grafana
  annotations:
    kubernetes.io/ingress.class: "nginx"
    cert-manager.io/cluster-issuer: "letsencrypt"
spec:
  tls:
    - hosts:
        - monitor.example.com
      secretName: grafana-tls
  rules:
    - host: monitor.example.com
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: grafana
                port: 
                  number: 3000
  • Alternatively, you can set dataproxy.send_user_header = true in the Grafana config:
# grafana.ini
[dataproxy]
logging = true
send_user_header = true
  • Add a datasource, e.g. Prometheus, and dump the incoming headers on the datasource side. You can add any IP with an HTTP server running as a datasource and observe the headers with tcpdump -i any -vvv -nnSS 'ip host <grafana_ip> and port <port specified in datasource>' (or use a small header-dumping server; see the sketch after this list).
  • Query the added datasource from Grafana. You'll notice that the requests coming from Grafana are missing headers that Grafana itself is receiving.
  • Downgrade Grafana to 8.2.7 and the headers appear again.
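
If tcpdump is inconvenient, any small HTTP server that logs request headers also works as the fake datasource target. A sketch in Go (the listen port 9090 is arbitrary; point the datasource URL at it):

package main

import (
    "log"
    "net/http"
)

func main() {
    // Log every incoming request and its headers, then reply 200 OK.
    http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
        log.Printf("%s %s", r.Method, r.URL.Path)
        for name, values := range r.Header {
            for _, v := range values {
                log.Printf("  %s: %s", name, v)
            }
        }
        w.WriteHeader(http.StatusOK)
    })
    log.Fatal(http.ListenAndServe(":9090", nil))
}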

ThisIsQasim avatar Apr 14 '22 10:04 ThisIsQasim

This also affected us; it was very puzzling for multiple days and an embarrassment for us. Downgrading to 8.2.7 fixed the issue.

mecampbellsoup avatar Jun 09 '22 15:06 mecampbellsoup

I was using this behavior to access Prometheus behind an ALB. I expect the x-amzn-oidc-data header to be passed to the backend Prometheus query request. https://docs.aws.amazon.com/elasticloadbalancing/latest/application/listener-authenticate-users.html#user-claims-encoding

I think this issue is caused by a change in the Prometheus backend code. https://github.com/grafana/grafana/blob/main/CHANGELOG.md#deprecations-4 The backend code has changed drastically recently. https://github.com/grafana/grafana/commits/v9.0.0/pkg/tsdb/prometheus

If the Prometheus query request is posted to /datasources/proxy/:id/* (https://github.com/grafana/grafana/blob/v8.2.7/pkg/api/api.go#L304-L305) and the request is processed here, the headers should be available: https://github.com/grafana/grafana/blob/v8.2.7/pkg/api/pluginproxy/ds_proxy.go
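
Conceptually, the proxy path applies header-forwarding logic along these lines. This is only a hypothetical sketch of the mechanism, not the actual ds_proxy.go code; the pass-through header names are illustrative, while X-Grafana-User is the header that send_user_header adds:

package sketch

import "net/http"

// forwardProxyHeaders mimics, in spirit, what the datasource proxy does
// before sending the outgoing request to the datasource.
func forwardProxyHeaders(incoming, outgoing *http.Request, sendUserHeader bool, login string) {
    // Pass through headers that were injected in front of Grafana
    // (names here are only examples).
    for _, name := range []string{"X-Amzn-Oidc-Data", "X-Forwarded-User"} {
        if v := incoming.Header.Get(name); v != "" {
            outgoing.Header.Set(name, v)
        }
    }
    // With dataproxy.send_user_header = true, the logged-in user is added
    // as the X-Grafana-User header on the outgoing request.
    if sendUserHeader && login != "" {
        outgoing.Header.Set("X-Grafana-User", login)
    }
}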

mtanda avatar Jul 07 '22 08:07 mtanda

@aocenas @marefr sorry for the direct mention. I expected this issue to be fixed by https://github.com/grafana/grafana/pull/51436, but it seems it is not fixed yet. Did you notice the behavior change? Is the change intentional?

mtanda avatar Jul 07 '22 09:07 mtanda

@mtanda are you still encountering this?

zuchka avatar Jul 28 '22 08:07 zuchka

@zuchka Yes, the issue is still not fixed in v9.0.5.

mtanda avatar Jul 28 '22 09:07 mtanda

Please please fix this

salanki avatar Sep 21 '22 01:09 salanki

Sorry, seems I didn't receive any notifications of this until now.

@ThisIsQasim @mtanda @salanki

  • are you using Browser or Server access mode for the affected datasources?
  • is the first problem that the user header from send_user_header = true is not included in outgoing datasource HTTP requests?
  • is the other problem that headers incoming to Grafana are not forwarded in outgoing datasource HTTP requests?
  • anything else to add?

marefr avatar Sep 21 '22 08:09 marefr

  • I am using a browser, but the headers are being injected by a reverse proxy in front of Grafana
  • My understanding of send_user_header is that Grafana generates a user header and forwards it to the datasource. The issue is that the headers Grafana receives aren't being forwarded, so I think that option is irrelevant.

Edit: even though the original issue isn't about Grafana-generated headers, those are also not working.

ThisIsQasim avatar Sep 21 '22 09:09 ThisIsQasim

Thanks.

I am using a browser, but the headers are being injected by a reverse proxy in front of Grafana

Sorry, I mean: which Access mode do you have configured for your datasource, Server or Browser? [screenshot of the datasource Access setting]

My understanding of send_user_header is that Grafana generates a user header and forwards it to the datasource. The issue is that the headers Grafana receives aren't being forwarded, so I think that option is irrelevant.

send_user_header and incoming headers are forwarded if requests go through the datasource proxy, /api/datasources/proxy/<datasource id>/. For Prometheus that is no longer true when using the Server access mode, since it uses /api/ds/query (on recent versions at least). I'm afraid we might have missed supporting the datasource proxy header logic for /api/ds/query and backend plugins in general.
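
For illustration, the two request paths look roughly like this (the datasource id and the Prometheus query are placeholders):

# Proxy access: handled by the datasource proxy, which applies the header forwarding
GET /api/datasources/proxy/<datasource id>/api/v1/query?query=up

# Server access for Prometheus on recent versions: handled by the backend plugin
POST /api/ds/query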

marefr avatar Sep 21 '22 10:09 marefr

My bad, that is set to Server, and yes, I am using Prometheus as the data source.

Thank you for the detailed response. Do you think it could be added back in a future release?

ThisIsQasim avatar Sep 21 '22 11:09 ThisIsQasim

Seems like a feature users depend on, so we probably need to add it back. We need to figure a few things out:

  • How to support this for core/builtin plugins - probably the easiest part.
  • How to support this for external plugins via https://github.com/grafana/grafana-plugin-sdk-go - the trickiest part.

Maybe we can split this work up. I'll come back when I have more information.
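
For the SDK part, the rough shape would presumably be to hand the forwarded headers to the plugin, which then copies them onto its outgoing requests. A hypothetical sketch against grafana-plugin-sdk-go, assuming Grafana actually populates QueryDataRequest.Headers (the Datasource type, its url field, and the plain GET request are illustrative):

package plugin

import (
    "context"
    "net/http"

    "github.com/grafana/grafana-plugin-sdk-go/backend"
)

// Datasource is an illustrative backend datasource instance.
type Datasource struct {
    url    string
    client *http.Client
}

// QueryData copies whatever headers Grafana forwarded onto the outgoing
// upstream request before executing it.
func (d *Datasource) QueryData(ctx context.Context, req *backend.QueryDataRequest) (*backend.QueryDataResponse, error) {
    outReq, err := http.NewRequestWithContext(ctx, http.MethodGet, d.url, nil)
    if err != nil {
        return nil, err
    }
    for name, value := range req.Headers {
        outReq.Header.Set(name, value)
    }
    resp, err := d.client.Do(outReq)
    if err != nil {
        return nil, err
    }
    defer resp.Body.Close()
    // ... translate the upstream response into data frames ...
    return backend.NewQueryDataResponse(), nil
}

The open question is then how Grafana decides which incoming headers end up in req.Headers for external plugins in the first place.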

marefr avatar Sep 21 '22 18:09 marefr

I guess we can mention the other issues https://github.com/grafana/grafana/issues/44464 and https://github.com/grafana/grafana/issues/41623, created earlier, which duplicate this problem but were left without attention.

ve4eslav avatar Oct 13 '22 10:10 ve4eslav

This is really a showstopper for upgrading to Grafana v9 for users of multitenant Loki!

How to support this for core/builtin plugins - probably the easiest part.

The comment https://github.com/grafana/grafana/issues/54159#issuecomment-1231507046 describes a probable way of fixing this. Would it be accepted as an MR?

sepich avatar Oct 14 '22 13:10 sepich

users of multitenant Loki

We recently changed the Loki datasource to forward headers properly from Grafana. https://github.com/grafana/grafana/pull/56896

I'm curious whether this is sufficient for Loki users. I realize it's not a solution for every single datasource, but that change might also unblock a few users here.

This may not be the only blocker for this issue: if the headers we need aren't passed to the plugin layer in the first place, the change I linked won't be enough. But it does at least make the Loki datasource propagate the headers it's given, which is one step closer. If it works, this could be used as a precedent for other datasources.

alexweav avatar Oct 17 '22 18:10 alexweav

I'd personally recommend not relying on this Loki feature for now; it's not 100% clear yet what the best way forward is on the datasource-sends-unrelated-headers question (see https://github.com/grafana/grafana/issues/57065). It could be that this is the way forward, but that should be an explicit decision from the plugins platform squad before we use it as a precedent.

gabor avatar Oct 19 '22 11:10 gabor

We recently changed the Loki datasource to forward headers properly from Grafana. I'm curious whether this is sufficient for Loki users.

I've tested the 9.3.0-85538pre image (commit: 05ceff5188, branch: main), and unfortunately it forwards neither the User nor the OrgId header with GF_DATAPROXY_SEND_USER_HEADER=true set.

sepich avatar Oct 26 '22 13:10 sepich

https://github.com/grafana/grafana/pull/58132 is targeted for v9.4.0 due to the large refactoring needed, and it is blocking https://github.com/grafana/grafana/pull/58646, which is also targeted for v9.4.0.

marefr avatar Nov 28 '22 10:11 marefr