Multiple Slack receivers go to the same channel
I have set up Alertmanager with multiple Slack receivers, hoping to have alerts sent to two different Slack channels that are configured with the "Incoming Webhook" application. However, for some reason Alertmanager seems to ignore the api_url in the slack_configs section and uses the URL from global.slack_api_url instead. I am not even sure why global.slack_api_url is required at all if api_url can be configured directly in slack_configs.
This is what my config looked like:
config:
  global:
    slack_api_url: "https://hooks.slack.com/services/blah1/blah1/blah1"
  route:
    receiver: "null"
    routes:
      - receiver: "slack-notifications"
        continue: true # Continue to the next route
      - matchers:
          - alertname != "Watchdog"
        group_by: ['cluster']
        group_interval: 24h # Only re-check the group for alerts every day.
        repeat_interval: 48h # Repeat sending alerts every 2 days.
        receiver: "spm-slack-notifications"
  receivers:
    - name: 'null' # Blackhole receiver
    - name: 'spm-slack-notifications'
      slack_configs:
        - api_url: "https://hooks.slack.com/services/blah1/blah1/blah1"
          channel: '#spm-testbed-status'
          send_resolved: true
    - name: 'slack-notifications'
      slack_configs:
        - api_url: "https://hooks.slack.com/services/blah1/blah2/blah2"
          channel: '#some-other-channel'
          send_resolved: true
But for some reason, both alerts are getting sent to the same channel.
It appears that both receivers are using global.slack_api_url to send their notifications instead of their individual slack_configs.api_url. This seems like a bug to me.
Running: alertmanager:v0.28.0
According to the config code, the global slack_api_url is only used when the URL is not defined in the receiver, so you should not need the global setting at all. It looks like you only set the channel on one of the receivers; I would try setting it on both. I'm not familiar with the Slack webhook, but could it be defaulting to the same channel?
Unfortunately, that's not the case. You can see in my config posted above that I am explicitly setting both api_url params in the slack_configs blocks. I also thought the global Slack API config was unnecessary, but oddly enough, if I don't set global.slack_api_url at all, Alertmanager throws an error, won't start, and tells me that it's required.
And I am actually setting the channel in both of my configs; I just didn't copy that into the config in the original post. I have updated it.
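For reference, this is a minimal sketch of the shape I would expect to work without the global setting (the webhook URL and channel name are placeholders):

```yaml
# Minimal sketch: api_url set per receiver, no global slack_api_url.
# URL and channel name below are placeholders.
route:
  receiver: "slack-notifications"
receivers:
  - name: "slack-notifications"
    slack_configs:
      - api_url: "https://hooks.slack.com/services/xxx/yyy/zzz"
        channel: "#some-channel"
        send_resolved: true
```

A config of this shape (without global.slack_api_url) is exactly what fails to load for me.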
Maybe this has been fixed since v0.28.0?
Edit: Never mind, it seems 0.28.1 is the latest version, so that's unlikely.
I think the bug must be here:
if sc.APIURL == nil && len(sc.APIURLFile) == 0 {
    if c.Global.SlackAPIURL == nil && len(c.Global.SlackAPIURLFile) == 0 {
        return errors.New("no global Slack API URL set either inline or in a file")
    }
    sc.APIURL = c.Global.SlackAPIURL
    sc.APIURLFile = c.Global.SlackAPIURLFile
}
If I don't set global.slack_api_url, I hit the error "no global Slack API URL set either inline or in a file" from the snippet above. And if I do set it, my receivers all get their sc.APIURL overwritten with the same global value, as the snippet also shows.
For whatever reason, even though my slack_configs have api_url set, sc.APIURL == nil evaluates to true. It seems sc.APIURL is not getting populated and is always nil.
The config appears to be loaded here, and this looks correct to me?
Oddly enough there is a unit test that tests this case using this config file and it works.
So I am not exactly sure why my configuration (which looks identical to the one in the test here) ends up with a nil APIURL...
Maybe this is user error, or a problem with how the kube-prometheus-stack Helm chart applies the ConfigMap. I will have to test this some more.
I think something else is going on here. I took your original config, deleted the global section, and ran it against v0.28.0 with no errors. You mention kube-prometheus-stack applying ConfigMaps, and that sounds like a good place to look. Can you run a shell on the Alertmanager pod and dump the config file?
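Something along these lines should show it, assuming kube-prometheus-stack defaults (the namespace, pod name, container, and file path will vary with your release name and operator version):

```shell
# Dump the config file the Alertmanager process actually loaded.
# Names and the path below are kube-prometheus-stack defaults; adjust for your release.
kubectl exec -n monitoring alertmanager-kube-prometheus-stack-alertmanager-0 -c alertmanager -- \
  cat /etc/alertmanager/config_out/alertmanager.env.yaml
```

If the per-receiver api_url values are missing or mangled there, the problem is in how the chart renders the config rather than in Alertmanager itself.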