Cannot send notification when monitor is created with AutoKuma
When creating a monitor with AutoKuma, the following error appears in the Uptime Kuma log whenever a notification should be sent:
```
2024-08-21T12:57:55+02:00 [MONITOR] ERROR: Cannot send notification to ntfy
Error: Error: AxiosError: Request failed with status code 400 {"code":40018,"http":400,"error":"invalid request: actions invalid; parameter 'url' is required for action 'view'","link":"https://ntfy.sh/docs/publish/#action-buttons"}
    at Ntfy.throwGeneralAxiosError (/app/server/notification-providers/notification-provider.js:38:15)
    at Ntfy.send (/app/server/notification-providers/ntfy.js:72:18)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Monitor.sendNotification (/app/server/model/monitor.js:1427:21)
    at async beat (/app/server/model/monitor.js:964:21)
    at async Timeout.safeBeat [as _onTimeout] (/app/server/model/monitor.js:1032:17)
```
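For reference, the ntfy docs linked in the error describe a JSON publish format in which every "view" action must carry a "url". My guess, based only on the error message, is that the notification Uptime Kuma tries to send for these monitors ends up with a view action whose url is missing or empty, roughly like the sketch below; the topic, title, and label values are placeholders, not the real payload:

```json
{
  "topic": "uptime-kuma",
  "title": "Monitor Down",
  "message": "Example notification body",
  "actions": [
    { "action": "view", "label": "Open Monitor" }
  ]
}
```

That would explain the 40018 "parameter 'url' is required for action 'view'" response in the log above.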
When I create the same monitor manually in Uptime Kuma, it works and a notification is sent, so I assume the problem lies with AutoKuma, although I could be wrong. I had also assumed that all AutoKuma does is enable the notification for every new monitor, yet somehow notifications work for monitors created manually but not for those created with AutoKuma.
The following is my docker-compose file; I copied and adjusted the snippet from another issue. An illustrative example of how a container picks up the WEB snippet follows right after the compose file.
```yaml
services:
  uptime-kuma:
    image: louislam/uptime-kuma:1.23.13
    container_name: uptime-kuma
    # ports:
    #   - 3001:3001 # <Host Port>:<Container Port>
    restart: unless-stopped
    environment:
      - PUID=1000
      - PGID=998
    volumes:
      - ./uptime-kuma-data:/app/data
      - /var/run/docker.sock:/var/run/docker.sock
    networks:
      - uptime-kuma
    labels:
      traefik.enable: true
      traefik.http.routers.uptime-kuma.middlewares: authentik@file
      kuma.__web: '{ "name": "Uptime Kuma", "service": "Uptime Kuma", "type": "web-group" }'

  autokuma:
    image: ghcr.io/bigboot/autokuma:0.7.0
    container_name: autokuma
    restart: unless-stopped
    environment:
      AUTOKUMA__KUMA__URL: http://uptime-kuma:3001
      # AUTOKUMA__KUMA__USERNAME: <username>
      # AUTOKUMA__KUMA__PASSWORD: <password>
      # AUTOKUMA__KUMA__MFA_TOKEN: <token>
      # AUTOKUMA__KUMA__HEADERS: "<header1_key>=<header1_value>,<header2_key>=<header2_value>,..."
      AUTOKUMA__KUMA__CALL_TIMEOUT: 5
      AUTOKUMA__KUMA__CONNECT_TIMEOUT: 5
      AUTOKUMA__TAG_NAME: AutoKuma
      AUTOKUMA__TAG_COLOR: "#42C0FB"
      AUTOKUMA__DEFAULT_SETTINGS: |-
        docker.docker_container: {{container_name}}
        http.max_redirects: 10
        *.max_retries: 3
        *.notification_id_list: { "1": true }
      AUTOKUMA__SNIPPETS__WEB: |-
        {# Assign the first snippet arg for readability #}
        {% set args = args[0] %}
        {# Generate IDs with slugify /#}
        {% set id = args.name | slugify %}
        {% if args.service %}
        {% set service_id = args.service | slugify %}
        {% endif %}
        {# Define the top level services/app naming conventions #}
        {% if args.type == "web" or args.type == "solo" %}
        {{ id }}-group.group.name: {{ args.name }}
        {% elif args.type == "web-group" %}
        {{ id }}-group.group.name: {{ args.name }}
        {{ id }}-svc-group.group.parent_name: {{ id }}-group
        {{ id }}-svc-group.group.name: {{ args.name }} App
        {% elif service_id is defined and args.type in ["redis", "mysql", "postgres", "support", "web-support"] %}
        {{ id }}-svc-group.group.parent_name: {{ service_id }}-group
        {{ id }}-svc-group.group.name: {{ args.name }}{% if args.type in ["support", "web-support"] %} App{% endif %}
        {% endif %}
        {# Web containers get https checks #}
        {% if args.type == "web-group" or args.type == "web" or args.type == "web-support" %}
        {% if args.type == "web" %}
        {% set parent = id ~ "-group" %}
        {% else %}
        {% set parent = id ~ "-svc-group" %}
        {% endif %}
        {{ id }}-https.http.parent_name: {{ parent }}
        {{ id }}-https.http.name: {{ args.name }} (Web)
        {{ id }}-https.http.url: https://{{ container_name }}.domain.tdl
        {% endif %}
        {# Database containers get db specific checks #}
        {% if args.type in ["redis", "mysql", "postgres"] %}
        {{ id }}-db.{{ args.type }}.name: {{ args.name }} (DB)
        {{ id }}-db.{{ args.type }}.parent_name: {{ id }}-svc-group
        {{ id }}-db.{{ args.type }}.database_connection_string: {{ args.db_url }}
        {% endif %}
        {# All containers get a container check #}
        {{ id }}-container.docker.name: {{ args.name }} (Container)
        {% if args.type == "web" or args.type == "solo" %}
        {% set parent_name = id ~ "-group" %}
        {% else %}
        {% set parent_name = id ~ "-svc-group" %}
        {% endif %}
        {{ id }}-container.docker.parent_name: {{ parent_name }}
        {{ id }}-container.docker.docker_container: {{ container_name }}
        {{ id }}-container.docker.docker_host: 1
      AUTOKUMA__DOCKER__HOSTS: unix:///var/run/docker.sock
      AUTOKUMA__DOCKER__LABEL_PREFIX: kuma
      # AUTOKUMA__STATIC__MONITORS:
      AUTOKUMA__LOG_DIR: /logs
      AUTOKUMA__ON_DELETE: delete
      # AUTOKUMA__DOCKER__SOURCE:
      # AUTOKUMA__DOCKER__TLS__VERIFY:
      # AUTOKUMA__DOCKER__TLS__CERT:
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./autokuma/logs:/logs
    networks:
      - uptime-kuma
    depends_on:
      - uptime-kuma
    labels:
      kuma.__web: '{ "name": "AutoKuma", "service": "Uptime Kuma", "type": "support" }'

networks:
  uptime-kuma:
    name: uptime-kuma
```
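For context, this is roughly how another container would pick up the WEB snippet's postgres branch; the service below is a made-up placeholder and not part of my real stack:

```yaml
# Hypothetical compose service, only to illustrate the snippet args.
# "service" should slugify to an existing group ("Uptime Kuma" -> uptime-kuma-group above),
# and "db_url" is whatever connection string the postgres monitor type expects.
example-db:
  image: postgres:16
  container_name: example-db
  networks:
    - uptime-kuma
  labels:
    kuma.__web: '{ "name": "Example DB", "service": "Uptime Kuma", "type": "postgres", "db_url": "postgres://user:pass@example-db:5432/example" }'
```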
Thanks for any help!
Seems to be https://github.com/louislam/uptime-kuma/issues/3274, right?
Well, the weird thing is: if I create the same monitor manually in Uptime Kuma, I receive a notification, so the notification settings seem to work. Admittedly, the action button opens an empty tab, but that's not the problem here. The problem is that I don't receive any notification at all if the monitor is created through AutoKuma. It's as if there's some hidden difference between a monitor created manually and one created through AutoKuma, and that difference causes this error.
I'd assume that's because in the Uptime Kuma UI the "url" field is pre-filled with "https://" even for monitor types where a URL doesn't make sense; that value is enough to satisfy the ntfy API.
You could just replicate this in the DEFAULT_SETTINGS.
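Something along these lines, keeping the defaults you already have and adding a wildcard url (an untested sketch):

```yaml
# Untested sketch: keep the existing defaults and add a wildcard url,
# mirroring the "https://" the web UI pre-fills. Whether this satisfies
# ntfy for every monitor type is an assumption.
AUTOKUMA__DEFAULT_SETTINGS: |-
  docker.docker_container: {{container_name}}
  http.max_redirects: 10
  *.max_retries: 3
  *.notification_id_list: { "1": true }
  *.url: https://
```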
Alright, I think I understand the problem now. I added `*.url: https://` to the DEFAULT_SETTINGS, but it doesn't get applied to the monitor when the type is docker, I assume because the docker monitor type doesn't have a "url" property.
I've added a workaround to mimic the behaviour of the web UI; this should allow it to work with stable Uptime Kuma until the mentioned fix is rolled out on their side.
Wow, that was quick, and I could already test it out with the new version. It works as expected; all notifications get sent. Thanks!