[Help/Bug?] Getting Bouncer Disabled Error
Current Behavior
I'm looking for help correcting this error. I believe I have crowdsec-openresty-bouncer.conf set up correctly.
nginx: [alert] [lua] crowdsec_openresty.conf:5):9: [Crowdsec] Bouncer Disabled
I believe the rest of CrowdSec is processing the NPM logs correctly. The bouncer config contains:
ENABLED=true
API_URL=http://CROWDSECIP:8082
API_KEY=key-from-crowdsec
I can ping CROWDSECIP from the NPM container as well.
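For reference, the LAPI port and API key can also be checked from inside the NPM container with something like this (a sketch using the placeholder values above; adjust to your setup):
curl -s -H "X-Api-Key: key-from-crowdsec" http://CROWDSECIP:8082/v1/decisions
With a valid key this should return null or a JSON list of decisions; a wrong key should come back as a 403.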
CrowdSec hasn't seen the NPM bouncer connect yet:
──────────────────────────────────────────────────────────────────────────────
 Name        IP Address   Valid   Last API pull   Type   Version   Auth Type
──────────────────────────────────────────────────────────────────────────────
 npm-proxy                ✔️                                        api-key
──────────────────────────────────────────────────────────────────────────────
Expected Behavior
No response
Steps To Reproduce
No response
Environment
- OS: Unraid
- OS version: 7.0.0
- CPU:
- Docker version: 27.0.3
- Device model:
- Browser/OS:
Container creation
Unraid template
Container log
│ Docker Image Version: n/a │
│ Docker Image Platform: linux/amd64 │
│ │
╰――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――――╯
[cont-init ] 89-info.sh: terminated successfully.
[cont-init ] 99_crowdsec-openresty-bouncer.sh: executing...
[cont-init ] 99_crowdsec-openresty-bouncer.sh: Deploy Crowdsec Openresty Bouncer..
[cont-init ] 99_crowdsec-openresty-bouncer.sh: Patch crowdsec-openresty-bouncer.conf ..
[cont-init ] 99_crowdsec-openresty-bouncer.sh: Deploy Crowdsec Templates ..
[cont-init ] 99_crowdsec-openresty-bouncer.sh: terminated successfully.
[cont-init ] all container initialization scripts executed.
[init ] giving control to process supervisor.
[supervisor ] loading services...
[supervisor ] loading service 'default'...
[supervisor ] loading service 'app'...
[supervisor ] loading service 'nginx'...
[supervisor ] loading service 'logmonitor'...
[supervisor ] service 'logmonitor' is disabled.
[supervisor ] loading service 'logrotate'...
[supervisor ] service 'logrotate' is disabled.
[supervisor ] loading service 'cert_cleanup'...
[supervisor ] all services loaded.
[supervisor ] starting services...
[supervisor ] starting service 'nginx'...
[nginx ] nginx: [alert] [lua] crowdsec_openresty.conf:5):9: [Crowdsec] Bouncer Disabled
[supervisor ] starting service 'app'...
[app ] [3/1/2025] [6:46:46 AM] [Global ] › ℹ info Using Sqlite: /data/database.sqlite
[supervisor ] all services started.
[app ] [3/1/2025] [6:46:46 AM] [Migrate ] › ℹ info Current database version: none
[app ] [3/1/2025] [6:46:46 AM] [Setup ] › ℹ info Logrotate Timer initialized
[app ] [3/1/2025] [6:46:46 AM] [Global ] › ⬤ debug CMD: logrotate -s /config/logrotate.status /etc/logrotate.d/nginx-proxy-manager
[cert_cleanup] ----------------------------------------------------------
[cert_cleanup] Let's Encrypt certificates cleanup - 2025/03/01 06:46:46
[cert_cleanup] ----------------------------------------------------------
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-4/privkey6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-4/chain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-4/fullchain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-4/cert6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-18/cert5.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-18/fullchain5.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-18/chain5.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-18/privkey5.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-3/privkey6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-3/chain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-3/cert6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-3/fullchain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-9/cert6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-9/chain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-9/fullchain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-9/privkey6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-7/chain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-7/fullchain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-7/privkey6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-7/cert6.pem.
[app ] [3/1/2025] [6:46:46 AM] [Setup ] › ℹ info Logrotate completed.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-8/chain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-8/privkey6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-8/fullchain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-8/cert6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-2/fullchain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-2/cert6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-2/chain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-2/privkey6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-19/chain5.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-19/cert5.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-19/privkey5.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-19/fullchain5.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-5/chain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-5/privkey6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-5/fullchain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-5/cert6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-1/chain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-1/privkey6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-1/fullchain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-1/cert6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-6/chain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-6/fullchain6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-6/privkey6.pem.
[cert_cleanup] Keeping /etc/letsencrypt/archive/npm-6/cert6.pem.
[cert_cleanup] 44 file(s) kept.
[cert_cleanup] 0 file(s) deleted.
[app ] [3/1/2025] [6:46:46 AM] [IP Ranges] › ℹ info Fetching IP Ranges from online services...
[app ] [3/1/2025] [6:46:46 AM] [IP Ranges] › ℹ info Fetching https://ip-ranges.amazonaws.com/ip-ranges.json
[app ] [3/1/2025] [6:46:47 AM] [IP Ranges] › ℹ info Fetching https://www.cloudflare.com/ips-v4
[app ] [3/1/2025] [6:46:47 AM] [IP Ranges] › ℹ info Fetching https://www.cloudflare.com/ips-v6
[app ] [3/1/2025] [6:46:47 AM] [SSL ] › ℹ info Let's Encrypt Renewal Timer initialized
[app ] [3/1/2025] [6:46:47 AM] [SSL ] › ℹ info Renewing SSL certs expiring within 30 days ...
[app ] [3/1/2025] [6:46:47 AM] [IP Ranges] › ℹ info IP Ranges Renewal Timer initialized
[app ] [3/1/2025] [6:46:47 AM] [Global ] › ℹ info Backend PID 449 listening on port 3000 ...
[app ] [3/1/2025] [6:46:47 AM] [SSL ] › ℹ info Completed SSL cert renew process
Container inspect
Anything else?
No response
After reinstalling the Docker container I'm now getting a bit more info, but I'm not sure how to correct it:
nginx: [error] [lua] config.lua:124: loadConfig(): unsupported configuration 'ENABLE_INTERNAL'
I use NPM and CrowdSec together in the same stack, each in a Docker container. Initially I configured everything against v24.07.1, and it worked perfectly.
Then I updated to v24.12.1, the latest, and got exactly the same error message as you did at first.
[nginx ] nginx: [alert] [lua] crowdsec_openresty.conf:5):9: [Crowdsec] Bouncer Disabled
I then entered the container running NPM
docker exec -it nginx bash
and looked at the config from which the message came.
/etc/nginx/conf.d/crowdsec_openresty.conf
Okay, apparently "Bouncer Disabled" doesn't necessarily have to be the result of an error, but ...
I then looked at the bouncer config itself, at the path that config references:
/config/crowdsec/crowdsec-openresty-bouncer.conf
and went through the lines carefully. For me, the entire configuration was repeated: first my old one, including my API key, the CrowdSec URL and so on, then a new, almost empty one with default values. It looked like this:
ENABLED=true
.
.
.
ENABLED=false
The file is apparently interpreted from top to bottom, so later occurrences of a key overwrite earlier ones. Delete everything from (and including) the second ENABLED to the end of the file, save, and once the container is restarted:
[nginx ] nginx: [alert] [lua] crowdsec_openresty.conf:5):11: [Crowdsec] Initialisation done
Everything is as it should be again. Maybe it's the same for you and that's it :)
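If you want to check for (and trim) the duplicated block rather than delete it by hand, a rough sketch like this should work inside the NPM container (it assumes the usual KEY=value layout and the path mentioned above):
CONF=/config/crowdsec/crowdsec-openresty-bouncer.conf
# list any keys that appear more than once
grep -E '^[A-Za-z_]+=' "$CONF" | cut -d= -f1 | sort | uniq -d
# back up, then keep only the lines before the second ENABLED= line
cp "$CONF" "$CONF.bak"
awk '/^ENABLED=/{c++} c<2' "$CONF.bak" > "$CONF"
After a restart the log should then show the initialisation message instead of "Bouncer Disabled".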
Interesting, thanks for pointing this out ... this was likely the issue. I had removed the image and config files and switched to a vanilla NPM instance.
Switching back after your comment, it's now working. I don't have a copy of that old config, but it's working now!
Appreciate the response on this.
@malynel
How did you get the config to persist? Every time I restart the container, it appends that new config. I tried changing the file to read-only, but the startup sequence changes its permissions, so that didn't work.
@jamcalli
Does referencing it in the volumes section of the compose file not work?
volumes:
- ./nginx/config:/config
For me it persists.
BUT:
While troubleshooting I changed the content of the file from inside the running container ... I don't know if that makes a difference. Enter the running container and edit /config/crowdsec/crowdsec-openresty-bouncer.conf directly.
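To double-check on the host that /config really is a bind mount (and not an anonymous volume that gets recreated), something like this should show it (using my container name from above):
docker inspect -f '{{ json .Mounts }}' nginx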
My volumes were configured differently, and that wouldn't work for me.
I had to enter the container (docker exec -it containername bash) and then do the following:
nano /etc/s6-overlay/s6-rc.d/cs-crowdsec-bouncer/script.sh
And then manually modify the startup script to not append that config.
Changed this:
if grep -vf /tmp/crowdsec.conf.raw /tmp/crowdsec-openresty-bouncer.conf.raw ; then
    grep -vf /tmp/crowdsec.conf.raw /tmp/crowdsec-openresty-bouncer.conf.raw > /tmp/config.newvals
    cp /data/crowdsec/crowdsec-openresty-bouncer.conf /data/crowdsec/crowdsec-openresty-bouncer.conf.bak
    grep -f /tmp/config.newvals /defaults/crowdsec/crowdsec-openresty-bouncer.conf >> /data/crowdsec/crowdsec-openresty-bouncer.conf
fi
To this:
if grep -vf /tmp/crowdsec.conf.raw /tmp/crowdsec-openresty-bouncer.conf.raw ; then
    # Create a backup of your current config
    cp /data/crowdsec/crowdsec-openresty-bouncer.conf /data/crowdsec/crowdsec-openresty-bouncer.conf.bak
    echo "Found new configuration keys, but not appending to prevent duplication"
    # The problematic line below is commented out
    # grep -f /tmp/config.newvals /defaults/crowdsec/crowdsec-openresty-bouncer.conf >> /data/crowdsec/crowdsec-openresty-bouncer.conf
fi
And then everything worked again after restarting.
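One caveat: /etc/s6-overlay lives inside the image, so this edit survives restarts but not recreating the container from a new image. If it needs to stick, bind-mounting an edited copy of the script over the original should also work (untested sketch; the host path is a placeholder):
docker run ... -v /path/on/host/script.sh:/etc/s6-overlay/s6-rc.d/cs-crowdsec-bouncer/script.sh:ro ...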
I can also report encountering this bug. It took me a while to identify.
Same issue here.
I was able to fix it by making /config/crowdsec/crowdsec-openresty-bouncer.conf read-only, since the container isn't running as root.
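For anyone landing here later, a minimal sketch of that workaround, assuming the config is bind-mounted from the host (the host path is a placeholder; use your actual appdata location):
# on the Docker host
chmod 444 /path/to/appdata/crowdsec/crowdsec-openresty-bouncer.conf
This only helps because the container user can't change the permissions back, i.e. it isn't running as root, as noted above.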