trouble scanning other subnets
Did I research?
- [x] I have searched the docs https://jokob-sk.github.io/NetAlertX/
- [x] I have searched the existing open and closed issues
- [x] I confirm my SCAN_SUBNETS is configured and tested as per https://github.com/jokob-sk/NetAlertX/blob/main/docs/SUBNETS.md
The issue occurs in the following browsers. Select at least 2.
- [ ] Firefox
- [x] Chrome
- [ ] Other (unsupported) - PRs welcome
- [ ] N/A - This is an issue with the backend
What I want to do
Scan all subnets in 192.168.0.0/16
The arp command works fine: sudo arp-scan --interface=eth0 192.168.0.0/16
However, the app is only showing the .1.0 subnet.
Relevant settings you changed
192.168.0.0/16 --interface=eth1
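For reference, a sketch of how multiple subnets are typically listed as separate SCAN_SUBNETS entries (per the SUBNETS doc linked above; the subnet values here are placeholders, not taken from this report):

```
SCAN_SUBNETS=['192.168.1.0/24 --interface=eth1','192.168.2.0/24 --interface=eth1']
```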
docker-compose.yml
❯ cat docker-compose.yml
services:
netalertx:
privileged: true
build:
dockerfile: Dockerfile
context: .
cache_from:
- type=registry,ref=docker.io/jokob-sk/netalertx:buildcache
container_name: netalertx
network_mode: host
# restart: unless-stopped
volumes:
# - ${APP_DATA_LOCATION}/netalertx_dev/config:/app/config
- ${APP_DATA_LOCATION}/netalertx/config:/app/config
# - ${APP_DATA_LOCATION}/netalertx_dev/db:/app/db
- ${APP_DATA_LOCATION}/netalertx/db:/app/db
# (optional) useful for debugging if you have issues setting up the container
- ${APP_DATA_LOCATION}/netalertx/log:/app/log
# (API: OPTION 1) use for performance
- type: tmpfs
target: /app/api
# (API: OPTION 2) use when debugging issues
# - ${DEV_LOCATION}/api:/app/api
# ---------------------------------------------------------------------------
# DELETE START anyone trying to use this file: comment out / delete BELOW lines, they are only for development purposes
- ${APP_DATA_LOCATION}/netalertx/dhcp_samples/dhcp1.leases:/mnt/dhcp1.leases
- ${APP_DATA_LOCATION}/netalertx/dhcp_samples/dhcp2.leases:/mnt/dhcp2.leases
- ${APP_DATA_LOCATION}/netalertx/dhcp_samples/pihole_dhcp_full.leases:/etc/pihole/dhcp.leases
- ${APP_DATA_LOCATION}/netalertx/dhcp_samples/pihole_dhcp_2.leases:/etc/pihole/dhcp2.leases
- ${APP_DATA_LOCATION}/pihole/etc-pihole/pihole-FTL.db:/etc/pihole/pihole-FTL.db
- ${DEV_LOCATION}/mkdocs.yml:/app/mkdocs.yml
- ${DEV_LOCATION}/docs:/app/docs
- ${DEV_LOCATION}/server:/app/server
- ${DEV_LOCATION}/test:/app/test
- ${DEV_LOCATION}/dockerfiles:/app/dockerfiles
# - ${APP_DATA_LOCATION}/netalertx/php.ini:/etc/php/8.2/fpm/php.ini
- ${DEV_LOCATION}/install:/app/install
- ${DEV_LOCATION}/front/css:/app/front/css
- ${DEV_LOCATION}/front/img:/app/front/img
- ${DEV_LOCATION}/back/update_vendors.sh:/app/back/update_vendors.sh
- ${DEV_LOCATION}/front/lib:/app/front/lib
- ${DEV_LOCATION}/front/js:/app/front/js
- ${DEV_LOCATION}/front/php:/app/front/php
- ${DEV_LOCATION}/front/deviceDetails.php:/app/front/deviceDetails.php
- ${DEV_LOCATION}/front/deviceDetailsEdit.php:/app/front/deviceDetailsEdit.php
- ${DEV_LOCATION}/front/userNotifications.php:/app/front/userNotifications.php
- ${DEV_LOCATION}/front/deviceDetailsTools.php:/app/front/deviceDetailsTools.php
- ${DEV_LOCATION}/front/deviceDetailsPresence.php:/app/front/deviceDetailsPresence.php
- ${DEV_LOCATION}/front/deviceDetailsSessions.php:/app/front/deviceDetailsSessions.php
- ${DEV_LOCATION}/front/deviceDetailsEvents.php:/app/front/deviceDetailsEvents.php
- ${DEV_LOCATION}/front/devices.php:/app/front/devices.php
- ${DEV_LOCATION}/front/events.php:/app/front/events.php
- ${DEV_LOCATION}/front/plugins.php:/app/front/plugins.php
- ${DEV_LOCATION}/front/pluginsCore.php:/app/front/pluginsCore.php
- ${DEV_LOCATION}/front/index.php:/app/front/index.php
- ${DEV_LOCATION}/front/maintenance.php:/app/front/maintenance.php
- ${DEV_LOCATION}/front/network.php:/app/front/network.php
- ${DEV_LOCATION}/front/presence.php:/app/front/presence.php
- ${DEV_LOCATION}/front/settings.php:/app/front/settings.php
- ${DEV_LOCATION}/front/systeminfo.php:/app/front/systeminfo.php
- ${DEV_LOCATION}/front/cloud_services.php:/app/front/cloud_services.php
- ${DEV_LOCATION}/front/report.php:/app/front/report.php
- ${DEV_LOCATION}/front/workflows.php:/app/front/workflows.php
- ${DEV_LOCATION}/front/appEventsCore.php:/app/front/appEventsCore.php
- ${DEV_LOCATION}/front/multiEditCore.php:/app/front/multiEditCore.php
- ${DEV_LOCATION}/front/plugins:/app/front/plugins
# DELETE END anyone trying to use this file: comment out / delete ABOVE lines, they are only for development purposes
# ---------------------------------------------------------------------------
environment:
# - APP_CONF_OVERRIDE={"SCAN_SUBNETS":"['192.168.1.0/24 --interface=eth1']","GRAPHQL_PORT":"20223","UI_theme":"Light"}
- TZ=${TZ}
- PORT=${PORT}
# ❗ DANGER ZONE BELOW - Setting ALWAYS_FRESH_INSTALL=true will delete the content of the /db & /config folders
- ALWAYS_FRESH_INSTALL=${ALWAYS_FRESH_INSTALL}
# - LOADED_PLUGINS=["DHCPLSS","PIHOLE","ASUSWRT","FREEBOX"]
What installation are you running?
Production (netalertx)
app.log
❯ tail -100 /app/log/app.log
19:11:59 [AVAHISCAN] In script
19:11:59 [Database] Opening DB
19:11:59 [AVAHISCAN] Unknown devices count: 1
19:11:59 [AVAHISCAN] Attempt 1 - Ensuring D-Bus and Avahi daemon are running...
19:11:59 [AVAHISCAN] ⚠ ERROR - Failed to add Avahi to runlevel: None
19:11:59 [AVAHISCAN] DEBUG CMD :['avahi-resolve', '-a', '192.168.1.228']
19:12:04 [AVAHISCAN] DEBUG OUTPUT : Failed to resolve address '192.168.1.228': Timeout reached
19:12:04 [AVAHISCAN] Domain Name: to
19:12:04 [AVAHISCAN] Script finished
19:12:04 [Plugins] Processed and deleted file: /app/log/plugins/last_result.AVAHISCAN.log
19:12:04 [Plugins] No output received from the plugin "AVAHISCAN"
19:12:04 [Plugin utils] ---------------------------------------------
19:12:04 [Plugin utils] display_name: NSLOOKUP (Name discovery)
19:12:04 [Plugins] Executing: python3 /app/front/plugins/nslookup_scan/nslookup.py
19:12:05 [Plugins] Output: [plugin_helper] reading config file
19:12:05 [NSLOOKUP] In script
19:12:05 [Database] Opening DB
19:12:05 [NSLOOKUP] Unknown devices count: 1
19:12:05 [NSLOOKUP]No PTR record found for IP: 192.168.1.228
19:12:05 [NSLOOKUP] Script finished
19:12:05 [Plugins] Processed and deleted file: /app/log/plugins/last_result.NSLOOKUP.log
19:12:05 [Plugins] No output received from the plugin "NSLOOKUP"
19:12:05 [Update Device Name] Trying to resolve devices without name. Unknown devices count: 1
19:12:05 [Update Device Name] Names Found (DiG/mDNS/NSLOOKUP/NBTSCAN): 0 (0/0/0/0)
19:12:05 [Update Device Name] Names Not Found : 1
19:12:05 [Notification] Check if something to report
19:12:05 [Notification] Included sections: ['new_devices', 'down_devices', 'events']
19:12:05 [Notification] No changes to report
19:12:05 [MAIN] Process: Idle
Debug enabled
- [x] I have read and followed the steps in the wiki link above and provided the required debug logs and the log section covers the time when the issue occurs.
I've tried 192.168.0.0.0/16 --interface=eth0
but also
192.168.21.0/24 --interface=eth0 192.168.24.0/24 --interface=eth0 192.168.42.0/24 --interface=eth0
No matter which subnets I configure, I only seem to be able to capture .1.0 devices.
Hi, I assume the issue is resolved as you closed it?
This looks incorrect 192.168.0.0.0/16
If you increase the mask size you need to increase the delays between scans as well.
Please make sure you read the following documentation:
https://jokob-sk.github.io/NetAlertX/REMOTE_NETWORKS/?h=remo
Sorry, that was a copy-paste typo. I meant 192.168.0.0.0/16 --interface=eth0. This should work, right?
Hi,
I don't think so. I think the correct notation is
192.168.0.0/16 --interface=eth0
As per the docs (https://jokob-sk.github.io/NetAlertX/SUBNETS/), you can verify this by running the following command in the container (replace the interface and IP mask):
sudo arp-scan --interface=eth0 192.168.1.0/24
> Sorry, was a copy paste typo I meant 192.168.0.0.0/16 --interface=eth0 This should work right?
No. This is wrong.
192.168.0.0/16 --interface=eth0
This is correct.
Hi @p1r473 ,
Can you post what you are getting when running this in the container?
sudo arp-scan 192.168.0.0/16 --interface=eth0
Also, please let me know how long this took as you will have to adjust schedules based on that time.
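As context for why the schedule matters: the arp-scan output later in this thread shows roughly 106 hosts/sec on a /24. A back-of-the-envelope extrapolation to a /16 at the same rate (a sketch only; the real rate depends on arp-scan's --bandwidth/--interval settings and your network):

```python
import ipaddress

def estimated_scan_seconds(cidr: str, hosts_per_sec: float) -> float:
    """Rough scan duration: total addresses in the CIDR / observed scan rate."""
    net = ipaddress.ip_network(cidr)
    return net.num_addresses / hosts_per_sec

# 256 hosts in 2.404 s ≈ 106.49 hosts/sec, per the /24 scan in this thread
secs = estimated_scan_seconds("192.168.0.0/16", 106.49)
print(f"~{secs / 60:.0f} minutes")  # ~10 minutes
```

So a /16 scan at that rate takes on the order of ten minutes, and the scan schedule needs to leave at least that much headroom between runs.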
Facing the same issue. netalertx runs on 10.30.0.25; all our laptops are on 10.20.0.0/24, and I would like to scan and add all devices on 10.20.0.0/24. Running sudo arp-scan 10.20.0.0/24 --interface=ens33 --vlan=30 or 10.20.0.0/24 --interface=ens33 in netalertx and on the CLI returns this (arp-scan --localnet works and returns devices on 10.30.0.0):

sudo arp-scan --interface=ens33 10.20.0.0/24
Interface: ens33, type: EN10MB, MAC: 00:50:56:b6:3c:d1, IPv4: 10.30.0.25
Starting arp-scan 1.10.0 with 256 hosts (https://github.com/royhills/arp-scan)
3 packets received by filter, 0 packets dropped by kernel
Ending arp-scan 1.10.0: 256 hosts scanned in 2.404 seconds (106.49 hosts/sec). 0 responded
I can ping any device on 10.20.0.0 from 10.30.
I tried to follow your subnet documentation, but to no avail. From other sites, I take it that arp-scan cannot scan other subnets.
Could you look into using nmap or another command to allow us to scan across subnets with netalertx?
Hi @babbahotep
Ping and ARPSCAN use different protocols, so even if you can ping devices it doesn't mean arp-scan can detect them.
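To illustrate the point (a minimal stdlib sketch; the target laptop IP is a hypothetical example, only the scanner address is from this thread): ping is ICMP and gets routed across subnets, while ARP only reaches hosts on the directly attached link, so a target outside the scanner's own subnet never answers an ARP request:

```python
import ipaddress

# The scanning host from this thread, and an example laptop on the other subnet.
host = ipaddress.ip_interface("10.30.0.25/24")
target = ipaddress.ip_address("10.20.0.5")  # hypothetical example target

on_link = target in host.network
print(on_link)  # False: ARP requests for this target go unanswered on ens33,
                # even though ICMP ping still works because it is routed
```

This is why arp-scan reports "0 responded" for 10.20.0.0/24 while ping succeeds, and why the remote-networks approaches in the docs are needed instead.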
You can try these inside of the container:
sudo arp-scan 10.20.0.0/24 --interface=ens33 --vlan=30
sudo arp-scan 10.30.0.0/24 --interface=ens33 --vlan=30
If the above doesn't work you need to use a different approach. These are described in this doc (including the NMAP approach you refer to in your message):
https://jokob-sk.github.io/NetAlertX/REMOTE_NETWORKS/?h=remote
Please read the above documentation to explore alternative solutions for your setup. Happy to provide more guidance.