osv-scanner
too many open files
Sharing because it may warrant a mention in the README or another response.

Despite repeatedly increasing the ulimit, 'too many open files' errors keep occurring when scanning the /home folder.

The ulimit was 1024; it is now 4096. Once 'ulimit -n' is above 3072, there are no immediate 'too many open files' messages.
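For reference, the limit being raised with `ulimit -n` is the per-process `RLIMIT_NOFILE` resource limit, which can also be inspected from Go. A minimal sketch (the `nofileLimit` helper is hypothetical, not part of osv-scanner):

```go
package main

import (
	"fmt"
	"syscall"
)

// nofileLimit returns the soft and hard open-file limits for this process:
// the same numbers that `ulimit -n` (soft) and `ulimit -Hn` (hard) report.
func nofileLimit() (soft, hard uint64, err error) {
	var lim syscall.Rlimit
	if err := syscall.Getrlimit(syscall.RLIMIT_NOFILE, &lim); err != nil {
		return 0, 0, err
	}
	return lim.Cur, lim.Max, nil
}

func main() {
	soft, hard, err := nofileLimit()
	if err != nil {
		panic(err)
	}
	// A process that keeps descriptors open past the soft limit gets
	// EMFILE ("too many open files") from subsequent open() calls.
	fmt.Printf("soft limit: %d, hard limit: %d\n", soft, hard)
}
```

Raising the soft limit only hides a descriptor leak; a process that never closes files will eventually exhaust any limit.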
Hmm, I feel like this is something that should be expected whenever you run any tool recursively against a potentially large number of files (which is what I assume you're doing, though you've not explicitly stated the exact command you ran), and it has an easily discovered solution, so I'm not sure it merits being mentioned in the README.

I'm pretty sure the scanner should only ever have a few files open at a time, since it reads everything into memory as it goes, and not in parallel, so there could be a bug that's causing it to hold files open longer than it needs to. However, I've not been able to reproduce this error myself, despite running the scanner recursively against a number of directories with the default 1024 ulimit.
Could you provide some more info about your setup, including OS, contents of your /home directory, how you're running the scanner, etc.?
Hello @G-Rath ,
Is it possible that the scanSBOMFile function, which is called frequently during a folder scan, is missing a `defer file.Close()`?
In the coming days I can try to replicate the error, then see if this is correct and open a PR.
Regards.
@cmaritan yup that could do it - likewise we're also not explicitly calling file.Close
for tryLoadConfig
and FromCSVFile
(though that function isn't used yet).
I'm happy to do a PR right now adding that for at least the first two, since it's good practice either way, but if you don't mind checking whether those are actually causing this issue, that would be useful info too.
FWIW: I've run into this 'issue' also when scanning other folders which are not particularly deep, such as the .cache folder for the Brave browser under /home, or a source-code repo. The system I'm running osv-scanner on is RHEL 9(.1) with SELinux enabled.
@commandline-be if you're comfortable with compiling Go binaries, could you see if the issue still happens with https://github.com/google/osv-scanner/pull/106?
It feels weird that I'm not able to reproduce this at all, since my home directory has a wide range of stuff, including very large caches for yarn and npm, all my git repos/projects, etc. I wonder if this is a RHEL9/SELinux thing? I'm running on Ubuntu 20.04 via WSLv2.
Hello, still not able to reproduce. I tried v1.0.2 on Fedora Core with SELinux enabled, but everything seems OK with a 1024 open-files limit. I used the Chrome cache folder and a git repo folder with >30K files, with about 20 lockfiles effectively processed. @commandline-be can you try running the command with SELinux disabled?
@commandline-be are you still running into this after https://github.com/google/osv-scanner/pull/106 ?
Closing due to lack of activity.
Sorry for the late response; I've not encountered the issue since.