
Large subdomain list: Error: too many open files

Open PjMpire opened this issue 6 years ago • 17 comments

I'm enumerating through a high number of subdomains from a list, and around halfway down the list I get the message "too many open files" and the enumeration stops.

[screenshots of the command and the "too many open files" error attached]

PjMpire avatar Aug 29 '18 05:08 PjMpire

Hi @HellboundTKR,

I've run into this problem in the past. To help you mitigate this issue, I have a few questions:

    1. How big is your list?
    2. Assuming you're using macOS or Linux: can you run ulimit -n and provide me the output?

Thanks

haccer avatar Aug 29 '18 05:08 haccer

Also, I would like to suggest running: cat all_subdomains.lst | sort | uniq > new_all_subdomains.lst to remove duplicate subdomains (from the screenshot it looks like you have duplicates).

haccer avatar Aug 29 '18 05:08 haccer

Okay, another update.

In an old, old version of subjack, when I was using net/http instead of fasthttp, I had a Connection: close header set, which mitigated this issue in the past.

I just pushed an update to re-add this header to each request.
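For context, a minimal net/http sketch of that mitigation (the URL and the helper name here are illustrative, not subjack's actual code):

```go
package main

import (
	"fmt"
	"net/http"
)

// newClosingRequest builds a GET request that asks the server to close
// the TCP connection after the response, so idle keep-alive sockets
// don't pile up and exhaust the process's file-descriptor limit.
func newClosingRequest(url string) (*http.Request, error) {
	req, err := http.NewRequest("GET", url, nil)
	if err != nil {
		return nil, err
	}
	req.Close = true // net/http sends "Connection: close" when this is set
	req.Header.Set("Connection", "close")
	return req, nil
}

func main() {
	req, err := newClosingRequest("https://example.com")
	if err != nil {
		panic(err)
	}
	fmt.Println("Connection header:", req.Header.Get("Connection"))
	// To actually send it: (&http.Client{}).Do(req), then close resp.Body.
}
```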

Please retest and confirm this is a working solution after you've:

  • Sorted your list to remove duplicate subdomains
  • Increased your file descriptors limit (if needed)

Thanks

haccer avatar Aug 29 '18 05:08 haccer

Hi,

The list is around 100k subdomains. My OS is ParrotOS 64-bit; ulimit -n was 1024.

I changed ulimit to unlimited, but the message is still occurring, even after the patch.

thanks

PjMpire avatar Aug 29 '18 11:08 PjMpire

@HellboundTKR that's very strange if it's still occurring after you set it to unlimited in the same session.

I'm attempting to reproduce this issue with a list containing ~246k subdomains.

➜  subjack git:(master) ✗ ulimit -n
4864

So far I've gone through over 50k of those subdomains without any errors, using the following command, similar to the one you posted above:

./subjack -w cname_list.txt -t 50 -o subjackresults.txt -ssl

Do you have an estimate of how many subdomains you're able to enumerate before you experience the 'too many open files' error?

haccer avatar Aug 29 '18 16:08 haccer

Just an update, I surpassed 100k subdomains w/o any errors

haccer avatar Aug 29 '18 16:08 haccer

Perhaps this is a low-memory / low-CPU issue? I ran into "too many open files" a long time ago on my 1 GB RAM Ubuntu box... I'm doing this current scan with the 246k subdomain list on my MacBook Pro. Are you running this on Parrot OS in a VM? With low memory/CPU?

A possible workaround would be to split the large file into chunks.
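For example (the filenames and chunk size below are placeholders):

```shell
# Deduplicate the wordlist, then split it into 50,000-line chunks
# (produces chunk_aa, chunk_ab, ...).
sort -u all_subdomains.lst > deduped.lst
split -l 50000 deduped.lst chunk_
# Then feed each chunk to subjack in turn, e.g.:
#   for f in chunk_*; do ./subjack -w "$f" -t 50 -o "results_$f" -ssl; done
```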

haccer avatar Aug 29 '18 16:08 haccer

Can confirm it chugs through my list of 750k domains quite happily on a Kali VM with 2 GB RAM.

AnotherWayIn avatar Aug 30 '18 13:08 AnotherWayIn

Yeah, it's strange; I get to about 80k domains when it stops.

I'm using it in a VirtualBox VM with 4 cores and 4 GB memory on an i7-4770K @ 4.5 GHz.

Thing is, I can chug through my list with no problems when using the SubOver tool.

PjMpire avatar Aug 30 '18 13:08 PjMpire

Subjack does make a lot more connections and requests than SubOver (which is based on an older version of Subjack) to accurately check for a possible subdomain takeover.

I'll set up a Parrot OS VM this weekend, then run a series of tests, including testing with the default Parrot OS installation and testing with my suggested performance optimizations.

If anyone reading this issue is experiencing the same problem, please comment with your OS, wordlist size, and memory and CPU details, and I will try to replicate that as well. Thanks.

haccer avatar Aug 31 '18 01:08 haccer

Same here, but I don't think it's a VM problem. You can easily fix it by adjusting the -t flag. On my 90 Mb fiber connection I'm using -t 150, but if I tune it up to 200 -> too many open files. Is there any other way to fix it? How can I speed it up?

HeisenbugHQ avatar Sep 05 '18 11:09 HeisenbugHQ

@HeisenbugHQ well, 200 threads is a lot... I don't recommend going past 100. It's important to keep in mind that the more you increase it, the harder your machine is going to have to work.

--

@everyone The underlying issue here is that the 'too many open files' error occurs when there are too many connections open.

I've done as much research as I could; the only solution for the 'too many open files' error is to raise the ulimit (ulimit -n unlimited).

Taking all of this into consideration, the only ways to remediate this issue are:

  • Make Subjack slower (cap the number of connections being made) // which I don't plan on doing
  • Properly raise your box's ulimit
  • Use a better box
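One caveat worth spelling out: ulimit changes are per-shell, which may explain reports above where raising it "didn't work". A quick sketch (the 65535 target is just an example value):

```shell
# ulimit applies only to the current shell and its children, so raise it
# in the SAME session that will run subjack.
ulimit -n                    # current soft limit
ulimit -Hn                   # hard limit (the ceiling; raising it needs root)
ulimit -n "$(ulimit -Hn)"    # raise the soft limit up to the hard limit
ulimit -n                    # confirm the new value
# ...then start subjack from this same shell, e.g.:
#   ./subjack -w list.txt -t 50 -o results.txt -ssl
```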

haccer avatar Sep 05 '18 13:09 haccer

Hi guys! I'm having the same issue with a list of 14,000 hosts:

  1. ulimit is set to unlimited
  2. My box has 4 GB RAM, 2 cores, 50 GB SSD
  3. This happens with 30 threads

hdbreaker avatar Jul 13 '19 22:07 hdbreaker

I don't know why this happens. I had no errors with a list of 1+ million subdomains at 30-50 threads, but today I tried again at 80 threads with a list of 100k subdomains and got this error.

I will try again with fewer threads and update this comment.

EDIT: Apparently if you just lower the threads it will run just fine. Play with it until you have no errors.

marcelo321 avatar Dec 25 '19 03:12 marcelo321

I ran into a similar but distinct issue while leveraging this tool. The error I hit was specifically around fingerprints.json: too many open files. I've created a PR which seems to address this issue and may address other folks' issues as well.

https://github.com/haccer/subjack/pull/49

zeknox avatar Feb 04 '20 03:02 zeknox

Same error, too many open files. Any updates now?

abi1915 avatar May 08 '20 17:05 abi1915

> Same error, too many open files. Any updates now?

Are you Turkish?

Phoenix1112 avatar May 08 '20 17:05 Phoenix1112