node-scraper

FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory

krunkosaurus opened this issue 12 years ago · 2 comments

I have a recursive script running and after about 100 scrapes I always get:

FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory

Initially I thought it was some JSON.stringify code that periodically saves the scraped data to a text file, but now I suspect it's the scraper library itself. Have you experienced this at all?
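
Roughly, the shape of the script is this (a minimal sketch: the url-plus-callback call form follows node-scraper's documented usage, but the URL list, selector, and output file are placeholders, not taken from the actual script):

```js
var scraper = require('scraper'); // node-scraper
var fs = require('fs');

var results = [];

// Scrape one URL, record a piece of it, save everything to disk,
// then recurse into the next URL. Memory grows until V8 aborts
// with CALL_AND_RETRY_2 after roughly 100 pages.
function scrapeNext(urls, i) {
  if (i >= urls.length) return;
  scraper(urls[i], function (err, $) {
    if (err) throw err;
    results.push($('title').text());
    // The periodic save originally suspected of causing the leak
    fs.writeFileSync('results.json', JSON.stringify(results));
    scrapeNext(urls, i + 1);
  });
}

scrapeNext(['http://example.com'], 0);
```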

krunkosaurus · Aug 23 '12 22:08

Could be jsdom, or a bug in Node.js itself having to do with setTimeout: http://comments.gmane.org/gmane.comp.lang.javascript.nodejs/23373
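
One way to confirm a leak before blaming any one library is to log heap usage from inside the script (a minimal diagnostic sketch using Node's built-in process.memoryUsage; the interval length is arbitrary):

```js
// Log V8 heap usage every 10 seconds; a steady climb across scrapes
// points at a leak (e.g. retained jsdom windows) rather than a spike.
setInterval(function () {
  var mem = process.memoryUsage();
  console.log('heapUsed: ' + Math.round(mem.heapUsed / 1048576) + ' MB, ' +
              'rss: ' + Math.round(mem.rss / 1048576) + ' MB');
}, 10000);
```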

krunkosaurus · Aug 26 '12 07:08

I had a lot of issues using this module when scraping constantly. After about 3 minutes the script slowed down, misfired calls, and eventually either quit with a "socket hang up" error or, when that error was bypassed, quit with the error you mentioned above. I could not find where my script was hanging inside of request until I saw this post; I did not suspect jsdom at all. After switching to a different DOM library, cheerio, I am having no delays whatsoever, and my script has been plugging along happily.
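
The switch looks roughly like this (a sketch assuming the request module mentioned above; the URL and selector are placeholders):

```js
var request = require('request');
var cheerio = require('cheerio');

request('http://example.com', function (err, res, body) {
  if (err) throw err;
  // cheerio parses the HTML string directly; there is no jsdom
  // window to keep alive (or forget to close) between scrapes.
  var $ = cheerio.load(body);
  $('a').each(function () {
    console.log($(this).attr('href'));
  });
});
```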

nickewansmith · Jun 13 '13 17:06