node-scraper
FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory
I have a recursive script running and after about 100 scrapes I always get:
FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory
Initially I thought it was some JSON.stringify code that periodically saves the scraped data to a text file, but now I suspect it's the scraper library. Have you experienced this at all?
Could be jsdom, or a bug in Node.js itself related to setTimeout: http://comments.gmane.org/gmane.comp.lang.javascript.nodejs/23373
I had a lot of issues using this module when scraping constantly. After about 3 minutes the script slowed down and misfired calls, and it eventually either quit with a socket hang up error or, when that error was bypassed, quit with the error you mentioned above. I could not figure out where my script was hanging inside of request until I saw this post... I did not suspect jsdom at all. After switching to a different DOM library, "cheerio", I am having no delays whatsoever, and my script has been plugging along happily.