node-webcrawler
Crawler is a web spider written in Node.js. It gives you the full power of jQuery on the server to parse a large number of pages as they are downloaded, asynchronously.
With this code:

```js
var Crawler = require("node-webcrawler");
var url = require('url');
var jsdom = require('jsdom');

var c = new Crawler({
    maxConnections : 10,
    jQuery: jsdom, // This will be...
```
Is there any method or option I can use to deal with asynchronously loaded data? I used CasperJS before, and its `wait` methods were very useful: http://docs.casperjs.org/en/latest/modules/casper.html#wait. Please advise, thanks.
If some tasks fail with an error, the crawler won't fire the onDrain event.