fbcrawl
csv file limit
Hi, I'm crawling the comments on a famous person's posts.
Some posts have 50,000+ comments, and I found that the Scrapy engine stops crawling at about 10,000 comments. I checked my fake FB account and it has not been banned. Is there a limit on the CSV file? If so, is there any way to work around it?
I really appreciate this tool.
Yes, and you can change it:
go to line 75 in fbcrawl.py and increase 10e5 (one million) to 10e6 (ten million) or whatever you want...
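For reference, a cap like this is usually just a counter checked inside the spider. The sketch below is a minimal, hypothetical illustration of that pattern, not the actual fbcrawl code: the attribute names (count, max_comments), the spider class, and the CSS selectors are assumptions; only the 10e5 constant comes from the answer above.

```python
# Minimal sketch of an item-count cap in a Scrapy spider (assumed structure,
# not the real fbcrawl.py). Raising max_comments lets the crawl continue
# past the default limit.
import scrapy


class CommentsSpider(scrapy.Spider):
    name = 'comments_sketch'

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.count = 0            # comments scraped so far
        self.max_comments = 10e5  # raise this (e.g. to 10e6) to scrape more

    def parse(self, response):
        # Hypothetical selector; fbcrawl's real parsing logic differs.
        for comment in response.css('div.comment'):
            if self.count >= self.max_comments:
                return            # stop yielding once the cap is reached
            self.count += 1
            yield {'text': comment.css('::text').get()}
```

Alternatively, Scrapy's built-in CLOSESPIDER_ITEMCOUNT setting can stop a crawl after a given number of items without touching the spider code, e.g. `scrapy crawl comments -s CLOSESPIDER_ITEMCOUNT=1000000`.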