Adrián Chaves
If we decided to allow yielding items from `start_requests` (https://github.com/scrapy/scrapy/issues/5289), that would be the perfect chance to rename the method to something that makes more sense if items can be...
> My vote is to not put these requests to scheduler, and send them to downloader (with some caveats).

> An alternative is to force in-memory queue for such requests...
I wonder if [scrapy-poet's implementation of web-poet's additional requests](https://github.com/scrapinghub/scrapy-poet/pull/62) could help here. It does not send requests through the scheduler, though.
If the spider only downloads content from a single FTP server, lowering the Scrapy concurrency settings should do the job. For more complex scenarios, combining FTP requests to 1 domain...
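For the single-server case, this is the kind of settings change I mean, a minimal sketch using real Scrapy setting names; the specific values are just illustrative, not a recommendation:

```python
# Illustrative Scrapy settings to throttle a spider that talks to one FTP
# server. The setting names are real Scrapy settings; the values are examples
# you would tune for the server in question.
FTP_FRIENDLY_SETTINGS = {
    "CONCURRENT_REQUESTS": 2,             # overall cap across the crawl
    "CONCURRENT_REQUESTS_PER_DOMAIN": 1,  # one request at a time per server
    "DOWNLOAD_DELAY": 1.0,                # seconds to wait between requests
}
```

These could go in the project's `settings.py` or in a spider's `custom_settings` dict.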
As long as the work is done following the guidelines of the corresponding Wikipedia projects, I think this is OK and a nice goal.
That would be great! Do make sure that you follow the guidelines of the Hindi Wikipedia, though.
Good work!

> Can I get the contributor badge?

I must say I had not noticed them until now :facepalm: From [what I’m reading](https://github.blog/2017-07-25-making-it-easier-to-grow-communities-on-github/#contributor-badges), it does not seem to be...
@faizan2700 Mind that we are talking about the Wikipedia article here, not about the Scrapy documentation. The latter is not internationalized (i.e. translatable) yet, see https://github.com/scrapy/scrapy/issues/3511
Sure. Can you add a test for it? (And maybe remove the comment; it seems unnecessary to me.)
Why did you decide to propose this change? Do you use `execute` yourself, directly or indirectly? During our tests, `settings` is never passed to this function, according to [coverage data](https://app.codecov.io/gh/scrapy/scrapy/blob/master/scrapy/cmdline.py)....