Pablo Hoffman
@kalessin check the code in `scrapy.http.request.form`; it supports submitting the values of clickable elements (which happens by default unless `dont_click` is passed). You may be able to...
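A rough, simplified sketch of that default behavior (the helper name and data shapes are illustrative assumptions, not the actual implementation in `scrapy.http.request.form`):

```python
def form_values(fields, clickables, dont_click=False):
    """Collect (name, value) pairs to submit for a form.

    fields: list of (name, value) pairs for regular inputs.
    clickables: list of (name, value) pairs for submit buttons/images.
    Unless dont_click is set, the first clickable's value is included
    in the submission, mimicking a browser clicking that button.
    """
    values = list(fields)
    if not dont_click and clickables:
        values.append(clickables[0])  # default: "click" the first clickable
    return values
```

So `form_values([("q", "scrapy")], [("btn", "Search")])` includes the button's value, while passing `dont_click=True` leaves it out.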
What state is this in, @Gallaecio? Do you plan to finish it, or should someone else on the Crawlera team pick it up? We're now working heavily on the...
Crawlera is still mentioned in many places in the README. Do we want to update all references? Otherwise, should we add a note explaining what "Crawlera" is for people who...
How about we add a `:type` qualifier to the `-s` and `-a` options? Here are some examples to illustrate its use:

```
scrapy crawl spider -a arg1:int=1
scrapy crawl spider -a ...
```
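A minimal sketch of how such a qualifier could be parsed (this is the proposal under discussion, not an existing Scrapy feature; the function name and the supported qualifiers are assumptions for illustration):

```python
import json


def parse_typed_arg(arg):
    """Parse a hypothetical 'name:type=value' spider argument.

    'name=value' without a qualifier stays a string, matching the
    current behavior of -a/-s.
    """
    key, _, value = arg.partition("=")
    name, _, qualifier = key.partition(":")
    casts = {
        "int": int,
        "float": float,
        "str": str,
        "bool": lambda v: v.lower() in ("1", "true", "yes"),
        "json": json.loads,
    }
    return name, casts.get(qualifier, str)(value)
```

For example, `parse_typed_arg("arg1:int=1")` would yield `("arg1", 1)`, while the unqualified `parse_typed_arg("arg1=1")` keeps the string `"1"`.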
@barraponto Because the types of the arguments are not known until they're used in the spider. When command-line arguments are parsed, we still don't know what is the expected...
We can have both, IMO, but we should open a separate ticket for the spider methods. So we're going to add `--set-json` and `--arg-json`? Why not use JSON as a...
I discussed this with @kmike and we agree on moving forward with `--set-json` and `--arg-json`. We may add some shortcuts in the future (`--arg-list`?), if we agree they're needed, but...
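A sketch of what decoding such a flag could look like (`--arg-json` is the proposed option, not an existing one, and the function name is an assumption):

```python
import json


def parse_json_args(raw):
    """Decode a hypothetical --arg-json value into spider kwargs.

    Requiring a JSON object keeps the result usable as keyword
    arguments, and JSON types (numbers, booleans, lists) survive
    the command line intact.
    """
    args = json.loads(raw)
    if not isinstance(args, dict):
        raise ValueError("--arg-json expects a JSON object")
    return args
```

For example, `parse_json_args('{"arg1": 1, "tags": ["a", "b"]}')` would give the spider an int and a list, which plain `-a` cannot express.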
What's the motivation behind this feature?
Thanks for clarifying, @qrilka, but this PR is not about generic Crawlera header support (which I'd also like to see); it's about a randomized UA header. Not so sure about...
Thanks, but I still think we should add a command-line flag to enable this. It could be enabled by default, but there should be a flag, not an always-on...