gitressa
Great program, thanks for making it :+1: I second what @lpirl says: email notifications would be awesome, since doing a manual check every day gets tedious.
I still think this would be a great feature :-) Extra info: Use `Ctrl+C` if you start the crawler from a bash script and want to stop it, but...
This is still relevant. Here is a list of URLs which have `"status": null`, yet the links don't work. Links are grouped by their `exception` message: * `Connection to ......
I have found this solution for filtering on multiple status values: `cat linkreport.json | jq -c '. | select([.status] | inside([301, 302, 400, 401, 403, 404, 406, 410, 416, 500,...
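For anyone without `jq` at hand, the same multi-status filter can be sketched in plain Python. This assumes `linkreport.json` holds one JSON object per line with a `status` field, which is a guess based on the `jq -c` command above; the real schema may differ:

```python
# Hypothetical sketch: filter a line-delimited link report by status code.
# The file layout (one JSON object per line, with a "status" field) is an
# assumption inferred from the jq command, not confirmed by the crawler docs.
import json

BAD_STATUSES = {301, 302, 400, 401, 403, 404, 406, 410, 416, 500}

def filter_report(lines):
    """Yield records whose status is one of the interesting codes."""
    for line in lines:
        record = json.loads(line)
        if record.get("status") in BAD_STATUSES:
            yield record

if __name__ == "__main__":
    sample = [
        '{"url": "https://example.com/a", "status": 404}',
        '{"url": "https://example.com/b", "status": 200}',
    ]
    for rec in filter_report(sample):
        print(rec["url"], rec["status"])
```

Records with `"status": null` fall through the filter, since `None` is never in the status set.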
Yes, that works fine. But I want to target the link text, not the link. Perhaps check my example again? :-)
... or do you mean that `--exclude-url` can also target the link text, not just the URL?
Thanks for clarifying. Do you think it would be worth considering adding such a feature? A pretty good argument and use case for it is if you have hundreds of...
Also, adding support for CSS selectors or XPath expressions as filters, on top of the link text, would expand the flexibility, so that would be nice as well. So something like adding...
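To make the idea concrete, here is a minimal sketch of excluding links by their link text rather than their URL, using only the standard library. Everything here is illustrative (the function and option names are made up); the crawler's real internals are not shown in this thread:

```python
# Hypothetical sketch of an --exclude-text style filter: keep links whose
# visible text does NOT contain a given pattern. Stdlib-only; a real
# implementation with CSS selectors or XPath would need an HTML library.
from html.parser import HTMLParser

class LinkTextCollector(HTMLParser):
    """Collect (text, href) pairs for every <a> element."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.links = []  # list of (link text, href)

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append(("".join(self._text).strip(), self._href))
            self._href = None

def exclude_by_text(html, pattern):
    """Return hrefs of links whose text does not contain pattern."""
    parser = LinkTextCollector()
    parser.feed(html)
    return [href for text, href in parser.links if pattern not in text]
```

With hundreds of links sharing the same text (say, "Edit"), one text filter replaces hundreds of per-URL excludes, which is the use case argued above.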
It would be immediately helpful if time was added to the timestamp like this: `Screenshot_yyyy-mm-dd_hh-mm-ss.png`
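For reference, the proposed filename format maps directly onto a `strftime` pattern; a one-function sketch (the function name is just for illustration):

```python
# Sketch of the proposed screenshot filename with the time included,
# i.e. Screenshot_yyyy-mm-dd_hh-mm-ss.png.
from datetime import datetime

def screenshot_name(now=None):
    """Build a Screenshot_yyyy-mm-dd_hh-mm-ss.png style filename."""
    now = now or datetime.now()
    return now.strftime("Screenshot_%Y-%m-%d_%H-%M-%S.png")
```

Using hyphens rather than colons in the time part keeps the name safe on filesystems that forbid `:`.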
Maybe change "Addresses" to "Fixes" in the first comment, so the PR and the issue get linked?