Peter Krantz
:-) Well, I use it for mass archiving of URLs collected by a custom crawler from JS-heavy websites. It does its job. For regular archiving, see [Heritrix](https://github.com/internetarchive/heritrix3). Sorry if...
You could check out the archiving component of warcworker - [Squidwarc](https://github.com/N0taN3rd/Squidwarc) - it has settings that may help you in archiving more links of a website (see Page + Same...
If I remember correctly, it only captures the current page and the links from that page, so it will not capture an entire website. If the website you are...
Closing
I tried to reproduce this error but could not. My Docker environment is Windows 10 with the Windows Subsystem for Linux. I set the ports to `4000:1337` and `BASE_URL=http://localhost:4000`. `docker ps` gives...
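For anyone else trying to reproduce, the port and environment combination above can be expressed as a single `docker run`. This is only a sketch of the setup described in this comment; the image name is a placeholder, not necessarily the image actually published for this project:

```shell
# Map host port 4000 to the container's internal port 1337 and
# point BASE_URL at the host-facing port, as in the comment above.
# "your-warcworker-image" is a placeholder image name.
docker run -d \
  -p 4000:1337 \
  -e BASE_URL=http://localhost:4000 \
  your-warcworker-image
```

With this mapping, the app listens on 1337 inside the container but is reached at `http://localhost:4000` from the host, which is why `BASE_URL` uses port 4000.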
Bagit is included in other build chains. Knowing that bagit follows some of the OpenSSF practices would make it easier to trust the project. I understand if it feels cumbersome...