Set up a watcher for link rot
At least some of the links referenced by the documentation will inevitably die.
We could use something like https://github.com/ephys/puppeteer-crawler to be notified when a link becomes a 404.
I still need to fix some issues with that script first.
We should also skip pages under /v2, /v3, /v4, and /v5: those versions are no longer maintained, and crawling them would drastically increase the time each check takes.
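
As a rough illustration of both points, here is a minimal sketch in TypeScript (plain `fetch` rather than puppeteer-crawler's actual API; the URL list and the path filter are assumptions):

```ts
// link-check.ts — minimal sketch, not puppeteer-crawler's actual API.
// Assumes Node 18+ (global fetch) and a pre-collected list of URLs.

// Doc versions we no longer maintain; crawling them isn't worth the time.
const EXCLUDED_PREFIXES = ["/v2", "/v3", "/v4", "/v5"];

// Skip any page that lives under an unmaintained version of the docs.
function shouldCheck(url: string): boolean {
  const { pathname } = new URL(url);
  return !EXCLUDED_PREFIXES.some((prefix) => pathname.startsWith(prefix));
}

// Return every URL that responds with a 404 (or doesn't respond at all).
async function findDeadLinks(urls: string[]): Promise<string[]> {
  const dead: string[] = [];
  for (const url of urls.filter(shouldCheck)) {
    try {
      const res = await fetch(url, { method: "HEAD", redirect: "follow" });
      if (res.status === 404) dead.push(url);
    } catch {
      dead.push(url); // DNS/network failures count as rot too
    }
  }
  return dead;
}

// Placeholder input: in the real script the URLs would come from the crawler.
findDeadLinks(["https://example.com/some-page"]).then((dead) => {
  if (dead.length > 0) {
    console.error(`Dead links:\n${dead.join("\n")}`);
    process.exitCode = 1; // non-zero exit so CI flags the run
  }
});
```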
We can probably do this with a GitHub Action that runs once a week.
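
A weekly run could look roughly like this; the workflow name, schedule, and entry-point script are all placeholders:

```yaml
# .github/workflows/link-check.yml — hypothetical weekly link-rot check
name: Link rot check

on:
  schedule:
    - cron: '0 6 * * 1' # every Monday at 06:00 UTC
  workflow_dispatch: {} # also allow manual runs

jobs:
  check-links:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npx ts-node link-check.ts # placeholder entry point
```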
Can we also add some analytics on which links people try to visit? Especially when they hit a 404, so we can see which broken links are floating around the internet.
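
One cheap way to collect that data would be a tiny snippet embedded in the docs' 404 page itself; the reporting endpoint below is hypothetical:

```ts
// 404-report.ts — hypothetical snippet run on the docs' 404 page.
// Reports the broken URL and the referrer, so we can see which dead
// links are being shared around the web.
const payload = JSON.stringify({
  url: location.href,          // the broken link the visitor followed
  referrer: document.referrer, // where they came from, if the browser shares it
});
// sendBeacon doesn't block rendering and survives page unloads;
// "/api/404-report" is a placeholder endpoint.
navigator.sendBeacon("/api/404-report", payload);
```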