Add links to supported products/protocols to the plugin README files
Since the "Eurolite freeDMX Wi-Fi" is actually out-of-production and seems to be replaced by the "eurolite freeDMX Wi-Fi AP" (which is a completely different product and doesn't reference OLA in its manual), I would make this (and probably the other product references as well) a link to its product page. Such as:
Originally posted by @kripton in https://github.com/OpenLightingProject/ola/pull/1826#discussion_r1125681281
A few other things we'll need to consider/do:
- [ ] Consider whether this actually makes sense; how often is OLA run offline?
- [ ] Check how it renders in https://docs.openlighting.org/ola/man/man1/ola_plugin_info.1.html and on the CLI
- [ ] Add some sort of link checking as a GitHub Action so broken links are flagged up (we get quite a few of these on the website tooling); a rough sketch of such a check follows below
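Roughly what such a check could look like as a Python sketch; the `plugins/*/README*` glob, the regex, and the script itself are assumptions for illustration rather than existing OLA tooling:

```python
#!/usr/bin/env python3
"""Sketch of a link checker that a GitHub Actions step could run.

Assumes plugin READMEs live under plugins/<name>/ (hypothetical glob);
adjust the pattern to wherever the links actually end up.
"""
import glob
import re
import sys
import urllib.request

URL_RE = re.compile(r'https?://[^\s<>")\]]+')


def link_ok(url):
    """Return True if the URL answers a HEAD request without an error status."""
    req = urllib.request.Request(
        url, method="HEAD", headers={"User-Agent": "ola-link-check"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status < 400
    except Exception:
        return False


broken = []
for path in glob.glob("plugins/*/README*"):
    with open(path, encoding="utf-8") as f:
        for url in URL_RE.findall(f.read()):
            if not link_ok(url):
                broken.append((path, url))

for path, url in broken:
    print(f"Broken link in {path}: {url}")
sys.exit(1 if broken else 0)
```

In a workflow this would just be a single `run:` step, so the non-zero exit status would fail the job and flag the broken links.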
> how often is OLA run offline
Not sure if this is helpful, but I often have installations where the machine is intentionally cut off from the internet after install.
> Add some sort of link checking as a GitHub Action so broken links are flagged up (we get quite a few of these on the website tooling)
We should maybe preemptively save pages that we link to (and any associated datasheets, manuals, documentation, PDFs, etc.) to the Internet Archive so that we can fall back to it once the links die. We could probably have the GitHub Action additionally check that the URL exists in the Internet Archive.
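For the "check that the URL exists in the Internet Archive" part, the Wayback Machine has a public availability endpoint that reports the closest snapshot for a URL. A minimal sketch, assuming plain urllib is enough (the helper name is made up):

```python
import json
import urllib.parse
import urllib.request


def wayback_snapshot(url):
    """Return the closest archived snapshot URL, or None if nothing is archived."""
    api = ("https://archive.org/wayback/available?url="
           + urllib.parse.quote(url, safe=""))
    with urllib.request.urlopen(api, timeout=10) as resp:
        data = json.load(resp)
    closest = data.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None


# Lets us fall back to the archived copy once the live link dies.
print(wayback_snapshot("https://www.openlighting.org/") or "not archived yet")
```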
> Not sure if this is helpful, but I often have installations where the machine is intentionally cut off from the internet after install.
Yeah, I figured that might be the case. It sounds like the counter to that is that it may well be online before the install? I guess the next logical question is: would links such as these be useful while it is online?
> > Add some sort of link checking as a GitHub Action so broken links are flagged up (we get quite a few of these on the website tooling)
>
> We should maybe preemptively save pages that we link to (and any associated datasheets, manuals, documentation, PDFs, etc.) to the Internet Archive so that we can fall back to it once the links die. We could probably have the GitHub Action additionally check that the URL exists in the Internet Archive.
That's an excellent idea! I wonder if we could even automatically push them to the Internet Archive if they're missing too? Or at least generate an easy list of what needs adding...
> Yeah, I figured that might be the case. It sounds like the counter to that is that it may well be online before the install? I guess the next logical question is: would links such as these be useful while it is online?
Realistically I would just look up whatever I need to on my laptop that accompanies me for any work like this, so the Internet Archive is maybe the best bet.
> That's an excellent idea!
There does appear to be an API with both command-line and Python interfaces! https://archive.org/developers/
I'll add this to the growing list in #1815, as this script would definitely end up in the CI.
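For the "automatically push them if they're missing" side, the simplest route looks to be the public "Save Page Now" endpoint at https://web.archive.org/save/<url>; the authenticated SPN2 API documented at archive.org/developers would be the more robust option. A hedged sketch (anonymous requests are rate limited, so this belongs in an occasional manual or local run rather than every CI build):

```python
import urllib.request


def request_archive(url):
    """Ask the Wayback Machine to capture `url` via the public Save Page Now URL."""
    req = urllib.request.Request(
        "https://web.archive.org/save/" + url,
        headers={"User-Agent": "ola-archive-links-sketch"})
    with urllib.request.urlopen(req, timeout=60) as resp:
        # The capture itself happens asynchronously; a 2xx response just
        # means the request was accepted.
        return resp.status


request_archive("https://www.openlighting.org/")
```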
> I wonder if we could even automatically push them to the Internet Archive if they're missing too? Or at least generate an easy list of what needs adding...
We can definitely generate a list and provide GH Actions annotations. I think we would probably want the script run locally to actually induce a scrape, and have the CI simply check that a scrape exists, since a scrape could take many minutes to complete. That way the pull request CI runs are not making any changes anywhere and are only performing checks (as is usually expected). Alternatively, we could have a CI script that runs only on release and scrapes everything.
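As a sketch of the annotation side, a workflow step can surface missing snapshots just by printing GitHub's `::warning::` workflow command; the `missing` input here is hypothetical and would come from the availability check sketched earlier:

```python
import sys

# URLs the availability check reported as absent from the Internet Archive
# (hypothetical input; passed on the command line for illustration).
missing = sys.argv[1:] or ["https://www.openlighting.org/"]

for url in missing:
    # "::warning::" printed to stdout inside a workflow step becomes an
    # annotation on the run without failing the job.
    print(f"::warning::No Internet Archive snapshot for {url}")

# Leave a plain list behind so a local (non-CI) run can trigger the scrapes.
with open("urls-to-archive.txt", "w", encoding="utf-8") as f:
    f.writelines(url + "\n" for url in missing)
```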