External links should not be clickable
I remember having this discussion before, but I can't find any past issue putting it in writing.
Each website that we scrape has a number of links going to external resources that will not be scraped (for a good example, try https://library.kiwix.org/viewer#lua.org_en_all/www.lua.org/about.html); when truly offline, these links lead to an error message.
Unless they are extremely careful (and not on a mobile device), users have no way of knowing in advance that clicking a link will lead them nowhere. Considering that our primary use case assumes no connectivity whatsoever, this makes for poor UX and should be prevented.
MWoffliner already behaves this way: red links (or internal links, in the case of a subset) are turned into regular text. I believe this should be the case here as well.
Once again, I advocate for a general decision turned into a documented behavior/policy.
FFT:
- There's meaning in something being a link (whether or not you can access it).
- Wikipedia marks external links with an icon, which is very practical. We've copied this in some scrapers (sotoki at least).
- Implementing such a change in a generic scraper like zimit would be fragile because we don't control the rendering at all.
- We have an external link blocker in kiwix-serve; this could be implemented in all scrapers.
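To make the discussion concrete, here is a minimal sketch of the "demote to regular text" behaviour mentioned above (as MWoffliner does for red links). This is purely illustrative, not actual MWoffliner or kiwix-serve code; the function name `demote_external_links` and the idea of passing a set of "internal" hostnames are assumptions for the example:

```python
# Illustrative sketch: strip <a> tags whose href points outside the set
# of hosts captured in the ZIM, keeping the link text as plain text.
# Internal (relative or same-host) links are left untouched.
from html.parser import HTMLParser
from urllib.parse import urlparse


class _LinkDemoter(HTMLParser):
    def __init__(self, internal_hosts):
        super().__init__(convert_charrefs=False)
        self.internal_hosts = internal_hosts
        self.out = []
        self._demoted = []  # stack: was each open <a> demoted?

    def _is_external(self, href):
        host = urlparse(href).netloc
        # Relative URLs have no netloc and are always internal.
        return bool(host) and host not in self.internal_hosts

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if self._is_external(href):
                self._demoted.append(True)
                return  # drop the opening tag, keep the inner text
            self._demoted.append(False)
        attr_str = "".join(f' {k}="{v}"' for k, v in attrs)
        self.out.append(f"<{tag}{attr_str}>")

    def handle_endtag(self, tag):
        if tag == "a" and self._demoted and self._demoted.pop():
            return  # drop the matching closing tag
        self.out.append(f"</{tag}>")

    def handle_data(self, data):
        self.out.append(data)

    def handle_entityref(self, name):
        self.out.append(f"&{name};")

    def handle_charref(self, name):
        self.out.append(f"&#{name};")


def demote_external_links(html, internal_hosts):
    demoter = _LinkDemoter(internal_hosts)
    demoter.feed(html)
    demoter.close()
    return "".join(demoter.out)


html = ('<p><a href="/about.html">About</a> and '
        '<a href="https://example.com/">an external site</a></p>')
print(demote_external_links(html, {"www.lua.org"}))
# -> <p><a href="/about.html">About</a> and an external site</p>
```

The same traversal could instead keep the `<a>` tag and add a CSS class for a Wikipedia-style external-link icon, which preserves the meaning of the link (per the first bullet above) while still signalling that it won't work offline.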
We need a general decision on this, based on a live discussion of the pros and cons.
> we have an external link blocker in kiwix-serve. this could be implemented in all scrapers.
I suspect you meant "all readers"?
Yes :)
Well, depending on the solution, it could be both.
This is a duplicate of https://github.com/openzim/warc2zim/issues/65
Closing this in favor of https://github.com/openzim/warc2zim/issues/65