Per-domain scrape rule
The Google News RSS feed contains links to different sites, and each site needs a different scrape rule.
Example of such an RSS feed: https://news.google.com/news/rss/headlines/section/geo/SanFrancisco
I guess what I actually want is https://github.com/miniflux/miniflux/blob/master/reader/scraper/rules.go, but as a flag or something that doesn't require upstreaming the changes first and waiting for the next release.
I guess we could have an ENV variable, like SCRAPER_RULES="path/to/rules.json", to make it configurable for users.
You can still define scraper rules for each feed via the user interface (edit feed page).
Yes, you can define a scrape rule per feed, but not per domain. If you check the RSS feed in the original post, it contains posts from different domains.
Ok, I see.
Not sure; this might be better suited to something like RSS-Bridge. This gets hairy fast if you try to correctly parse the entire Internet.
https://github.com/RSS-Bridge/rss-bridge