Odd `rotating-url` behavior?
On the Discord repository, there have been many pull requests made by this bot that simply change the URL of a .tar.gz source back and forth, even though its sha256 sum never changes:
- https://github.com/flathub/com.discordapp.Discord/commit/013293f3f383c0393cf7f11e45fa8cb475a92ef9
- https://github.com/flathub/com.discordapp.Discord/commit/dc25c1480f94da9b17172fbbf003e54222ec33f5
- https://github.com/flathub/com.discordapp.Discord/commit/1c2cc708c526c8ad45273f824cf96a4d2055992d
- https://github.com/flathub/com.discordapp.Discord/commit/c736f4238d2a70f8a5af1ecd9789bbb3719df8ea
- https://github.com/flathub/com.discordapp.Discord/commit/05785b4a47948326f492a120b3a766ccff0225c0
Additionally, the metainfo file is getting many duplicated <release> tags.
Maybe the bot shouldn't open a pull request if the hash of a source didn't change and it's not an extra-data type?
The War Thunder flatpak has the same issue, which forced them to disable auto-merge.
Reference #373
> On the Discord repository, there have been many pull requests made by this bot that simply change the URL of a .tar.gz source back and forth, even though its sha256 sum never changes:
The way the rotating-url checker works is that it fetches the given URL and keeps following redirects until it finds an actual file. Evidently sometimes this redirects to a URL that does not match the given pattern.
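For illustration, the redirect-following behavior described above can be sketched as a pure function (a minimal sketch only; the real checker's code and names differ, and the `redirects` dict here stands in for actual HTTP redirect responses):

```python
def resolve_final_url(start_url, redirects, max_hops=10):
    """Follow a chain of redirects (modeled as a dict mapping
    URL -> redirect target) until reaching a URL with no further
    redirect, i.e. the actual file."""
    url = start_url
    for _ in range(max_hops):
        if url not in redirects:
            return url
        url = redirects[url]
    raise RuntimeError("too many redirects")

# A canonical URL redirecting to a numbered mirror host: the final
# URL the checker records can flip between checks even though the
# file (and its sha256) is identical.
redirects = {
    "https://dl.example.com/app.tar.gz": "https://01.cdn.example.com/app.tar.gz",
}
print(resolve_final_url("https://dl.example.com/app.tar.gz", redirects))
```

If the mirror chosen varies per request, the "final" URL the checker sees rotates, which is exactly what produces the back-and-forth PRs.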
> Maybe the bot shouldn't open a pull request if the hash of a source didn't change and it's not an extra-data type?
I would prefer "if the hash of a source didn't change and the old URL still resolves". Alternatively you might consider "ignore redirects that don't match the given pattern".
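The preferred rule could be expressed as a small decision function (names here are hypothetical, not the checker's actual API):

```python
def should_open_pr(old_sha256, new_sha256, is_extra_data, old_url_resolves):
    """Suggested rule: skip the PR when only the URL rotated."""
    if new_sha256 != old_sha256:
        return True   # real content change: always update
    if is_extra_data:
        return True   # extra-data sources are fetched at install
                      # time, so the URL itself must stay current
    # Same hash, plain source: only update if the old URL went dead.
    return not old_url_resolves
```

Under this rule, a rotated mirror URL with an unchanged hash and a still-working old URL would produce no PR.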
another pr on discord's side xd https://github.com/flathub/com.discordapp.Discord/pull/444
@wjt
> "if the hash of a source didn't change and the old URL still resolves"
This would mean the file would always have to be downloaded, even when a version pattern is given. Looking at the current code, that is actually the current behavior, but re-downloading bigger applications on every check is not a great solution. Nevertheless, I created a PR that uses this approach: #439
> Alternatively you might consider "ignore redirects that don't match the given pattern".
If we ignore those redirects, we might miss new releases, especially when a provider only ever serves files from those rotating URLs (01.download.example.com, 02.download.example.com) rather than from the canonical one.
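To illustrate the risk: if the checker filtered redirects against the manifest's URL pattern, a provider that serves releases only from numbered mirror hosts would never match, and the new release would be silently skipped (pattern and hostnames here are made up for the example):

```python
import re

# Hypothetical manifest pattern that only expects the canonical host.
pattern = re.compile(r"https://download\.example\.com/app-[\d.]+\.tar\.gz")

# The provider redirects every download to a numbered mirror.
redirect_target = "https://01.download.example.com/app-2.0.tar.gz"

# A strict "ignore non-matching redirects" filter would drop this URL,
# even though it actually points at the new 2.0 release.
print(bool(pattern.match(redirect_target)))
```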