mainline
Can't download kernel package files.
Stalls on download of new kernels. Doesn't move from there.
KDE Neon 6.3 (Ubuntu 24.04 LTS based) distro.
The last one installed successfully was 6.13.5
TLDR:
The server hosting the kernel packages is sick at the moment. (I don't run that and can't fix it.)
Give it a day and try again later.
TLDR2:
Kinda-sorta almost work-around if you can't "give it a day".
Reduce "concurrent downloads" from the default 4 to 1
Settings -> Network -> Concurrent Downloads: 1
Then after that, it may still fail, but just retry; it should work within one or two retries, if not on the first try.
Each individual file may fail, but each retry may get one or two more files into the cache, and once all 4 files are cached the kernel will finally install.
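The "just retry" advice amounts to a loop like the sketch below. `flaky_fetch` here is a made-up stand-in that simulates the server 410-ing the first two attempts; in real use, the wget or aria2c download command goes in its place.

```shell
attempts=0
flaky_fetch() {
  attempts=$((attempts + 1))
  [ "$attempts" -ge 3 ]   # pretend attempts 1 and 2 get a 410
}
for i in 1 2 3 4 5; do
  if flaky_fetch; then
    echo "succeeded on attempt $attempts"
    break
  fi
  echo "attempt $i failed (410), retrying..." >&2
done
```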
DETAILS:
The server is giving status 410 errors.
410 means "gone", meaning the file has been removed from the server.
This is of course not actually true. Downloading the same URL in a browser, or with something else like wget, generally works.
The files are really there on the server and can be downloaded, just not reliably.
I think the server might be heavily loaded at the moment or something, and may get better naturally later.
I have found that if you use a browser or wget to download the same urls that mainline is trying to download, then it usually works, but the specific aria2c command that mainline is using usually gets the 410 error.
But if you remove the concurrent connections and concurrent downloads options, then it works at least half the time.
Example command that mainline uses normally (just to download the CHECKSUMS file in this case):
$ aria2c --input-file=- --no-netrc=true --no-conf=true --summary-interval=1 --auto-save-interval=1 --enable-color=false --allow-overwrite --max-file-not-found=3 --retry-wait=2 --show-console-readout=false --download-result=full --human-readable=false --connect-timeout=15 --max-concurrent-downloads=4 --max-connection-per-server=4 <<%%EOF
https://kernel.ubuntu.com/mainline/v6.14/amd64/CHECKSUMS
gid=923d4c0000000000
dir=/home/bkw/.cache/mainline/6.14
out=CHECKSUMS
%%EOF
This fails almost every time.
But just removing --max-concurrent-downloads=4 --max-connection-per-server=4 without changing anything else
makes it work most of the time, though still not every time.
If it's still doing this after a day, then I'll try to contact the server admins and ask whether there is deliberate blocking, or whether they don't know it's happening, or whether they want me to do something different to be nicer to the server, etc. And/or I'll work on adding a config option to use wget or curl instead of aria2c.
But for the moment I'm waiting to see if it's just a transient server issue that goes away.
I can't even get it to work reliably with the current Firefox... but I can get it to work with curl-impersonate 100% of the time. It works after patching mainline, too. But before fixing it that way, it might be a good idea to ask the server guys whether whatever they're doing is related to the AI scraper scourge, or because so many people are hammering the poor server using this tool...
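For reference, curl-impersonate is driven through per-browser wrapper scripts that mimic a real browser's TLS fingerprint. The exact wrapper name (curl_ff117, curl_chrome116, ...) depends on the build you install, so treat this as a guarded sketch rather than the patch I actually applied:

```shell
url=https://kernel.ubuntu.com/mainline/v6.14/amd64/CHECKSUMS
if command -v curl_ff117 >/dev/null 2>&1; then
  curl_ff117 -fsSLO "$url"   # fetch with a Firefox-like TLS fingerprint
  status=fetched
else
  status=wrapper-missing     # curl-impersonate not installed on this box
fi
echo "$status"
```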
https://app.element.io/#/room/#kernel:ubuntu.com
They characterize the traffic as "scanning the site continuously" (this app does not do that, so if that is happening, it must be someone else), so yeah, they're intentionally blocking this app specifically. Cool.
I guess this project has just grown so popular that the userbase as a whole downloads directory listings at a rate that has become noticeable. A workaround might be to host a cached index within the scope of this project. If no separate server is available, a (github-pages) mirror within this very repo (or an assets repo, similar to uAssets) could probably work. Kernel installs would of course still generate traffic, but at a much lower request frequency (how often do we actually install kernels, anyway?).
1.4.13 Added a configuration option for the user-agent string, defaulted to a fake Firefox string.
To change the default:
Grab any one from here: https://www.useragentstring.com/pages/useragentstring.php
And paste it in to: Settings -> Network -> User Agent String
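Under the hood, the option boils down to something like aria2c's --user-agent flag (my shorthand for what the setting does, not the app's literal code; the UA string below is just an illustrative one from such a list):

```shell
ua="Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/115.0"
if command -v aria2c >/dev/null 2>&1; then
  # guarded, and errors tolerated, so this is a no-op on a box without
  # aria2c or without network access
  aria2c --user-agent="$ua" --dir=/tmp --out=CHECKSUMS \
    https://kernel.ubuntu.com/mainline/v6.14/amd64/CHECKSUMS || true
else
  echo "aria2c not installed; the relevant flag is --user-agent"
fi
```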
This is temporary and dickish and I don't like it.
It will be trivial for them to block again. Even if we randomize the string from a big list of known strings, they can just count connections and block anyone that downloads more than 1 or 2 index.html files, or something.
I wish they would answer the question "What do you want me to do? I'll do that." They are the ones who don't provide the info in any form other than a hundred separate index.html files.
UPDATE
user Nils on Matrix has graciously provided a cloudflare-backed mirror of the mainline-ppa site, which can be dropped right into the existing config option for the mainline-ppa URL. You don't even need to update the app; the option already existed to allow for company-internal mirrors and the like.
Go to Settings -> Network -> Mainline-PPA URL
And paste in https://mainline.teamsforlinux.de/mainline/
It's harmless to edit the field. Just blank the field out entirely to get the built-in default value back any time.
This is most likely going to be temporary. If we're going to use a mirror by default from now on, it should probably have a better domain name than something originally made for the MS Teams client, and I need to do something about trusting the .deb files.
There may be a way to verify the index.html and CHECKSUMS files without having to download them repeatedly from the source site (gpg signatures, I think?). But even failing that, the original site admins might be convinced to lift the ban if cloudflare (or another CDN) handles 99% of the files and the app only downloads the CHECKSUMS directly from the main site. The checksums files are only downloaded when a user actually wants to install a kernel, not every time the status of kernels is scanned. That is almost no traffic at all.
Anyway, I don't know what all the details will be yet. But for right now, this is something any user can do, immediately without even updating, and it's NOT "dickish and I don't like it". It's perfectly fine and not abusing or faking anything. It's a good answer to the problem.
It's just not a complete and finished answer, because it means users are downloading kernel packages from a 3rd-party source. That can still be fine as long as the checksums can be verified, but the checksums files themselves are also coming from the same 3rd-party source. So what's still needed for a full, proper answer is either to validate the checksums and index.html files themselves with gpg signatures, or to download everything BUT the checksums files from the mirror and get just the checksums from the original site, if the site admins agree.
Ok I have verified that it should be possible to build-in verification of the checksums by gpg.
The CHECKSUMS files have an accompanying CHECKSUMS.gpg file. The gpg file can be verified against a public key from an ubuntu.com key server. So that is how you trust the .deb files: the .deb files are proven by the checksums, the checksums are proven by gpg, and the gpg key comes from keyserver.ubuntu.com.
The point is, none of this requires hitting kernel.ubuntu.com. The mirror supplies the html, deb, checksums, and gpg files, and keyserver.ubuntu.com supplies the gpg key.
Previously I didn't worry about verifying the checksums themselves, even though we were verifying the .debs with the checksums, because we were downloading the checksums from an ubuntu.com domain via https. Now that that's no longer always true, we need to take that next step.
Below is just to document how it goes. I will build this into the app and I think users will not have to do anything.
The package just gets a new dependency added for the gpg package.
download CHECKSUMS
bkw@fw:~/tmp/mainline$ wget https://mainline.teamsforlinux.de/mainline/v6.13.8/CHECKSUMS
--2025-04-04 20:52:19-- https://mainline.teamsforlinux.de/mainline/v6.13.8/CHECKSUMS
Resolving mainline.teamsforlinux.de (mainline.teamsforlinux.de)... 104.21.6.196, 172.67.135.57, 2a06:98c1:3121::3, ...
Connecting to mainline.teamsforlinux.de (mainline.teamsforlinux.de)|104.21.6.196|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 245
Saving to: ‘CHECKSUMS’
CHECKSUMS 100%[====================================================================================================================================================================================================>] 245 --.-KB/s in 0s
2025-04-04 20:52:20 (128 MB/s) - ‘CHECKSUMS’ saved [245/245]
download CHECKSUMS.gpg
bkw@fw:~/tmp/mainline$ wget https://mainline.teamsforlinux.de/mainline/v6.13.8/CHECKSUMS.gpg
--2025-04-04 20:52:24-- https://mainline.teamsforlinux.de/mainline/v6.13.8/CHECKSUMS.gpg
Resolving mainline.teamsforlinux.de (mainline.teamsforlinux.de)... 104.21.6.196, 172.67.135.57, 2a06:98c1:3121::3, ...
Connecting to mainline.teamsforlinux.de (mainline.teamsforlinux.de)|104.21.6.196|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 488
Saving to: ‘CHECKSUMS.gpg’
CHECKSUMS.gpg 100%[====================================================================================================================================================================================================>] 488 --.-KB/s in 0s
2025-04-04 20:52:25 (28.9 MB/s) - ‘CHECKSUMS.gpg’ saved [488/488]
try to verify gpg signature
bkw@fw:~/tmp/mainline$ gpg --keyid-format long --verify CHECKSUMS.gpg CHECKSUMS
gpg: Signature made Sat 22 Mar 2025 05:00:16 PM EDT
gpg: using RSA key 60AA7B6F30434AE68E569963E50C6A0917C622B0
gpg: Can't check signature: No public key
The first time you ever try, it will fail as above, but it tells you the key it needs.
import that key from keyserver.ubuntu.com
bkw@fw:~/tmp/mainline$ gpg --keyid-format long --keyserver hkp://keyserver.ubuntu.com --recv-keys 0x60AA7B6F30434AE68E569963E50C6A0917C622B0
gpg: key E50C6A0917C622B0: public key "Kernel PPA <[email protected]>" imported
gpg: Total number processed: 1
gpg: imported: 1
Now try the same verify again
bkw@fw:~/tmp/mainline$ gpg --keyid-format long --verify CHECKSUMS.gpg CHECKSUMS
gpg: Signature made Sat 22 Mar 2025 05:00:16 PM EDT
gpg: using RSA key 60AA7B6F30434AE68E569963E50C6A0917C622B0
gpg: Good signature from "Kernel PPA <[email protected]>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg: There is no indication that the signature belongs to the owner.
Primary key fingerprint: 60AA 7B6F 3043 4AE6 8E56 9963 E50C 6A09 17C6 22B0
bkw@fw:~/tmp/mainline$
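To document the last link in the chain too, checking the .debs against CHECKSUMS, here's a sketch using stand-in files (the .deb name and contents are made up so the pipeline can be shown without the network; with real files you'd already have the .debs and CHECKSUMS in the cache dir):

```shell
workdir=$(mktemp -d)
cd "$workdir"
echo "not a real kernel" > linux-image-test.deb   # stand-in .deb
sha256sum linux-image-test.deb > CHECKSUMS        # stand-in CHECKSUMS
# The real CHECKSUMS mixes sha1 and sha256 sections, so filter to the
# 64-hex-digit sha256 lines first. --ignore-missing skips entries for
# files that weren't downloaded (e.g. other architectures).
grep -E '^[0-9a-f]{64}  ' CHECKSUMS | sha256sum --check --ignore-missing
# prints: linux-image-test.deb: OK
```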
One of the site admins said they may generate some kind of single json file that can take the place of hundreds of separate index.html files. If they do that, then we won't need a mirror any more. A single json download per user would be about the same number of bytes, but their logs won't look as alarming. No promises and no timeline, but it sounds like they'll probably do something.