uniclust-pipeline
uniclust30 download mirror?
Greetings,
Is there a mirror for the database downloads? I'm attempting to download uniclust30 for HH-suite using the link from the uniclust.mmseqs.com site, but the download is exceedingly slow and fails after reaching ~16 GB.
Thanks!
Could you please try whether the following URL works better: http://wwwuser.gwdg.de/~compbiol/uniclust/2017_10/
I'm sorry, I should have been more specific: that's the directory that uniclust.mmseqs.com directs me to. This is a direct link to the file I have been attempting to download, which ultimately fails:
http://wwwuser.gwdg.de/~compbiol/uniclust/2017_10/uniclust30_2017_10_hhsuite.tar.gz
uniclust.mmseqs.com links to a different server (subdomain gwdu111). Does the wwwuser server work better/differently?
Unfortunately no, not that I can see. But I may be misunderstanding, because uniclust.mmseqs.com appears to already link to the wwwuser server, and that's what I already attempted to download.
Okay now I see the source of confusion, the link in the text points to: http://wwwuser.gwdg.de/~compbiol/uniclust/2017_10/
The link in the header points towards: http://gwdu111.gwdg.de/~compbiol/uniclust/2017_10/
Try the other one and see whether it improves the situation. We don't have any other servers. wget and curl support resuming interrupted downloads; maybe try that?
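For reference, resuming an interrupted download with either tool looks like this. These are standard wget/curl resume flags; the URL assumes the same filename is served from the gwdu111 mirror mentioned above, which may need adjusting:

```shell
# Resume a partial download with wget (-c / --continue picks up where it stopped)
wget -c http://gwdu111.gwdg.de/~compbiol/uniclust/2017_10/uniclust30_2017_10_hhsuite.tar.gz

# Or with curl: "-C -" tells curl to determine the resume offset from the
# existing partial file; -O keeps the remote filename
curl -C - -O http://gwdu111.gwdg.de/~compbiol/uniclust/2017_10/uniclust30_2017_10_hhsuite.tar.gz
```

Both commands can be re-run after every failure until the archive completes; the server must support HTTP range requests for resuming to work.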
The gwdu111 subdomain link worked great! Thanks very much for looking into this for me.
This is an old issue, but I am currently trying to download from the gwdu111 link and I am getting less than 100 KB/s. I have tried on two different connections and both are slow. Are there any plans to host the databases on higher-bandwidth servers?
Thanks.
It took me a day to download the HH-suite archive. My guess is that something is wrong with the server.
I get around 1 MB/s even on good networks. Would it make sense to mirror the database to GitHub (as a release-attached file) or to Zenodo?
I would recommend trying to download with `aria2c`. It offers the option to use multiple simultaneous connections (`-x` or `--max-connection-per-server`), which might speed up the download.

Zenodo might work, but we are already very close to the 50 GB limit.
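A minimal invocation along those lines might look like this, assuming aria2c is installed; the URL combines the gwdu111 directory and filename from earlier in this thread:

```shell
# Open up to 8 connections to the server (-x) and split the file into
# 8 segments downloaded in parallel (-s); -c resumes a partial download
aria2c -x 8 -s 8 -c \
  http://gwdu111.gwdg.de/~compbiol/uniclust/2017_10/uniclust30_2017_10_hhsuite.tar.gz
```

Whether this actually helps depends on whether the bottleneck is per-connection throttling on the server or raw server bandwidth; in the latter case, parallel connections won't gain much.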
I am downloading this now. It is running at 3.7 MB/s with a total wait time of about 1.5 hours.