Timeout while uploading to remote storage
I'm running Nextcloud Hub II (23.0.1 RC1) with Backup 1.0.4.
Creating a local backup works, and occ backup:point:pack 202201... creates a pack.
When I try occ backup:point:upload 202201..., I run into a ConnectException: cURL error 28: Operation timed out after 30000 milliseconds with 0 bytes received.
It seems to be more of a server issue (https://github.com/nextcloud/server/issues/26071), since it can be "resolved" by raising RequestOptions::TIMEOUT in /path/to/nextcloud/lib/private/Http/Client/Client.php to a larger value (e.g. 240 s).
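In case it's useful to others, here is roughly the spot I changed. This is a sketch only: the surrounding code differs between server versions, so treat the neighbouring lines as illustrative; the relevant part is the Guzzle RequestOptions::TIMEOUT default.

    // /path/to/nextcloud/lib/private/Http/Client/Client.php
    // (sketch; the exact surroundings vary between server versions)
    $defaults = [
        RequestOptions::VERIFY => $this->getCertBundle(),
        // stock value is 30 (seconds); raise it so large pack
        // files have time to finish uploading
        RequestOptions::TIMEOUT => 240,
    ];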
To link another of the countless related issues I came across: https://github.com/nextcloud/server/issues/18103
To me it looks like the chunked upload that the web interface uses is not used by CLI operations. Since I have a rather slow (50 Mbit/s) DSL connection (on both server instances), the 100 MB pack files cannot be transferred within the default 30 s curl timeout.
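Rough numbers to back that up: a 100 MB pack is 800 Mbit. Even at the full 50 Mbit/s that is 16 s, and since DSL uplinks are usually far slower than the advertised downlink (at a typical 10 Mbit/s it would be 80 s), the default 30 s timeout is easily exceeded.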
As suggested in another issue, I also played around with the federated share:
- uploading ~100 MB files via a link to the target server in the web interface works nicely
- doing the same via the federated share (from the server I want to back up), I get the same 30 s timeout.
Therefore, I don't really know whether this repo is the right place for this issue. Nevertheless, I hope the timeout workaround above serves others who run into the same problem.
Hello,
which kind of external storage are you using? Webdav?
Thanks
I'm using "Nextcloud" with user name and password.
I just found an issue that would probably have helped me as well: #141 (just setting a smaller chunk size).
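In case it helps others, something like the command below should set it from the CLI. The config key and value are my assumption (I only took "set a smaller chunk size" from #141), so please check the Backup app's admin settings for the authoritative key name in your version.

    # key name and value are illustrative assumptions, not confirmed
    # against the app's source; verify before relying on them
    occ config:app:set backup chunk_part_size --value=26214400  # 25 MB parts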
Greetings, I have the same issue using external storage with WebDAV.
And it's not even limited to "slow" networks. We are on a 1 Gbit LAN, and between the two federated Nextcloud instances there is even a 10 Gbit link, but I still can't download a 1.5 GB file via the federation because of the same 30-second curl timeout. (It manages about 1 GB, then aborts the download, leaving me with a broken/truncated file.)
Snippet from the relevant line in the log file:
Operation timed out after 30000 milliseconds with 1034205482 out of 1653575321 bytes received (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for https://federated.server/public.php/webdav/Folder/File","userAgent":"Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:96.0) Gecko/20100101 Firefox/96.0","version":"22.2.3.0"
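That log line also shows the timeout is absolute rather than an idle timeout: roughly 1 GB in 30 s works out to about 275 Mbit/s, so the link is anything but slow; any request that simply takes longer than 30 s is killed regardless of throughput.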
Same here with NC 24.0.4.1 and Backup 1.1.2, with a gigabit connection between the NC and WebDAV servers (both located in the same data center). When starting a backup (both app_data and backup use remote WebDAV), the error occurs fairly quickly, after writing approx. 10 MB.
I am experiencing the same issue. I cannot upload my backup to my remote Nextcloud share.
Same problem here.
Hi, I have this issue too.