[Bug]: Apple File Provider with bigger files result in NSFileProviderErrorDomain (-2005)

Open krim404 opened this issue 2 years ago • 5 comments


Bug description

When adding a bigger file (in my case about 1 GB in size) via the Apple File Provider, the file immediately gets a cloud symbol with an exclamation mark and the error "NSFileProviderErrorDomain-Error -2005".

It works flawlessly when syncing or manually uploading the file.

Steps to reproduce

  1. Enable Apple File Provider
  2. Drag a file into the File Provider folder

Expected behavior

The file should be uploaded.

Which files are affected by this bug

Sorbet_Plus.zip

Operating system

Mac OS

Which version of the operating system you are running.

13.4

Package

Appimage

Nextcloud Server version

26.0.2

Nextcloud Desktop Client version

3.9-RC1

Is this bug present after an update or on a fresh install?

Fresh desktop client install

Are you using the Nextcloud Server Encryption module?

Encryption is Enabled

Are you using an external user-backend?

  • [ ] Default internal user-backend
  • [ ] LDAP/ Active Directory
  • [ ] SSO - SAML
  • [ ] Other

Nextcloud Server logs

The log doesn't contain anything.

Additional info

No response

krim404 avatar May 27 '23 08:05 krim404

Screenshot 2023-05-27 at 10 34 29

krim404 avatar May 27 '23 08:05 krim404

Hi, more information is needed in order to debug this -- could you provide some logs?

The file provider module uses Apple’s unified logging system, so you can extract relevant logs using the Console app or by using the log utility from the terminal. To make it easier to find the Nextcloud File Provider related logs, filter by process FileProviderExt and by subsystem com.nextcloud.cloud.FileProviderExt; if you're using the Console app, make sure to check "Include info messages" and "Include debug messages".

Screenshot 2023-03-21 at 16 55 06

Screenshot 2023-03-21 at 16 58 07
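
For the terminal route, something along these lines should work (a sketch using the process and subsystem names mentioned above; see man log for the exact flags):

    # Stream live logs from the Nextcloud file provider extension,
    # including info and debug messages
    log stream --level debug \
        --predicate 'process == "FileProviderExt" AND subsystem == "com.nextcloud.cloud.FileProviderExt"'

    # Or dump the last hour of logs into a file
    log show --last 1h --info --debug \
        --predicate 'process == "FileProviderExt" AND subsystem == "com.nextcloud.cloud.FileProviderExt"' > fileprovider.log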

claucambra avatar May 27 '23 15:05 claucambra

The error is simply "file too large":

standard	18:42:16.200067+0200	fileproviderd	┳110e1b ✅  done executing <FS1 ✅  fetch-content(docID(450698)) why:itemChangedRemotely sched:utility#1685205736.1839154> →  <item:<s:docID(450698) p:root n:"S{9}s.zip" doc sz:1420777599 m:rw- ct:1684362621.4011796 mt:1684362927.5042572 qtn v:sver:root/S{9}s.zip cver:104558555@1:sz:1420777599> content:fid(104558557) unchanged:false>
standard	18:42:16.200746+0200	fileproviderd	 ✍️  persist job: <J1 ⏳  create-item(propagated:<docID(450698) dbver:0 domver:<nil>>) why:itemChangedRemotely|contentUpdate sched:utility#1685205736.1839154 ⧗persisted>
standard	18:42:16.200789+0200	fileproviderd	┗110e1b
standard	18:42:16.209678+0200	FileProviderExt	container_create_or_lookup_app_group_path_by_app_group_identifier: success
standard	18:42:16.210041+0200	FileProviderExt	(501) Adopting Voucher for accountID:F82E00EE-8357-4165-84A9-100076B145A4
standard	18:42:16.210194+0200	FileProviderExt	(501) No Cached Copy of voucher for Account:F82E00EE-8357-4165-84A9-100076B145A4, generating one from usermanagerd
standard	18:42:16.210248+0200	FileProviderExt	(501) kernel voucher port is :206927
standard	18:42:16.211819+0200	FileProviderExt	(501) retrieveReplacementVoucherFor failed with error:Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"
standard	18:42:16.211884+0200	FileProviderExt	(501) Adopting Voucher for accountID:F82E00EE-8357-4165-84A9-100076B145A4
standard	18:42:16.211909+0200	FileProviderExt	(501) No Cached Copy of voucher for Account:F82E00EE-8357-4165-84A9-100076B145A4, generating one from usermanagerd
standard	18:42:16.211924+0200	FileProviderExt	(501) kernel voucher port is :206931
standard	18:42:16.212511+0200	FileProviderExt	(501) retrieveReplacementVoucherFor failed with error:Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"
standard	18:42:16.215545+0200	FileProviderExt	[WARNING] <private> must be called with a task in suspended (1) state, but task <private> has state 0. NSFileProviderManager will suspend the task and resume it again to work around this. To avoid this warning, resume the task from the completion handler.
standard	18:42:16.215542+0200	FileProviderExt	Task <C3620D76-DA69-4D15-AD2F-DC1CFCA1435C>.<1602> resuming, timeouts(60.0, 604800.0) QOS(0x11) Voucher <private>
standard	18:42:16.215589+0200	FileProviderExt	(501) Adopting Voucher for accountID:F82E00EE-8357-4165-84A9-100076B145A4
standard	18:42:16.215607+0200	FileProviderExt	(501) No Cached Copy of voucher for Account:F82E00EE-8357-4165-84A9-100076B145A4, generating one from usermanagerd
standard	18:42:16.215626+0200	FileProviderExt	(501) kernel voucher port is :327831
standard	18:42:16.215954+0200	FileProviderExt	[Telemetry]: Activity <nw_activity 12:2[FD6FE0D5-B040-4610-A79E-38723B38EF85] (reporting strategy default)> on Task <C3620D76-DA69-4D15-AD2F-DC1CFCA1435C>.<1602> was not selected for reporting
standard	18:42:16.216101+0200	FileProviderExt	(501) retrieveReplacementVoucherFor failed with error:Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"
standard	18:42:16.216516+0200	FileProviderExt	Task <C3620D76-DA69-4D15-AD2F-DC1CFCA1435C>.<1602> {strength 1, tls 8, sub 0, sig 0, ciphers 1, bundle 0, builtin 0}
standard	18:42:16.216566+0200	FileProviderExt	[C112] event: client:connection_reused @1602.224s
standard	18:42:16.216906+0200	FileProviderExt	Task <C3620D76-DA69-4D15-AD2F-DC1CFCA1435C>.<1602> now using Connection 112
standard	18:42:16.233359+0200	fileproviderd	[NOTICE] ⏱  com.apple.fileprovider.indexing: new watcher registered for c{51}m.dev
standard	18:42:16.233402+0200	fileproviderd	[NOTICE] ⏱  com.apple.fileprovider.indexing: registering xpc_activity
standard	18:42:16.243324+0200	FileProviderExt	Task <C3620D76-DA69-4D15-AD2F-DC1CFCA1435C>.<1602> received response, status 413 content K
standard	18:42:16.243359+0200	FileProviderExt	Task <C3620D76-DA69-4D15-AD2F-DC1CFCA1435C>.<1602> done using Connection 112
standard	18:42:16.243396+0200	FileProviderExt	[C112] event: client:connection_idle @1602.251s
standard	18:42:16.243530+0200	FileProviderExt	Task <C3620D76-DA69-4D15-AD2F-DC1CFCA1435C>.<1602> response ended
standard	18:42:16.243880+0200	FileProviderExt	Task <C3620D76-DA69-4D15-AD2F-DC1CFCA1435C>.<1602> summary for task success {transaction_duration_ms=27, response_status=413, connection=112, reused=1, request_start_ms=0, request_duration_ms=0, response_start_ms=26, response_duration_ms=0, request_bytes=65733, response_bytes=289, cache_hit=false}
standard	18:42:16.243985+0200	FileProviderExt	Task <C3620D76-DA69-4D15-AD2F-DC1CFCA1435C>.<1602> finished successfully
fehler	18:42:16.245987+0200	FileProviderExt	Could not upload item with filename: Sorbet_Plus.zip, received error: The file is too large

krim404 avatar May 27 '23 16:05 krim404

Is this issue still relevant using 3.13.0-macOS-vfs?

marcotrevisan avatar Apr 30 '24 09:04 marcotrevisan

Yes, it is. I am experiencing this issue too on macOS Sonoma with 3.13.0. Interestingly, though, the file is actually uploaded to the server, but Finder shows an error.

Javihache avatar Jun 21 '24 17:06 Javihache

Same here. I'm on desktop version 3.13.2-macOS-vfs and the server is on version 28.0.7. The file is uploaded to the server (I can see it in the network usage on the Mac and on the server), but after the upload finishes, Finder reports the error (NSFileProviderErrorDomain error -2005) and the file is not displayed in the cloud (web interface).

Beo-Coder avatar Jul 09 '24 12:07 Beo-Coder

Looks like there's a limit on the request size in the PHP ini and/or in the web server configuration; the stricter of the two applies. As far as I can tell, uploads from the macOS-vfs client are not done using chunked IO yet, so it's important to check your request limits on the server side (request size and timeouts too). Hope this helps, cheers!
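
To see which limits actually apply, a quick check along these lines can help (a sketch; run it with the same PHP the web server uses, e.g. inside the php-fpm container if there is one):

    # Print the effective PHP limits most relevant to large uploads
    php -i | grep -E 'upload_max_filesize|post_max_size|memory_limit|max_execution_time|max_input_time'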

marcotrevisan avatar Jul 12 '24 10:07 marcotrevisan

I'm assuming the file won't show up in Nextcloud Files web interface either...

marcotrevisan avatar Jul 12 '24 10:07 marcotrevisan

I changed the PHP memory limit and now I don't get this error anymore. But when I try to upload a file > 1 GB, I get a BadRequest error: "Expected filesize of 2048000000 bytes but read (from Nextcloud client) and wrote (to Nextcloud storage) 1161461760 bytes. Could either be a network problem on the sending side or a problem writing to the storage on the server side."

I have tried a lot of different configs with different memory and upload size limits. Apache is also configured with an unlimited LimitRequestBody, and my nginx reverse proxy should be configured correctly (client_max_body_size 0;). It works when I upload the file via the web interface (probably because the file gets chunked there?).

Do you have any idea what could be wrong?

Beo-Coder avatar Jul 12 '24 18:07 Beo-Coder

Timeouts are just as important as sizes. A snippet from my nginx conf: ...

    client_max_body_size 0;
    proxy_buffering off;
    proxy_redirect off;
    proxy_connect_timeout 1000;
    proxy_read_timeout 1000;
    proxy_send_timeout 1000;

Hope this helps. Regards!

marcotrevisan avatar Jul 13 '24 12:07 marcotrevisan

There are other important things, like buffering, that get in the way when you allow very big uploads. Every site has its own needs and there's no one-size-fits-all configuration. In general I'd suggest making sure you're up to date with the server installation documentation: https://docs.nextcloud.com/server/latest/admin_manual/installation/nginx.html

marcotrevisan avatar Jul 13 '24 12:07 marcotrevisan

The PHP ini also includes upload_max_filesize and post_max_size; I've set them to "high values".
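
For reference, a php.ini sketch along those lines (the values are purely illustrative, not a recommendation; pick what fits your setup):

    ; php.ini -- generous limits for large single-request uploads
    upload_max_filesize = 16G
    post_max_size = 16G
    memory_limit = 512M
    max_input_time = 3600
    max_execution_time = 3600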

marcotrevisan avatar Jul 13 '24 12:07 marcotrevisan

I have a feeling that there is a 60 s timeout somewhere, but I can't find it. I've tried increasing every timeout that could have been too low, but it doesn't help. This is probably because the file is not being chunked: via the web interface and the Nautilus WebDAV integration on my Linux machine, uploading a file of >1 GB works just fine.
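
For what it's worth, 60 seconds happens to be the default for several directives in a typical nginx-in-front-of-Apache chain, so these are worth ruling out explicitly (a sketch of candidates, not a complete list; fastcgi_read_timeout only applies if PHP-FPM is in the mix):

    # nginx (reverse proxy / FastCGI) -- all default to 60s
    proxy_read_timeout   3600;
    proxy_send_timeout   3600;
    fastcgi_read_timeout 3600;

    # Apache behind the proxy -- Timeout (and with it ProxyTimeout) defaults to 60s
    Timeout 3600
    ProxyTimeout 3600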

Beo-Coder avatar Jul 13 '24 14:07 Beo-Coder

I'm still having this issue; has anyone managed to find a solution yet?

felixyates avatar Dec 28 '24 15:12 felixyates

I have now implemented chunked uploads on NextcloudFileProviderKit, which should avoid hitting the memory limit and max upload size limit (the default chunk size is 100MB like on other clients).

The changes should be effective in NCFPK 2.0, which we will ship with desktop client version 3.16.0.

Thanks for the reports!
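
For anyone curious what chunking looks like on the wire, this is roughly the server-side flow (a hand-assembled curl sketch of the Nextcloud chunked upload WebDAV API; the server URL, user, transfer ID and chunk names are placeholders, and this is not the actual NCFPK code):

    # 1. Create an upload session (a WebDAV collection) for this transfer
    curl -u user:pass -X MKCOL \
        https://cloud.example.com/remote.php/dav/uploads/user/upload-123

    # 2. Upload the file in ~100 MB pieces as numbered chunks
    curl -u user:pass -T chunk.000001 \
        https://cloud.example.com/remote.php/dav/uploads/user/upload-123/000001
    curl -u user:pass -T chunk.000002 \
        https://cloud.example.com/remote.php/dav/uploads/user/upload-123/000002

    # 3. Ask the server to assemble the chunks into the final file
    curl -u user:pass -X MOVE \
        -H "Destination: https://cloud.example.com/remote.php/dav/files/user/Sorbet_Plus.zip" \
        https://cloud.example.com/remote.php/dav/uploads/user/upload-123/.file

Each chunk request stays well under the server's request size limits, which is why this sidesteps the 413 seen in the logs above.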

claucambra avatar Jan 23 '25 03:01 claucambra

@claucambra Thank you for the correction; awaiting the update to 3.16. Have a great day!

konradbr avatar Feb 26 '25 08:02 konradbr

I'm still experiencing the issue with 3.17.2

I described the details in this post: https://help.nextcloud.com/t/error-2005-when-uploading-bigger-files/232368

drybx avatar Sep 18 '25 15:09 drybx

@drybx You should probably open a separate issue since this one has been addressed. Also, more details will be needed. Specifically: https://github.com/nextcloud/desktop/issues/5737#issuecomment-1565539136

joshtrichards avatar Sep 18 '25 19:09 joshtrichards