git-annex-remote-rclone

Add rclone global_flags config

Open · unqueued opened this issue 2 years ago · 2 comments

Allows storing rclone global flags that are passed to all rclone operations.

Re-submitting pull/71

Google Drive may need special global flags to work optimally, especially with changes to Google's API.

Related to https://github.com/git-annex-remote-rclone/git-annex-remote-rclone/issues/70

May also be helpful for https://github.com/datalad/datalad/issues/4149

I had to set rclone_flags to --no-traverse to get Google Drive working properly. There are other potential benefits to being able to pass global flags to rclone per special remote.

I added rclone_flags to the test, but rclone_flags is optional.
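For illustration, here is roughly how the proposed option might be used when setting up the special remote. This is a hedged sketch, not taken from the merged code: the option name (rclone_flags vs. global_flags) and the remote, target, and prefix values are placeholders.

```bash
# Hypothetical usage of the proposed per-remote flags option.
# "mygdrive", "gdrive", and the prefix are illustrative values only.
git annex initremote mygdrive \
    type=external externaltype=rclone \
    target=gdrive prefix=git-annex-storage \
    rclone_flags="--no-traverse" \
    encryption=none
```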

unqueued · May 26 '23 18:05

This code is not that good; I just threw it together a while ago to make it functional for me. I don't remember my justification for adding the test. I think the tests looked like they were still a work in progress and I wanted to make them pass.

@yarikoptic do you know how many other users are reporting this issue?

For me, it is pretty serious. All operations with a gdrive rclone backend take longer than they should, because rclone is very chatty without --no-traverse. Dropping a single key still takes 5 minutes because of rclone's enumeration and API backoffs. I have seen other users reporting this on the git-annex forum.

And from what I can tell, everything still works with rclone gdrive and --no-traverse, so applying it to all requests is a benefit, and that is what I switched to months ago.

But the simplest solution is to just make sure users export RCLONE_NO_TRAVERSE=1 before using this special remote.
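To sketch that workaround: rclone reads default values for its flags from RCLONE_* environment variables, so exporting RCLONE_NO_TRAVERSE applies --no-traverse to every rclone call the helper makes. The remote name and file below are hypothetical placeholders.

```bash
# Workaround without changing the special remote: rclone picks up defaults
# for flags from RCLONE_* environment variables, so this enables
# --no-traverse for every rclone invocation in this shell session.
export RCLONE_NO_TRAVERSE=1

# "mygdrive" and "bigfile.bin" are placeholders for your remote and file.
git annex copy --to mygdrive bigfile.bin
```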

I think that, at least for the moment, giving users the option to set persistent rclone flags for specific special remotes would be helpful.
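To make the idea concrete, here is a minimal sketch of what persistent per-remote flags could look like inside the helper script. This is not the actual git-annex-remote-rclone code; the variable and function names are illustrative, and in practice the flags would come from the remote's stored configuration rather than being hard-coded.

```bash
#!/usr/bin/env bash
# Illustrative sketch only, not the real git-annex-remote-rclone helper.
# The idea: keep the user-configured global flags in an array and splice
# them into every rclone invocation the helper makes.

# In a real implementation these would be read from the special remote's
# configuration; hard-coded here for demonstration.
rclone_global_flags=(--no-traverse)

store_file() {
    # Upload one file to the rclone remote, applying the extra flags.
    local local_file=$1 remote_dest=$2
    rclone copyto "${rclone_global_flags[@]}" "$local_file" "$remote_dest"
}

retrieve_file() {
    # Download one file from the rclone remote, applying the same flags.
    local remote_src=$1 local_file=$2
    rclone copyto "${rclone_global_flags[@]}" "$remote_src" "$local_file"
}
```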

unqueued · Nov 06 '23 13:11

I personally haven't used this with gdrive yet, mostly with dropbox. I'm not sure whether it is slow currently; for one use case I would like it to be faster, since it syncs a lot of small files to dropbox. But that is the problem with automating things (as they are automated now): I tend not to look inside as long as it works.

edit: I guess ideally I should experiment and test how much of a speedup this would give me for my use case.

yarikoptic · Nov 06 '23 16:11