Connection reset by peer
I'm having no success trying to push any file over 60GB or so in size. I've tried various methods, and I get the following when using a piped push. This may be related to #631, where the uploaded file is truncated. I'm on a reliable 100/100 fibre link and have an unlimited Drive quota.
cat file1.tar.gz.enc | drive push --piped --force Backups/file1.tar.gz.enc
/Backups/file1.tar.gz.enc: Post https://www.googleapis.com/upload/drive/v2/files/0B8SSm5JP-_lTZUxJZDNJb243OTA?alt=json&setModifiedDate=true&uploadType=resumable&upload_id=AEnB2UoyNWK7Y0ai_yFB52e1TAoRrLExqb8GiIr5KFfEkVEzRX5sjVmyL9bWpVxbhlU5T29rtjjUeCH2ewE0c6dbJ8NXo-7ShQ: read tcp 192.168.0.50:46950->216.58.199.74:443: read: connection reset by peer
... and that's all she wrote.
Hope you can help!
drive version: 0.3.5
Commit Hash: 'ba595abc3d8919da7873e12f7ad04e45ffb4d04f'
Go Version: go1.5.1
OS: linux/amd64
BuildTime: 2016-05-07 08:51:33.335229375 +1200 NZST
Are you using sshfs? Most of the hits I can get from Googling such errors involve sshfs. I have no clue right now what is going on, as I've never encountered this error directly with drive, but if you could provide some more information about your setup, that might be helpful.
No - not using sshfs. I think I have a fairly typical setup - I'm on an Ubuntu 15.10 desktop machine at the moment, and I've been using drive happily for the last six months or so. The only difference now is that I'm trying to push much larger files (trying to take advantage of the unlimited quota to keep copies of our backups in Google Drive).
Oh okay, I think I have a hunch about what is going on. How about you try a normal push instead? When reading from a pipe, the size isn't known up front, so if the connection is aborted after some bytes were transferred and an erroneous ACK is sent, the file gets registered anyway. Also, you already have the file locally, so drive push should suit the purpose. With a normal push, the size of the file is known since it is stat'd beforehand, so any failure will prevent the file from being registered as complete.
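Since a normal push fails cleanly rather than registering a truncated file, a bounded retry loop around it is one way to ride out intermittent resets. This is only a sketch; the retry count, delay, and path are illustrative, and it assumes the drive binary is on PATH.

```shell
# Retry a command up to $1 times, pausing between attempts.
retry() {
  max=$1
  shift
  n=1
  until "$@"; do
    if [ "$n" -ge "$max" ]; then
      echo "giving up after $n attempts" >&2
      return 1
    fi
    n=$((n + 1))
    sleep "${RETRY_DELAY:-5}"  # brief pause before the next attempt
  done
}

# Example usage (illustrative path):
# retry 5 drive push --force Backups/file1.tar.gz.enc
```

Note this only helps if each individual attempt restarts the upload from scratch; it does not resume a broken session.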
Thanks
I have been trying a normal push too - which is when I get the issue in #631.
These uploads do take a long time (12hrs or more), so I'm wondering if there is perhaps a URL expiry issue?
URL expiry actually sounds very plausible, IMO. Maybe there is a configuration option that would get us a non-expirable URL? I used to back up my entire computers' filesystems (180GB, 50GB, and smaller sizes) to Google Drive, and it would run successfully for about 8 hours. However, your case is different: this is a single file of 150GB, and you've mentioned that you have reliable, fast internet. I could ask people who work on Google Drive whether this is a bug in their system, because IMO a file that isn't transferred successfully should be totally discarded on Google Drive. When I am free, I can try to reproduce this issue: basically, put a proxy in front of drive's connections, cause a connection reset in the proxy, and then see whether that file still gets uploaded. Sorry that for now I can't be of much help until I understand what exactly is failing.
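If the upload session really does expire mid-transfer, one workaround is to split the archive into chunks so each push finishes well inside any session lifetime, then reassemble after download. This is only a sketch; the chunk size and file names are illustrative, and the demo below exercises the round trip on a small temp file rather than a real backup.

```shell
# Demonstrate the split/reassemble round trip on a small temp file.
rm -f /tmp/demo.bin /tmp/demo.bin.part-* /tmp/demo.rejoined
dd if=/dev/urandom of=/tmp/demo.bin bs=1k count=64 2>/dev/null
split -b 16k /tmp/demo.bin /tmp/demo.bin.part-
cat /tmp/demo.bin.part-* > /tmp/demo.rejoined
cmp /tmp/demo.bin /tmp/demo.rejoined && echo "round-trip OK"

# Real use (illustrative; each part uploads as an independent, shorter push):
# split -b 10G file1.tar.gz.enc file1.tar.gz.enc.part-
# for part in file1.tar.gz.enc.part-*; do drive push --force "Backups/$part"; done
# ...and after pulling the parts back down:
# cat file1.tar.gz.enc.part-* > file1.tar.gz.enc
```

A failed chunk can then be re-pushed on its own instead of restarting the whole 60GB+ transfer.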
So, just as a mental note to self: big file uploads that take forever receive ECONNRESET ("connection reset by peer") from Google Drive's remote end, and files end up partially uploaded. Might be a TLS error or a socket hang-up.
Thanks a lot for looking into this. I'll have a go with rclone meanwhile and see if I get any further with that - will let you know how I get on.
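For reference, rclone can retry failed transfers itself. A minimal sketch, assuming a configured rclone remote named "gdrive" (the remote name and target folder are assumptions; the flags are standard rclone options):

```shell
# --retries re-runs the whole copy on failure; --low-level-retries retries
# individual HTTP operations such as an aborted chunk upload.
upload_with_retries() {
  rclone copy "$1" "gdrive:Backups" --retries 10 --low-level-retries 20 -P
}

# Example (not run here, illustrative file name):
# upload_with_retries file1.tar.gz.enc
```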
Aye aye, hope that fits your use case. Thanks for the feedback.
Hi all, I have the same problem when trying to upload a larger folder (> 1TB) to my drive. Any updates on this issue?