
Issues with large files not uploading correctly

Open sammylupt opened this issue 1 year ago • 2 comments

Hi,

First off, thank you so much for your work on this crate! I have been looking for something like this for a while, and I really appreciate it.

I ran into an issue where I'm unable to upload large files (the file in question is 17 GB). After running the relevant line from my shell, the curl status is printed immediately (at "0%") and Kaput immediately prints "Upload finished!"

$ kaput files upload small.mp4  # this works fine                                  
Uploading: small.mp4

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 9230k  100   480  100 9229k     72  1394k  0:00:06  0:00:06 --:--:-- 1232k

Upload finished!

$ kaput files upload large.mp4 # this does not work
Uploading: large.mp4

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0 17.8G  100   176    0     0    520      0 --:--:-- --:--:-- --:--:--   519

Upload finished!

I did some googling, and it seems that curl can act up when uploading anything over 4 GB unless it is given certain headers.

Using kaput debug, I grabbed my API key and tried to upload the file to put.io manually.

When I used the --data-binary option, curl errored with "curl: option --data-binary: out of memory".

When I used the -X POST -T options, put.io (nginx) immediately responded with a 413 Request Entity Too Large. I couldn't find anything in Put.io's API documentation about a maximum file size, but apparently there is one 😄.

I imagine the fix for this behavior would be to run the curl command with -X POST -T, wait for it to finish, and only call println!("Upload finished!") if the file was actually uploaded successfully.
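The suggestion above boils down to checking the child process's exit status before reporting success. A minimal sketch of that pattern in Rust (the function name is made up, and a shell command stands in for the real curl invocation so the example runs offline; kaput-cli's actual code will differ):

```rust
use std::process::Command;

/// Runs a command and reports success only if it exits cleanly.
/// In the real fix this would be `curl -X POST -T <file> <url>`.
fn run_upload(program: &str, args: &[&str]) -> bool {
    match Command::new(program).args(args).status() {
        Ok(status) => status.success(), // exit code 0 means curl succeeded
        Err(_) => false,                // command could not even be spawned
    }
}

fn main() {
    // Stand-in for the curl call so the sketch is runnable anywhere:
    if run_upload("sh", &["-c", "exit 0"]) {
        println!("Upload finished!");
    } else {
        eprintln!("Upload failed!");
    }
}
```

This way "Upload finished!" can never print before the transfer has actually completed.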

Once again thank you so much for building this tool!

sammylupt avatar Jul 16 '24 17:07 sammylupt

Thanks for reporting this and letting me know what you've already tried!

I'll look into this when I can.

@altaywtf are you able to confirm the file size limit for uploads?

davidchalifoux avatar Jul 16 '24 19:07 davidchalifoux

Hello! For regular uploads, the size limit is 1 MB. This endpoint is mainly for small files, like torrents or subtitles.

For larger files, we use tus. It isn't currently documented, but the protocol is open and you can track the requests in our web app.

You can also look at these to see how it's implemented:

  • https://github.com/putdotio/go-putio/blob/master/upload.go
  • https://github.com/cenkalti/tus.py/blob/master/tus.py
  • https://github.com/rclone/rclone/blob/master/backend/putio/fs.go
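For context, tus transfers a file as a sequence of PATCH requests, each carrying an Upload-Offset header that must match the offset the server has recorded so far. A small sketch of how those offsets advance (the chunk size and function name are illustrative, not taken from put.io's implementation):

```rust
/// Splits a file of `total` bytes into (offset, length) pairs for
/// sequential tus PATCH requests. `chunk` is an illustrative size;
/// real clients choose it based on network conditions.
fn tus_chunks(total: u64, chunk: u64) -> Vec<(u64, u64)> {
    let mut parts = Vec::new();
    let mut offset = 0;
    while offset < total {
        let len = chunk.min(total - offset); // final chunk may be short
        parts.push((offset, len));
        offset += len; // the next Upload-Offset the server expects
    }
    parts
}

fn main() {
    // A 17 GiB file in 64 MiB chunks, roughly the case from this issue:
    let parts = tus_chunks(17 * 1024 * 1024 * 1024, 64 * 1024 * 1024);
    println!("{} PATCH requests, last = {:?}", parts.len(), parts.last());
}
```

Because each request records how far the upload got, an interrupted transfer can resume from the last acknowledged offset instead of restarting from zero.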

berkanteber avatar Jul 17 '24 16:07 berkanteber

Support for large file uploads is now available in v2.5.0! Please give it a try and let me know if it works for you.

Using the upload command on a file 50 MB or larger will automatically switch to the resumable file upload protocol. Under the hood, Kaput stores a record of the upload location, keyed by the file's path and last-modified time, in order to facilitate resumable uploads. If you ever need to clear that cached location, change either the path or the modified time of the file. Otherwise, your OS should clear it after a reboot.
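The keying scheme described above can be sketched in a few lines of Rust: derive a key from the path plus the file's mtime, so that moving or touching the file naturally invalidates the cached upload location. The function name and key format here are illustrative, not kaput-cli's actual encoding:

```rust
use std::fs;
use std::io::Write;
use std::time::UNIX_EPOCH;

/// Builds a cache key from a file's path and last-modified time.
/// Illustrative only; kaput-cli's real key format may differ.
fn upload_cache_key(path: &str) -> std::io::Result<String> {
    let meta = fs::metadata(path)?;
    let mtime = meta
        .modified()?
        .duration_since(UNIX_EPOCH)
        .map_err(|e| std::io::Error::new(std::io::ErrorKind::Other, e))?
        .as_secs();
    // Renaming the file or updating its mtime produces a new key,
    // which is exactly how a stale upload location gets dropped.
    Ok(format!("{path}:{mtime}"))
}

fn main() -> std::io::Result<()> {
    let path = std::env::temp_dir().join("kaput-key-demo.bin");
    fs::File::create(&path)?.write_all(b"demo")?;
    println!("{}", upload_cache_key(path.to_str().unwrap())?);
    Ok(())
}
```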

davidchalifoux avatar Nov 08 '24 04:11 davidchalifoux