dropbox-sdk-js
Streams
Is there a way to upload files via stream?
I am using Express with Multer middleware.
/* Upload file */
dropbox.filesUpload({ path, contents: file }) // <- I would like to stream the file instead of loading it into memory
  .then((response) => {
    return next(response);
  })
  .catch((error) => {
    debug(error);
    return next(error);
  });
No, unfortunately I believe filesUpload will only work with a string or buffer, but I'll be sure to pass this along as a feature request.
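In the meantime, one workaround is to buffer the stream before calling filesUpload. A minimal sketch; streamToBuffer is a hypothetical helper, not part of the SDK, and it still loads the whole file into memory, so it only sidesteps the API limitation, not the memory cost:

```javascript
// Collect a readable stream into a single Buffer so the result can be
// passed as `contents` to filesUpload. (Hypothetical helper.)
function streamToBuffer(stream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    // Normalize string chunks to Buffers so Buffer.concat works.
    stream.on('data', (chunk) => chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk)));
    stream.on('error', reject);
    stream.on('end', () => resolve(Buffer.concat(chunks)));
  });
}
```

With Multer's memory storage you already get a Buffer on `req.file.buffer`, so a helper like this is only needed when the middleware hands you a stream.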
@greg-db Thank you!
I have looked under the hood, and the library uses the 'superagent' package, which should support it.
apiRequest = request.post(getBaseURL(host) + path)
.type('application/octet-stream')
.set('Authorization', 'Bearer ' + accessToken)
.set('Dropbox-API-Arg', httpHeaderSafeJson(args));
See the docs (http://visionmedia.github.io/superagent/#piping-data):
const request = require('superagent');
const fs = require('fs');
const stream = fs.createReadStream('path/to/my.json');
const req = request.post('/somewhere');
req.type('json');
stream.pipe(req);
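As an aside, the httpHeaderSafeJson call in the SDK snippet above matters because HTTP header values must be ASCII. A sketch of what such a serializer might do (an assumption about its behavior, not the SDK's actual code):

```javascript
// Serialize args to JSON and escape non-ASCII code points as \uXXXX,
// so the result is safe to send in the Dropbox-API-Arg header.
// (Sketch of the behavior; not copied from the SDK.)
function httpHeaderSafeJson(args) {
  return JSON.stringify(args).replace(/[\u007f-\uffff]/g, (c) =>
    '\\u' + ('000' + c.charCodeAt(0).toString(16)).slice(-4)
  );
}
```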
Just to be absolutely clear on this, stream support is also required for filesDownload(). At least with upload you have the potential to use multi-part upload to make large files work. With download there is literally no way to handle a large file with the current API.
Thanks for the feedback!
Hey,
I needed streaming upload support for dropbox, so I wrote a library for it https://github.com/kksharma1618/dropbox-streaming-upload
Seems to be working fine for me so far.
@kksharma1618 I can't see the point of having stream support if we can't pipe:
fs.createReadStream('/path/to/file')
.pipe(upload(options))
@paulodiovani Pipe won't work here because we have to divide the stream's data into separate chunked upload calls (/upload_session/append_v2).
You can just as easily do this:
import upload from 'dropbox-streaming-upload'

upload({
  access_token: '',
  readable_stream: fs.createReadStream('/path/to/file'),
  file_size: fileSize,
  destination: '/destination/path',
}).then(function (successMetadata) {
  // handle success
}, function (error) {
  // handle error
})
The upload function will pipe your stream internally. If it's a basic upload, the stream is piped directly to the request. If it's a chunked upload, it splits the stream into smaller streams and pipes each of them to /upload_session/append_v2.
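The chunking described above can be illustrated with the offset math alone (chunkOffsets is a purely illustrative helper, not part of dropbox-streaming-upload):

```javascript
// Compute the byte ranges into which a file of fileSize bytes would be
// split for chunked uploading. Each { offset, length } entry corresponds
// to one slice of the stream sent to the upload session.
// (Illustrative helper only.)
function chunkOffsets(fileSize, chunkSize) {
  const offsets = [];
  for (let offset = 0; offset < fileSize; offset += chunkSize) {
    offsets.push({ offset, length: Math.min(chunkSize, fileSize - offset) });
  }
  return offsets;
}
```

This is why file_size is a required option: the uploader needs it to decide between a basic upload and a chunked session, and to size the final chunk.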
What is the current status here? Can we stream uploads or not? The discussion here seems somewhat conflicting.
I don't have an update on this right now.
(To clarify though, some of the discussion here was about the capabilities of an underlying library used by the SDK, as well as a third-party library written to support the functionality, not the SDK itself.)
We would also love support for readable and writeable streams!