s3-upload-stream
`finish` event fired twice
var fs = require('fs');
var AWS = require('aws-sdk');
var s3Stream = require('s3-upload-stream')(new AWS.S3());

var read = fs.createReadStream('path/to/my/file.json');

var upload = s3Stream.upload({
  Bucket: "my-bucket",
  Key: "key.txt",
  ACL: "public-read",
  StorageClass: "REDUCED_REDUNDANCY"
});

upload.on('error', function (err) {
  console.log('An error occurred:', err);
});

upload.on('finish', function (result) {
  console.log('Finish:', result);
});

upload.on('uploaded', function (result) {
  console.log('Uploaded:', result);
});

read.pipe(upload);
And here is the output:
Finish: undefined
Uploaded: { Location: 'https://my-bucket.s3.amazonaws.com/key.txt',
  Bucket: 'my-bucket',
  Key: 'key.txt',
  ETag: '"<etag>"' }
Finish: { Location: 'https://my-bucket.s3.amazonaws.com/key.txt',
  Bucket: 'my-bucket',
  Key: 'key.txt',
  ETag: '"<etag>"' }
I have created branch 1.1.0 with a fix for the issue. I have a bit more testing to do, but if it checks out I should be publishing to npm shortly.
Just tried this branch. The 'finish' event does indeed fire only once now, but at the wrong moment: when it has finished reading the input stream, instead of when it has finished writing (i.e. completed the upload).
My colleague and I fell foul of this the other day on the latest published npm version. We ended up just using the 'uploaded' event as a work-around. /cc @BorePlusPlus
Any update on this one? Not being able to rely on 'finish' (which in 1.0.7 still gets fired twice) makes it hard to use with other stream utilities (e.g. multipipe / pumpify, which rely on end/finish).
Seems https://github.com/nathanpeck/s3-upload-stream/pull/40 might fix this.
This module is deprecated as its functionality has been added to the official aws-sdk module from AWS themselves. I recommend that you migrate to use the official library.
Makes sense, thanks!