aws-sdk-js-v3
Process dies silently when there are many open connections to S3
Describe the bug
I was trying to create a zip file from a bunch of S3 objects. When using SDK v3 the process dies silently with no error (even the exit code is 0). If I use SDK v2 everything works fine. I'm attaching the code for both v3 and v2.
Your environment
Ubuntu 20
SDK version number
3.44.0
Is the issue in the browser/Node.js/ReactNative?
Node.js
Details of the browser/Node.js/ReactNative version
v16.10.0
Steps to reproduce
import archiver = require('archiver');
import { GetObjectCommand, S3Client } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';
import { Readable, Stream } from 'stream';

const BUCKET = process.env.BUCKET!;
const s3Client = new S3Client({});

async function openS3ObjectStream(key: string): Promise<Readable> {
  const { Body } = await s3Client.send(new GetObjectCommand({
    Bucket: BUCKET,
    Key: key,
  }));
  return Body as Readable;
}

async function main() {
  const streamPassThrough = new Stream.PassThrough();
  const s3Upload = new Upload({
    client: s3Client,
    params: {
      Bucket: BUCKET,
      Key: 'my-archive.zip',
      Body: streamPassThrough,
    },
  });

  const zip = archiver('zip');
  zip.on('warning', console.log);
  zip.on('error', console.error);
  zip.pipe(streamPassThrough);

  console.log('appending files');
  for (let i = 100; i < 200; i++) {
    let stream = await openS3ObjectStream(`${i}.dat`);
    zip.append(stream, { name: `${i}.dat` });
  }

  console.log('waiting zip completion');
  await zip.finalize();

  console.log('waiting upload completion');
  await s3Upload.done();

  return 'done!';
}

main().catch(console.error).then(console.log);
My shell output:
$ BUCKET=bucketName npx ts-node compressor.ts
appending files
$ echo $?
0
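One plausible explanation for the stall is socket-pool exhaustion: the loop opens roughly 100 GetObject response streams before any of them are read, and the v3 Node.js request handler caps the number of concurrent sockets by default. A minimal sketch of raising that cap, assuming @aws-sdk/node-http-handler is available and with an illustrative maxSockets value, not a confirmed fix:

import { S3Client } from '@aws-sdk/client-s3';
import { NodeHttpHandler } from '@aws-sdk/node-http-handler';
import { Agent } from 'https';

// Illustrative values; the right numbers depend on how many objects are streamed at once.
const s3ClientWithBiggerPool = new S3Client({
  requestHandler: new NodeHttpHandler({
    httpsAgent: new Agent({ keepAlive: true, maxSockets: 200 }),
  }),
});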
If I use the aws-sdk v2 instead:
import archiver = require('archiver');
import * as AWS from 'aws-sdk';
import { Readable, Stream } from 'stream';

const BUCKET = process.env.BUCKET!;
const s3Client = new AWS.S3();

async function openS3ObjectStream(key: string): Promise<Readable> {
  return s3Client.getObject({
    Bucket: BUCKET,
    Key: key,
  }).createReadStream();
}

async function main() {
  const streamPassThrough = new Stream.PassThrough();
  const s3Upload = s3Client.upload({
    Bucket: BUCKET,
    Key: 'my-archive.zip',
    Body: streamPassThrough,
  });

  const zip = archiver('zip');
  zip.on('warning', console.log);
  zip.on('error', console.error);
  zip.pipe(streamPassThrough);

  console.log('appending files');
  for (let i = 100; i < 200; i++) {
    let stream = await openS3ObjectStream(`${i}.dat`);
    zip.append(stream, { name: `${i}.dat` });
  }

  console.log('waiting zip completion');
  await zip.finalize();

  console.log('waiting upload completion');
  await s3Upload;

  return 'done!';
}

main().catch(console.error).then(console.log);
everything works perfectly.
Observed behavior
Apparently the Node.js process dies silently if too many connections to S3 are open at the same time.
Expected behavior
The process should either proceed, or at least emit an error that points to the underlying cause/solution.
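If the cause really is too many simultaneously open response streams, a hedged workaround sketch (assuming the objects are small enough to buffer in memory) is to read each object fully before appending it, so only one connection is open at a time:

import { GetObjectCommand, S3Client } from '@aws-sdk/client-s3';
import { Readable } from 'stream';

const s3Client = new S3Client({});

// Fully drain one GetObject response into memory before moving on to the next object.
async function readS3ObjectToBuffer(bucket: string, key: string): Promise<Buffer> {
  const { Body } = await s3Client.send(new GetObjectCommand({ Bucket: bucket, Key: key }));
  const chunks: Buffer[] = [];
  for await (const chunk of Body as Readable) {
    chunks.push(Buffer.from(chunk));
  }
  return Buffer.concat(chunks);
}

// In main(), instead of appending the still-open stream:
// zip.append(await readS3ObjectToBuffer(BUCKET, `${i}.dat`), { name: `${i}.dat` });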
Seems to have the same behavior with the Kinesis client and PutRecords.
Maybe not related 🤷♂️, but I downgraded the version of @aws-sdk and it works.
I'm seeing that too with the Kinesis client.
Version 3.128.0 gives me: InvalidSignatureException: 'Host' or ':authority' must be a 'SignedHeader' in the AWS Authorization.
Version 3.121.0 silently exits after sending a GetRecordsCommand().
Version 3.39.0 works as before.
I have a similar issue where the upload of a file of around 40 MB fails without any message. I am using Node 14 and SDK v3.
I was able to make the upload work by setting queueSize: 1:
const command = new Upload({
  client,
  params: {
    Bucket,
    Key: filePath,
    Body: body,
    ContentEncoding: 'gzip',
  },
  queueSize: 1,
})
In fact I still have the issue from time to time even with queueSize: 1. I tried with SDK v2 and it is much faster and more reliable.
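To narrow down where a silent failure happens, Upload from @aws-sdk/lib-storage emits an httpUploadProgress event. A small diagnostic sketch; the bucket, key, and part settings below are illustrative assumptions:

import { S3Client } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';
import { createReadStream } from 'fs';

async function uploadWithProgress() {
  const upload = new Upload({
    client: new S3Client({}),
    params: { Bucket: 'my-bucket', Key: 'my-file.gz', Body: createReadStream('my-file.gz') },
    queueSize: 1,              // one part in flight at a time
    partSize: 5 * 1024 * 1024, // 5 MiB parts (the multipart minimum)
  });

  // Log progress so a stalled upload is visible instead of the process just exiting.
  upload.on('httpUploadProgress', (progress) => {
    console.log('uploaded', progress.loaded, 'of', progress.total, 'bytes');
  });

  await upload.done();
}

uploadWithProgress().catch(console.error);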
Hey everyone, apologies that I wasn't able to take this issue up earlier and that it took this much time.
Running the code mentioned above, it seems to be working fine with the latest version of the SDK, 3.182.0.
The output from the above code:
appending files
opening 100.dat
opening 101.dat
opening 102.dat
opening 103.dat
opening 104.dat
opening 105.dat
opening 106.dat
opening 107.dat
opening 108.dat
opening 109.dat
opening 110.dat
opening 111.dat
opening 112.dat
opening 113.dat
opening 114.dat
opening 115.dat
opening 116.dat
opening 117.dat
opening 118.dat
opening 119.dat
opening 120.dat
opening 121.dat
opening 122.dat
opening 123.dat
opening 124.dat
opening 125.dat
opening 126.dat
opening 127.dat
opening 128.dat
opening 129.dat
opening 130.dat
opening 131.dat
opening 132.dat
opening 133.dat
opening 134.dat
opening 135.dat
opening 136.dat
opening 137.dat
opening 138.dat
opening 139.dat
opening 140.dat
opening 141.dat
opening 142.dat
opening 143.dat
opening 144.dat
opening 145.dat
opening 146.dat
opening 147.dat
opening 148.dat
opening 149.dat
opening 150.dat
opening 151.dat
opening 152.dat
opening 153.dat
opening 154.dat
opening 155.dat
opening 156.dat
opening 157.dat
opening 158.dat
opening 159.dat
opening 160.dat
opening 161.dat
opening 162.dat
opening 163.dat
opening 164.dat
opening 165.dat
opening 166.dat
opening 167.dat
opening 168.dat
opening 169.dat
opening 170.dat
opening 171.dat
opening 172.dat
opening 173.dat
opening 174.dat
opening 175.dat
opening 176.dat
opening 177.dat
opening 178.dat
opening 179.dat
opening 180.dat
opening 181.dat
opening 182.dat
opening 183.dat
opening 184.dat
opening 185.dat
opening 186.dat
opening 187.dat
opening 188.dat
opening 189.dat
opening 190.dat
opening 191.dat
opening 192.dat
opening 193.dat
opening 194.dat
opening 195.dat
opening 196.dat
opening 197.dat
opening 198.dat
opening 199.dat
waiting zip completion
In fact I have the same issue with SDK v2. I thought it was working at first, but that was because I was not sending any data (not calling the send() function).
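For context on that pitfall: in SDK v2, s3.upload(params) without a callback returns a ManagedUpload that does not start until .send() or .promise() is called, so awaiting the ManagedUpload itself resolves immediately and uploads nothing. A hedged sketch; the bucket/key names are placeholders:

import * as AWS from 'aws-sdk';
import { PassThrough } from 'stream';

const s3 = new AWS.S3();
const body = new PassThrough(); // whatever stream is being uploaded

// Without a callback, upload() only prepares the request; nothing is sent yet.
const managedUpload = s3.upload({ Bucket: 'my-bucket', Key: 'my-archive.zip', Body: body });

// .promise() calls send() under the hood and resolves when the upload completes.
managedUpload
  .promise()
  .then((result) => console.log('uploaded to', result.Location))
  .catch(console.error);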
This issue has not received a response in 1 week. If you still think there is a problem, please leave a comment to prevent the issue from closing automatically.
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs and link to relevant comments in this thread.