sharp
Cannot pipe output to S3 with @aws-sdk/lib-storage.
Possible bug
```
Error: Premature close
    at new NodeError (node:internal/errors:372:5)
    at Sharp.onclose (node:internal/streams/end-of-stream:136:30)
    at Sharp.emit (node:events:527:28)
    at Object.<anonymous> (/usr/src/app/node_modules/sharp/lib/output.js:1144:16) {
  code: 'ERR_STREAM_PREMATURE_CLOSE'
}
```
Is this a possible bug in a feature of sharp, unrelated to installation?
- [X] Running `npm install sharp` completes without error.
- [X] Running `node -e "require('sharp')"` completes without error.
Are you using the latest version of sharp?
- [X] I am using the latest version of sharp as reported by `npm view sharp dist-tags.latest`.
If you are using another package which depends on a version of sharp
that is not the latest, please open an issue against that package instead.
What is the output of running `npx envinfo --binaries --system --npmPackages=sharp --npmGlobalPackages=sharp`?
```
System:
  OS: Windows 10 10.0.22000
  CPU: (12) x64 AMD Ryzen 5 2600 Six-Core Processor
  Memory: 1.19 GB / 15.95 GB
Binaries:
  Node: 17.9.0 - C:\ProgramData\chocolatey\bin\node.EXE
  Yarn: 1.22.19 - C:\ProgramData\chocolatey\bin\yarn.CMD
  npm: 8.5.5 - C:\ProgramData\chocolatey\bin\npm.CMD
npmPackages:
  sharp: ^0.30.7 => 0.30.7
```
What are the steps to reproduce?
Use the AWS Node.js SDK v3 to stream the image output to a bucket: pipe the sharp output into the `Body` of an `Upload` instance.
What is the expected behaviour?
It successfully uploads to S3 without error.
Please provide a minimal, standalone code sample, without other dependencies, that demonstrates this problem
This only happens with the AWS SDK v3 + sharp together, so it's hard to replicate without them.
```js
import fs from 'fs';
import sharp from 'sharp';
import AWS from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';

const resizer = sharp().resize({ fit: sharp.fit.cover, height: 800, width: 800 }).avif();

const upload = new Upload({
  client: new AWS.S3({
    region: 'us-west-000',
    endpoint: 'https://s3.us-west-000.backblazeb2.com',
    credentials: {
      accessKeyId: '[REDACTED]',
      secretAccessKey: '[REDACTED]',
    },
  }),
  params: {
    Bucket: 'bucket-name',
    Key: 'test.avif',
    Body: fs.createReadStream('./input.png').pipe(resizer),
  },
});

await upload.done();
```
Please provide sample image(s) that help explain this problem
N/A
Removing `this.emit('close');` from the finish event handler on the erroring line in output.js in sharp fixes it.
Please can you add some error handling here:

```js
fs.createReadStream('./input.png').pipe(resizer)
```

My best guess would be that something is going wrong on the input side.
There doesn't seem to be any error on the input stream. I've also tried input streams from both the fs lib and the fastify-multipart library and it's the same issue.
> Node: 17.9.0 - C:\ProgramData\chocolatey\bin\node.EXE

This version of Node.js is EOL; please can you upgrade to the latest version 18 release and see if you have the same problem.
This still occurs with v18.7.0
Thanks for confirming, I asked as there have been a number of stream-related changes to Node.js throughout versions 17 and 18.
Do you only see this problem on Windows? Are you able to test on e.g. Linux?
Looking at the source of the @aws-sdk/lib-storage package, it appears to use async generators, a newer approach to reading from a Stream that I suspect is less well tested for edge cases.
https://github.com/aws/aws-sdk-js-v3/blob/main/lib/lib-storage/src/chunks/getChunkStream.ts
The next step is to try to replicate the same problem without @aws-sdk/lib-storage, to help determine whether this is a bug in sharp, the @aws-sdk/lib-storage package, or Node.js itself. I had a quick try but couldn't replicate it locally.
@BlakeB415 we have a similar flow in our app, and we used a `PassThrough` stream to work around this same issue:
```js
import { PassThrough } from "stream";
import sharp from "sharp";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

async function doUpload(data, fileName) {
  const uploadStream = new PassThrough(); // <-------- HERE
  const imageStream = sharp()
    .resize({
      width: 1400,
      height: 1400,
      fit: "inside",
      withoutEnlargement: true,
    })
    .jpeg({
      force: true,
      mozjpeg: true,
      optimizeCoding: true,
      progressive: true,
      quality: 80,
    });
  const s3Client = new S3Client({
    apiVersion: "v4",
    credentials: {
      accessKeyId: [REDACTED],
      secretAccessKey: [REDACTED],
    },
    endpoint: [REDACTED],
    region: "auto",
  });
  const upload = new Upload({
    client: s3Client,
    queueSize: 1,
    params: {
      Bucket: [REDACTED],
      ContentType: "image/jpeg",
      Key: [REDACTED],
      Body: uploadStream, // <-------- HERE
    },
  });
  data.pipe(imageStream).pipe(uploadStream); // <-------- HERE
  const results = await upload.done();
  return results;
}
```
We're using a newer version of the S3 client, and in the function here `data` is our file stream in question.
Also, anecdotally, the S3 client behaves strangely with streams in general, and we've found that our two options are to (1) pass it the file stream directly, and nothing else, or (2) give it a `PassThrough` and use that as the end of another pipe (as shown in this example).
Hope this helps.
@BlakeB415 Were you able to make any progress with this?
Removing this line worked for me. https://github.com/lovell/sharp/blob/main/lib/output.js#L1208
@BlakeB415 Thanks for the update, please can you try the following change to see if that fixes it also:

```diff
--- a/lib/constructor.js
+++ b/lib/constructor.js
@@ -168,7 +168,7 @@ const Sharp = function (input, options) {
   if (!(this instanceof Sharp)) {
     return new Sharp(input, options);
   }
-  stream.Duplex.call(this);
+  stream.Duplex.call(this, { emitClose: false });
   this.options = {
     // resize options
     topOffsetPre: -1,
```

I think this might relate to https://github.com/nodejs/node/issues/32965 - it looks like we need to notify Node.js that sharp will emit the `close` event itself.
I still get the error with this patch.
@BlakeB415 Thanks for checking. Here's an alternative approach, which ensures the `close` event waits for the `end` event. If you're able to test it with your S3 code that would be great.

```diff
- this.emit('close');
+ this.on('end', () => this.emit('close'));
```
This worked flawlessly!
Thanks for confirming, updated via commit https://github.com/lovell/sharp/commit/70e6bb016271ae943d1f7695de8cef7559ffbb1a and this will be in v0.31.1.
v0.31.1 now available