Angular 19 PutObjectRequest readableStream.getReader is not a function
Checkboxes for prior research
- [x] I've gone through Developer Guide and API reference
- [x] I've checked AWS Forums and StackOverflow.
- [x] I've searched for previous similar issues and didn't find any solution.
Describe the bug
Hi, I have an Angular application that I recently updated to version 19, and during this migration I also updated @aws-sdk/client-s3 to version 3.731.1.
Before the migration the following code worked fine, allowing me to upload files to S3. However, the same code now throws the exception TypeError: readableStream.getReader is not a function.
```ts
store(filename: string, data: Blob, folder: string): Promise<string> {
  const weakThis = this;
  return new Promise<string>(async (resolve, reject) => {
    const key = `${folder}/${filename}`;
    try {
      const input: PutObjectRequest = {
        ACL: 'public-read',
        Bucket: this.awsBucket,
        Key: key,
        Body: data,
      };
      const command = new PutObjectCommand(input);
      const resp = await weakThis.client.send(command);
      if (resp.$metadata.httpStatusCode < 200 || resp.$metadata.httpStatusCode > 299) {
        this.logger.error(`[AwsFileService][store] Error storing file at path ${key}`);
        this.logger.error(`[AwsFileService][store] HTTP Status ${resp.$metadata.httpStatusCode}`);
        reject(resp.$metadata.httpStatusCode);
        return;
      }
      resolve(key);
    } catch (error) {
      console.log(`[AwsFileService][store] Error storing file at path ${key}`);
      this.logger.error(`[AwsFileService][store] Error storing file at path ${key}`);
      console.log(`[AwsFileService][store] ${error}`);
      this.logger.error(`[AwsFileService][store] ${error}`);
      reject(error);
    }
  });
}
```
Interestingly, when I run the Angular application via ng serve this code works as expected. However, when I build a production copy of the application via ng build and then run it as a standard Node.js application, the issue occurs.
Environment:
- Node.js: 22.12.0
- Angular: 19.1.2
- TypeScript: 5.7.3
- @aws-sdk/client-s3: 3.731.1
Regression Issue
- [ ] Select this option if this issue appears to be a regression.
SDK version number
@aws-sdk/client-s3@3.731.1
Which JavaScript Runtime is this issue in?
Node.js
Details of the browser/Node.js/ReactNative version
22.12.0
Reproduction Steps
- Create a typical Angular 19 application
- Install @aws-sdk/client-s3
- Add the code above from the description
- Test 1: run ng serve
- Test 2: run ng build
Observed Behavior
Test 1: File uploads successfully.
Test 2: Exception is thrown.
Expected Behavior
Test 1: File uploads successfully.
Test 2: File uploads successfully.
Possible Solution
No response
Additional Information/Context
No response
UPDATE
I am thinking the original Test 1 scenario is not valid. Since posting this I changed the version of @aws-sdk/client-s3 back to 3.32.0 (the last known working version), and now I'm able to replicate the issue regardless of how the app is built. I have switched back to 3.731.1 and am able to replicate the issue with both ng serve and ng build.
+1
I have faced this issue and reverted my package version to a pinned 3.617.0 (from the range ^3.617.0).
After debugging, I realized that it's due to the https://github.com/aws/aws-sdk-js-v3/releases/tag/v3.729.0 minor version update.
As it's breaking the flow, could anyone please provide steps or configuration to avoid this issue?
Thank you for replying! While the latest version of the SDK is still broken, you pointing me to version 3.617.0 has allowed me to patch the issue so my clients can now upload photos. Short term fix for sure, but at least we can upload photos again!
The latest stable version where it works is 3.726.1.
It starts failing with release 3.729.0.
I think it could be this commit: https://github.com/aws/aws-sdk-js-v3/commit/b6204f8b03f81157efd5b6d7ee746c578dec4160
Code example putting an object to the bucket:
```ts
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3'

const client = new S3Client({})
const command = new PutObjectCommand({
  Bucket: 'bucketName',
  Key: 'key',
  Body: file, // a browser File object
  ACL: 'public-read',
  ContentType: file.type
})
await client.send(command) // readableStream.getReader is not a function at getAwsChunkedEncodingStream
```
Error message:
readableStream.getReader is not a function at getAwsChunkedEncodingStream
Hi @mike-appvision - thanks for reaching out and for your patience while we look into it.
I noticed you mentioned files were uploaded successfully on Test 1 and failed on Test 2. Could you elaborate on that?
During my attempt to replicate the issue, I was able to reproduce the error in the first few uploads, and then it started to upload successfully. Here's my repro for reference: https://github.com/aBurmeseDev/aws-sdk-js-s3-angular
In the meantime, here are a few workarounds we'd like to suggest:
- The root cause may be the recent change to default data integrity protections with S3. Although it's not recommended, as a workaround you can disable checksum calculation via client configuration (a combined end-to-end sketch also follows at the end of this comment):
```ts
const client = new S3({
  // ...
  requestChecksumCalculation: "WHEN_REQUIRED",
  // ...
});
```
- You can convert the input file into one of these types: string | Uint8Array | ReadableStreamOptionalType | BlobOptionalType (ref), as shown below.
```ts
const fileArrayBuffer = await file.arrayBuffer(); // Convert File to ArrayBuffer
const command = new PutObjectCommand({
  Bucket: 'Bucket_Name',
  Key: file.name,
  Body: new Uint8Array(fileArrayBuffer), // Convert ArrayBuffer to Uint8Array
  ContentType: file.type,
});
```
- ~~You may also be able to provide a polyfill for the API being used.~~
Hope it helps! cc: @partheev @wis-dev
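Putting the two suggestions above together, here is a minimal end-to-end sketch (not from the reporter's code; the region, bucket name, and upload function are placeholder assumptions):

```ts
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

// Sketch only: combines the two workarounds above.
// "us-east-1" and "example-bucket" are placeholders.
const client = new S3Client({
  region: "us-east-1",
  // Workaround 1: only compute checksums when an operation requires them,
  // so the flexible-checksums middleware doesn't wrap the browser File
  // in an aws-chunked stream.
  requestChecksumCalculation: "WHEN_REQUIRED",
});

async function uploadFile(file: File): Promise<void> {
  // Workaround 2: hand the SDK a Uint8Array instead of the File itself.
  const bytes = new Uint8Array(await file.arrayBuffer());
  await client.send(
    new PutObjectCommand({
      Bucket: "example-bucket",
      Key: file.name,
      Body: bytes,
      ContentType: file.type,
    })
  );
}
```

Based on the comments in this thread, either change alone appears to be enough to avoid the failing getReader call; applying both is belt and suspenders.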
This is happening with file uploads, since requestBody is a File object in flexible checksums middleware.
https://github.com/aws/aws-sdk-js-v3/blob/9f9fe77b9f8e24ce2cad7e42ca58e2466bc969b2/packages/middleware-flexible-checksums/src/flexibleChecksumsMiddleware.ts#L122
The implementation expects it to be a ReadableStream.
https://github.com/smithy-lang/smithy-typescript/blob/fbe3c04b5627a8aea693b5bfc1598adbac0213d5/packages/util-stream/src/getAwsChunkedEncodingStream.browser.ts#L22
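That mismatch also suggests a third, untested workaround: since ReadableStreamOptionalType is in the accepted Body union, you may be able to hand the middleware the stream it expects by calling the standard Blob.prototype.stream() yourself. A hedged sketch (the client, region, and bucket name are placeholders, and browser support for streaming request bodies varies):

```ts
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const client = new S3Client({ region: "us-east-1" }); // placeholder region

async function putAsStream(file: File): Promise<void> {
  await client.send(
    new PutObjectCommand({
      Bucket: "example-bucket", // placeholder bucket
      Key: file.name,
      // File inherits Blob.prototype.stream(), which returns the web
      // ReadableStream that getAwsChunkedEncodingStream expects.
      Body: file.stream(),
      ContentLength: file.size, // streams carry no length, but the File knows it
      ContentType: file.type,
    })
  );
}
```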
As @aBurmeseDev mentioned, the simplest workaround at the time of comment is to disable checksum computation by setting the following configuration during client creation
```ts
const client = new S3({
  // ... other params
  // ToDo: Remove workaround once https://github.com/aws/aws-sdk-js-v3/issues/6834 is fixed.
  requestChecksumCalculation: "WHEN_REQUIRED",
});
```
If the files are not huge, you can convert them to ArrayBuffer too. It's going to read all bytes into memory though, as per specification.
```ts
// ...
await client.putObject({
  Bucket: import.meta.env.VITE_AWS_S3_BUCKET_NAME,
  Key: file.name,
  Body: await file.arrayBuffer(),
});
// ...
```
I'll raise this issue internally with the team on Mon, Jan 27th, and provide an update.
We're still looking into this issue, but no updates. Internal tracking ID is JS-5699.
I use Angular 18 and encountered the same issue. As @wis-dev said, 3.726.1 works.
This problem also occurs in a React.js + Next.js environment.
An error occurs when uploading an image to S3 from the browser.
- Node.js: 20.18.2
- React.js: 18.3.1
- Next.js: 13.5.8
@aws-sdk/client-s3: It worked normally in 3.726.1, but an error occurs in later versions.
- v3.731.1 ~ 3.744.0
```js
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3')
const { fromCognitoIdentityPool } = require('@aws-sdk/credential-provider-cognito-identity')
const { CognitoIdentityClient } = require('@aws-sdk/client-cognito-identity')

const client = new S3Client({
  region,
  credentials: fromCognitoIdentityPool({
    client: new CognitoIdentityClient({ region }),
    identityPoolId: process.env.S3_CLIENT_POOL_ID
  })
})

client.send(new PutObjectCommand({
  Bucket,
  Key,
  Body,
  ContentType,
  Metadata
}))
```
Error
message: "readableStream.getReader is not a function"
stack: "TypeError: readableStream.getReader is not a function
at getAwsChunkedEncodingStream (webpack-internal:///../node_modules/@smithy/util-stream/dist-es/getAwsChunkedEncodingStream.browser.js:13:35)
at eval (webpack-internal:///../node_modules/@aws-sdk/middleware-flexible-checksums/dist-es/flexibleChecksumsMiddleware.js:83:27)
Facing this error in React 19 (Vite) as well:
TypeError: readableStream.getReader is not a function
@aws-sdk/client-s3 version is 3.749.0
Facing a similar issue in our NestJS application as well.
The following versions are being used in our application:
- Node: 20.9.0
- NestJS: 10.2.1
- @aws-sdk/client-s3: 3.750.0
Users are not able to upload zip files to the S3 bucket, and PutObjectCommand throws the following error:
```
An error was encountered in a non-retryable streaming request.
InvalidChunkSizeError: Only the last chunk is allowed to have a size less than 8192 bytes
```
It worked normally in version 3.726.1 of @aws-sdk/client-s3, but the error occurs in later versions:
- v3.729.0 ~ 3.758.0
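For anyone hitting this on Node, the in-memory suggestion from earlier in the thread can be adapted there too. A hedged sketch (the path, bucket, and key are placeholders): reading the whole zip into a Buffer gives the SDK a fixed-length body, so it can set the checksum header up front instead of streaming aws-chunked pieces.

```ts
import { readFile } from "node:fs/promises";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const client = new S3Client({ region: "us-east-1" }); // placeholder region

async function uploadZip(path: string): Promise<void> {
  // Reads the entire zip into memory, so only suitable for reasonably
  // small archives (same caveat as the arrayBuffer() workaround above).
  const body = await readFile(path); // Buffer
  await client.send(
    new PutObjectCommand({
      Bucket: "example-bucket", // placeholder bucket
      Key: "archive.zip", // placeholder key
      Body: body,
      ContentType: "application/zip",
    })
  );
}
```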
For googlers, I experienced this uploading files from Cloudflare Workers. Downgrading and pinning at 3.726.1 was enough.
Still occurring in 3.806.0.
Still occurring in 3.826.0
Still occurs for me as of 3.844.0. I did try the workaround below, but it's not ideal. When do you all think we can see an actual solution to the issue?
https://github.com/aws/aws-sdk-js-v3/issues/6834#issuecomment-2613306914
This issue still occurs in v3.855.0 as well, FYI.
This issue still occurs in v3.864.0 as well, FYI.
The issue still exists in v3.864.0.
Version "@aws-sdk/client-s3": "^3.726.1" does not work.
Angular 17.3.10
Node v24.3.0
```
TypeError: readableStream.getReader is not a function
    at getAwsChunkedEncodingStream (getAwsChunkedEncodingStream.browser.js:9:35)
    at flexibleChecksumsMiddleware.js:69:27
    at Generator.next (<anonymous>)
    at asyncGeneratorStep (asyncToGenerator.js:3:1)
    at _next (asyncToGenerator.js:22:1)
    at _ZoneDelegate.invoke (zone.js:365:28)
    at Object.onInvoke (core.mjs:14882:33)
    at _ZoneDelegate.invoke (zone.js:364:34)
    at ZoneImpl.run (zone.js:111:43)
    at zone.js:2447:40
```