aws-sdk-js-v3
"RangeError: Out of Memory" for large file uploads on iOS devices
Pre-Migration Checklist
- [X] I've read the Migration Guide.
- [X] I've reviewed the upgrading notes and major version differences mentioned in UPGRADING.md.
- [X] I've checked AWS Forums and StackOverflow for similar migration issues.
Which JavaScript Runtime is this issue in?
Browser
AWS Lambda Usage
- [ ] Yes, my application is running on AWS Lambda.
- [ ] No, my application is not running on AWS Lambda.
Describe the Migration Issue
In V3, whenever I try uploading a large file on an iOS device, the upload fails with the error "RangeError: Out of Memory". The entire file is loaded into memory before being split into chunks.
macOS does the same thing, but since those machines have a lot more memory the error doesn't happen there.
Windows works fine, and so do Android devices.
In V2 this issue does not happen.
chunker.js
import { Buffer } from "buffer";
import { Readable } from "stream";
import { getChunkStream } from "./chunks/getChunkStream";
import { getChunkUint8Array } from "./chunks/getChunkUint8Array";
import { getDataReadable } from "./chunks/getDataReadable";
import { getDataReadableStream } from "./chunks/getDataReadableStream";

export const getChunk = (data, partSize) => {
  if (data instanceof Uint8Array) {
    return getChunkUint8Array(data, partSize);
  }
  if (data instanceof Readable) {
    return getChunkStream(data, partSize, getDataReadable);
  }
  if (data instanceof String || typeof data === "string") {
    return getChunkUint8Array(Buffer.from(data), partSize);
  }
  if (typeof data.stream === "function") {
    return getChunkStream(data.stream(), partSize, getDataReadableStream);
  }
  if (data instanceof ReadableStream) {
    return getChunkStream(data, partSize, getDataReadableStream);
  }
  throw new Error("Body Data is unsupported format, expected data to be one of: string | Uint8Array | Buffer | Readable | ReadableStream | Blob;.");
};
The issue seems to happen here, on data.stream(), which behaves differently on Safari. Safari makes a blob request that loads the whole file into memory, and on iPhone/iPad this either crashes the page or fails to chunk and upload the file.
if (typeof data.stream === "function") {
  return getChunkStream(data.stream(), partSize, getDataReadableStream);
}
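For illustration only (this is not SDK code; the helper name and part size are made up), a Blob can be chunked without reading the whole file up front by slicing it lazily, which is roughly what I would expect to happen on Safari as well:

// Illustrative helper: yields fixed-size chunks from a Blob/File by slicing
// lazily, so only one part is held in memory at a time.
// Assumes Blob.prototype.arrayBuffer() is available (Safari 14+).
async function* sliceBlobIntoChunks(blob: Blob, partSize: number): AsyncGenerator<Uint8Array> {
  for (let offset = 0; offset < blob.size; offset += partSize) {
    const slice = blob.slice(offset, Math.min(offset + partSize, blob.size));
    yield new Uint8Array(await slice.arrayBuffer());
  }
}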
Code Comparison
V3 Code:
uploadMedia(trackProgress: (progress: number) => void): Observable<any> {
  this.currentUpload = new Upload({
    client: this.generateClient(),
    params: this.generateClientCommand().input,
  })
  this.currentUpload.on('httpUploadProgress', (progress) => {
    trackProgress(Math.round(progress.loaded / progress.total * 100))
  });
  return from(this.currentUpload.done())
}

private generateClient(): S3Client {
  return new S3Client({
    region: ENV.s3.region,
    credentials: this.credentials,
    useAccelerateEndpoint: ENV.s3.useAccelerated,
    requestHandler: new FetchHttpHandler({
      requestTimeout: 0,
    }),
    maxAttempts: 10000
  });
}

private generateClientCommand(): PutObjectCommand {
  return new PutObjectCommand({
    Bucket: ENV.s3.bucket,
    Key: this.generateBucketKey(),
    Body: this.file,
    ACL: 'private',
    ContentType: this.file.type
  });
}
V2 Code:
const options: Types.ClientConfiguration = {
  ...this.userService.adtKeys,
  region: ENV.s3.region,
  correctClockSkew: true,
  maxRetries: 100000
};

if (ENV.s3.useAccelerated) {
  options.useAccelerateEndpoint = true;
}

const bucket = new S3(options);

const params = {
  Bucket: ENV.s3.bucket,
  Key: this.userService.userUuid + '/' + this.$videoUploader.uploadedFileName,
  Body: this.$videoUploader.file,
  ACL: 'private',
  ContentType: this.$videoUploader.contentType
};

AWS.config.httpOptions.timeout = 0;

this.$videoUploader.upload = bucket.upload(params);
this.$videoUploader.upload.send((err, data) => {
  if (err && !this.$videoUploader.abortUpload) {
    this.toastrService.error(
      this.translate.fail_upload,
      this.translate.network_error,
      this.toastrConfig
    );
    this.resetVideoUploader();
    return false;
  }
  this.$videoUploader.status = 'complete';
  return true;
});

this.$videoUploader.upload.on('httpUploadProgress', (evt) => {
  this.$videoUploader.progress = Math.round(evt.loaded * 100 / evt.total);
});
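(For comparison, V2's upload() also accepts ManagedUpload options as an optional second argument to control part size and concurrency; the values below are illustrative, not what the code above uses.)

// V2: optional ManagedUpload options control part size and concurrency.
this.$videoUploader.upload = bucket.upload(params, {
  partSize: 10 * 1024 * 1024, // illustrative: 10 MB parts
  queueSize: 4                // illustrative: up to 4 parts in flight at once
});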
Observed Differences/Errors
Additional Context
Angular app. "@angular/core": "16.2.7"
AWS library versions for V3: "@aws-sdk/client-s3": "^3.617.0", "@aws-sdk/lib-storage": "^3.617.0", "@aws-sdk/types": "^3.609.0", "@smithy/fetch-http-handler": "^3.2.4"
AWS library version for V2: "aws-sdk": "2.1466.0"
Hi @atataru23 - thanks for reaching out.
The error indicates that the entire file is being loaded into memory before it can be split into smaller chunks. This can lead to memory issues, especially on devices with limited resources such as iOS devices.
The key distinction between AWS SDK for JavaScript V2 and V3 lies in how they handle large file uploads. In V2, the S3 client provides the ManagedUpload class, whose upload() operation supports uploading large objects using S3's multipart upload feature. This allows large files to be handled efficiently by splitting them into smaller chunks and uploading them in parallel.
In V3, the @aws-sdk/lib-storage library was introduced, which provides functionality similar to the upload() operation in V2, but with additional features and support for both Node.js and browser runtimes. The lib-storage library is designed to handle large file uploads efficiently, including multipart uploads and automatic chunk splitting.
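As a rough sketch of how that looks with lib-storage (partSize, queueSize, and leavePartsOnError are documented Upload options; the function name and the 10 MB / 4-part values are only illustrative), limiting part size and concurrency bounds how much of the file is buffered at once:

import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

// Sketch: bound memory use by limiting part size and upload concurrency.
async function uploadWithBoundedMemory(client: S3Client, bucket: string, key: string, file: File) {
  const upload = new Upload({
    client,
    params: { Bucket: bucket, Key: key, Body: file, ContentType: file.type },
    partSize: 10 * 1024 * 1024, // 10 MB parts (S3's minimum part size is 5 MB)
    queueSize: 4,               // number of parts uploaded concurrently
    leavePartsOnError: false,   // abort and clean up uploaded parts on failure
  });
  upload.on("httpUploadProgress", (p) => console.log(p.loaded, p.total));
  return upload.done();
}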
Depending on the runtime you're using, you can either use the Upload class from @aws-sdk/lib-storage or follow the S3 multipart upload documentation to handle large file uploads efficiently while minimizing memory usage.
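If you go the manual route, here is a minimal sketch of the multipart flow (the command names come from @aws-sdk/client-s3; the helper name, part size, and error handling are simplified illustrations) that slices the Blob lazily so at most one part is read into memory at a time:

import {
  S3Client,
  CreateMultipartUploadCommand,
  UploadPartCommand,
  CompleteMultipartUploadCommand,
  AbortMultipartUploadCommand,
} from "@aws-sdk/client-s3";

// Sketch: manual multipart upload that reads one slice of the file at a time.
async function multipartUploadBlob(client: S3Client, Bucket: string, Key: string, file: Blob) {
  const partSize = 10 * 1024 * 1024; // illustrative 10 MB parts (S3 minimum is 5 MB)
  const { UploadId } = await client.send(
    new CreateMultipartUploadCommand({ Bucket, Key, ContentType: file.type })
  );
  try {
    const parts: { ETag?: string; PartNumber: number }[] = [];
    for (let offset = 0, partNumber = 1; offset < file.size; offset += partSize, partNumber++) {
      // slice() is lazy; only this part is materialized by arrayBuffer().
      const body = new Uint8Array(await file.slice(offset, offset + partSize).arrayBuffer());
      const { ETag } = await client.send(
        new UploadPartCommand({ Bucket, Key, UploadId, PartNumber: partNumber, Body: body })
      );
      parts.push({ ETag, PartNumber: partNumber });
    }
    return await client.send(
      new CompleteMultipartUploadCommand({ Bucket, Key, UploadId, MultipartUpload: { Parts: parts } })
    );
  } catch (err) {
    // On failure, abort so S3 does not keep the incomplete parts around.
    await client.send(new AbortMultipartUploadCommand({ Bucket, Key, UploadId }));
    throw err;
  }
}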
Hope that helps, John
Hello @aBurmeseDev
This is my implementation using @aws-sdk/lib-storage in V3. I'll leave only the relevant parts of the code.
Runtime: Browser
package.json
"@angular/core": "16.2.7",
"@aws-sdk/client-s3": "^3.617.0",
"@aws-sdk/lib-storage": "^3.617.0",
"@aws-sdk/types": "^3.609.0",
"@smithy/fetch-http-handler": "^3.2.4",
Script responsible for uploading files to S3. (relevant parts only)
import { PutObjectCommand, S3Client } from "@aws-sdk/client-s3";
import { FetchHttpHandler } from "@smithy/fetch-http-handler";
import { Upload } from "@aws-sdk/lib-storage";
import { from, Observable } from "rxjs";

export class S3ClientClass {
  constructor() {
  }

  uploadMedia(trackProgress: (progress: number) => void): Observable<any> {
    this.currentUpload = new Upload({
      client: this.generateClient(),
      params: this.generateClientCommand().input,
    })
    this.currentUpload.on('httpUploadProgress', (progress) => {
      trackProgress(Math.round(progress.loaded / progress.total * 100))
    });
    return from(this.currentUpload.done())
  }

  private generateClient(): S3Client {
    return new S3Client({
      region: ENV.s3.region,
      credentials: this.credentials,
      useAccelerateEndpoint: ENV.s3.useAccelerated,
      requestHandler: new FetchHttpHandler({
        requestTimeout: 0,
      }),
      maxAttempts: 10000
    });
  }

  private generateClientCommand(): PutObjectCommand {
    return new PutObjectCommand({
      Bucket: ENV.s3.bucket,
      Key: this.generateBucketKey(),
      Body: this.file,
      ACL: 'private',
      ContentType: contentType
    });
  }
}
And where I want to use the upload, I instantiate the class with the relevant data, then call the uploadMedia() method like so:

this.$upload = this.s3Client.uploadMedia(
  (progress: number) => this.progress = progress
).subscribe(() => /* other logic */);
This implementation works as expected on desktop and Android devices. Large files automatically get split into chunks and uploaded to S3.
But on iOS devices, it makes a blob request that loads the entire file into memory before it is split into chunks, which causes the error "RangeError: Out of Memory". The same happens on macOS: the entire file is loaded into memory and only then split into chunks, although no error occurs there because there is enough memory.
In some cases on iOS, the browser crashes.
The error always seems to lead here, in the SDK's chunker.js, when using Safari.
if (typeof data.stream === "function") {
  return getChunkStream(data.stream(), partSize, getDataReadableStream);
}
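A hypothetical workaround sketch (untested, and not an official SDK recommendation): wrap the File in a ReadableStream that pulls one slice at a time, so the data instanceof ReadableStream branch is taken instead of data.stream(); the chunk size below is arbitrary.

// Hypothetical workaround: a ReadableStream that reads the file one slice at
// a time, so the whole file is never buffered up front.
function blobToChunkedStream(blob: Blob, chunkSize = 8 * 1024 * 1024): ReadableStream<Uint8Array> {
  let offset = 0;
  return new ReadableStream<Uint8Array>({
    async pull(controller) {
      if (offset >= blob.size) {
        controller.close();
        return;
      }
      const slice = blob.slice(offset, offset + chunkSize);
      offset += chunkSize;
      controller.enqueue(new Uint8Array(await slice.arrayBuffer()));
    },
  });
}

// Usage sketch: pass the wrapped stream as Body instead of the File itself,
// e.g. Body: blobToChunkedStream(this.file) in generateClientCommand().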
Should I move this thread to bugs instead of migration? Or is it something I overlooked?
I appreciate your help @aBurmeseDev !
Hello, the issue described in my last comment is still ongoing. Are there any updates on why it is happening? Let me know if I need to provide any additional info.
Apologies for the long silence here. I'm not sure if you're still working on this, but I've attempted to reproduce it using our test repo and all the large files were uploaded successfully.
When you say this only occurs on iOS devices, are you running in a React or plain browser environment? It's strange that the same file uploads succeed on Android.
Hello,
I haven't worked on this issue since my initial post. I continued using the V2 SDK, as the issue didn't occur there.
This is an Angular project, and the problem occurred with V3 when I tried to upload large video files (up to 10GB) in a web environment on Safari, specifically on iOS devices.