
Large file uploads (> 4 GB) not working in Safari; high memory usage for uploads in Safari

timgcarlson opened this issue 2 years ago • 6 comments

My team really appreciates this library, and overall it's working out great for us. The one issue we're encountering is uploading large zip files (> 4 GB) with Safari on macOS. All other major browsers seem to work just fine, but Safari gives this error as soon as you start the upload of a large file with the uploadToS3 function:

[Error] NotReadableError: The I/O read operation failed.

Uploads from Safari (< 4 GB) also seem to use a lot of system memory compared to Chrome. On Safari, the system memory usage spikes, the fans go wild (pre-M1 MBP), and I get this warning message in the browser: "This webpage is using significant memory. Closing it may improve the responsiveness of your Mac."

Any ideas on what the issue could be here? Should files larger than 4 GB work in Safari? Let me know if there is any other information I can provide to help diagnose the issue if it's not reproducible in other projects.

Thank you!

timgcarlson commented Jan 11 '23 22:01

Hey, thanks for the issue! Not sure, but it sounds like there could be a bug in this library. Could you share your upload code?

ryanto commented Jan 12 '23 14:01

Sure, here's the code relating to the upload. Let me know if there is anything else that could help.

// components/Upload.tsx

import { useS3Upload } from 'next-s3-upload';

// selectedFile, user, appId, and uploadId come from component state/props
const { files, resetFiles, uploadToS3 } = useS3Upload({
  endpoint: '/api/appUpload',
});

const onUpload = async () => {
  try {
    await uploadToS3(selectedFile, {
      endpoint: {
        request: {
          body: {
            userId: user.id,
            appId,
            uploadId,
          },
          headers: {},
        },
      },
    });
  } catch (error) {
    // handle error
    // This is where error.message is "[Error] NotReadableError: The I/O read operation failed." on large files in Safari
  }
};
// pages/api/appUpload.ts

import { NextApiRequest } from 'next';
import { getSession } from 'next-auth/react';
import { APIRoute } from 'next-s3-upload';

export const getS3AppBuildPath = async (req: NextApiRequest) => {
  const { uploadId, userId, appId } = req.body;

  if (!userId || !appId || !uploadId) {
    throw new Error('Bad request');
  }

  const session = await getSession({ req });

  if (!session) {
    throw new Error('Not authenticated');
  }

  return `${appId}/${uploadId}/bundle.zip`;
};

export default APIRoute.configure({
  accessKeyId: process.env.S3_UPLOAD_KEY,
  secretAccessKey: process.env.S3_UPLOAD_SECRET,
  bucket: process.env.S3_APP_UPLOAD_BUCKET,
  region: process.env.S3_UPLOAD_REGION,
  async key(req: NextApiRequest) {
    return await getS3AppBuildPath(req);
  },
});

timgcarlson commented Jan 12 '23 18:01

Hmm, OK, your code looks spot on.

I'll try to test it out with Safari and see if I can get you an answer. Sorry you're running into this issue.

ryanto commented Jan 19 '23 01:01

Looks like this is a bug in @aws-sdk/lib-storage, which we use under the hood to do the upload.

  • https://github.com/aws/aws-sdk-js-v3/issues/2365
  • https://github.com/aws/aws-sdk-js-v3/issues/3986

In the first thread, someone had a solution using patch-package. Pretty ugly :(

I'll try to reproduce and post something in those threads.
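
For anyone digging into this, the upload path looks roughly like the following. This is a minimal sketch, not this library's exact internals; the region, bucket, and key are placeholders:

// sketch.ts -- illustrative only
import { S3Client } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';

async function uploadWithLibStorage(file: File) {
  const client = new S3Client({ region: 'us-east-1' });

  // lib-storage's Upload splits the Body into parts and runs a
  // multipart upload under the hood.
  const upload = new Upload({
    client,
    params: {
      Bucket: 'example-bucket',
      Key: `uploads/${file.name}`,
      Body: file, // a browser Blob/File; Safari appears to fail reading it past 4 GB
    },
  });

  await upload.done();
}

If the failure happens inside lib-storage's reading of the Blob, that would explain why it shows up no matter how this library is configured.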

ryanto commented Jan 19 '23 01:01

Also experiencing this, so bumping.

cjjenkinson commented Feb 25 '23 09:02

Could we maybe get away with using multipart upload if the file size is over some threshold N and the browser is Safari?

"Multipart upload allows you to upload a single object as a set of parts. Each part is a contiguous portion of the object's data. You can upload these object parts independently and in any order. If transmission of any part fails, you can retransmit that part without affecting other parts. After all parts of your object are uploaded, Amazon S3 assembles these parts and creates the object. In general, when your object size reaches 100 MB, you should consider using multipart uploads instead of uploading the object in a single operation."

https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html
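
Something like this could gate the fallback. Rough sketch only; the UA check and the 4 GB threshold are illustrative assumptions, not anything next-s3-upload exposes:

// sketch.ts -- hypothetical helper, names are illustrative
const FOUR_GB = 4 * 1024 ** 3;

function needsMultipartFallback(file: File): boolean {
  // Safari's UA string contains "Safari" but not "Chrome"/"Chromium",
  // since Chromium-based browsers also include "Safari" in their UA.
  const isSafari =
    /Safari/.test(navigator.userAgent) &&
    !/Chrom(e|ium)/.test(navigator.userAgent);

  return isSafari && file.size > FOUR_GB;
}

The caller could then pick an upload strategy per file, e.g. if (needsMultipartFallback(file)) { ... }.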

ErikPlachta commented Feb 27 '23 17:02