aws-sdk-js-v3

AWS non-retryable streaming request error

Open gauthampkrishnan opened this issue 1 year ago • 8 comments

Checkboxes for prior research

Describe the bug

I am trying to upload a file with aws putObject, using createReadStream as the body and retryMode set to standard, but on retry it throws the error: non-retryable streaming request error.

Regression Issue

  • [ ] Select this option if this issue appears to be a regression.

SDK version number

@aws-sdk/package-name@version, ...

Which JavaScript Runtime is this issue in?

Node.js

Details of the browser/Node.js/ReactNative version

18.20.4

Reproduction Steps

Try to upload a file using a stream as the body, then force a failure while the file is uploading (a minimal sketch is shown below).
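
For illustration, a minimal sketch of such a setup (bucket, key, and file path are placeholders, not taken from this report):

import { S3 } from "@aws-sdk/client-s3";
import { createReadStream } from "node:fs";

const s3 = new S3({ retryMode: "standard", maxAttempts: 3 });

// Interrupt connectivity (or otherwise force a transient failure) mid-upload;
// instead of retrying, the SDK warns about a non-retryable streaming request and throws.
await s3.putObject({
  Bucket: "example-bucket",                     // placeholder
  Key: "example-key",                           // placeholder
  Body: createReadStream("./example-file.bin"), // placeholder path
});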

Observed Behavior

non-retryable streaming request error

Expected Behavior

I think it shouldn't throw an error; it should retry.

Possible Solution

No response

Additional Information/Context

No response

gauthampkrishnan avatar Jan 02 '25 20:01 gauthampkrishnan

That is normal. The request reads the stream but cannot rewind it if it fails.

kuhe avatar Jan 08 '25 15:01 kuhe

So what is the solution for retrying when using a stream as the body? Is there any solution for this scenario?

gauthampkrishnan avatar Jan 08 '25 15:01 gauthampkrishnan

Would using the Upload() method from lib-storage help in this case? https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/Package/-aws-sdk-lib-storage/ I am guessing this would buffer the parts, and the buffered parts would be retried depending on the client retry config?

gauthampkrishnan avatar Jan 08 '25 15:01 gauthampkrishnan

You can buffer the stream, or:

let attempts = 3;
let stream = ... ;
while (--attempts) {
  try {
    await s3.putObject({ Body: stream });
    break;
  } catch (e) {
    stream = ... ; // rewind/reacquire a fresh stream before retrying
  }
}
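
As an illustration of the buffering option, a minimal sketch that reuses the s3 client and stream from above and reads the whole payload into memory first (only reasonable when it fits in memory; Bucket/Key are placeholders):

// collect the stream into a Buffer so the standard retry strategy can resend it
const chunks = [];
for await (const chunk of stream) {
  chunks.push(chunk);
}

await s3.putObject({
  Bucket: "example-bucket",    // placeholder
  Key: "example-key",          // placeholder
  Body: Buffer.concat(chunks), // a Buffer is rewindable, unlike a consumed stream
});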

kuhe avatar Jan 09 '25 15:01 kuhe

@gauthampkrishnan - Can you share the code you are working on so that we can better assist you? Here are the docs on SDK retry strategies and behavior for your reference: https://github.com/aws/aws-sdk-js-v3/blob/main/supplemental-docs/CLIENTS.md#retry-strategy-retrystrategy-retrymode-maxattempts

aBurmeseDev avatar Jan 09 '25 19:01 aBurmeseDev

@aBurmeseDev the code is simple, for example:

const stream = createReadStream(filePath);
try {
  await s3.putObject({ Body: stream });
} catch (e) {
  console.log(e);
}

The s3 client uses NodeHttpHandler with retry settings.

I am using the NodeHttpHandler and passing in maxRetries, so it retries when the body is a buffer, but with a stream body it throws the error. Based on the conversation above, that looks expected, and I would need to implement a custom function with an error handler in order to retry a stream body. I think the Upload() function from the lib-storage library does the same thing under the hood: it reads the stream and creates buffered parts, and I also believe the buffered parts would be retried on failure. So I guess I will use that, and maybe this issue can be closed.

gauthampkrishnan avatar Jan 09 '25 19:01 gauthampkrishnan

By definition, Upload() allows for efficient uploading of buffers, blobs, or streams, using a configurable amount of concurrency to perform multipart uploads where possible. This enables uploading large files or streams of unknown size, because it uses multipart uploads under the hood. Here are some code examples for Upload() that you might find helpful: https://github.com/aws/aws-sdk-js-v3/tree/main/lib/lib-storage/example-code
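
For illustration, a minimal Upload() sketch along the lines of those examples (bucket, key, and file path are placeholders; each part is sent as an ordinary request through the client, so the client's retry settings such as maxAttempts apply to it):

import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";
import { createReadStream } from "node:fs";

const client = new S3Client({ maxAttempts: 3 }); // retries apply per request, i.e. per part

const upload = new Upload({
  client,
  params: {
    Bucket: "example-bucket",                     // placeholder
    Key: "example-key",                           // placeholder
    Body: createReadStream("./example-file.bin"), // placeholder path
  },
  queueSize: 4,              // concurrent part uploads
  partSize: 5 * 1024 * 1024, // minimum part size is 5 MiB
});

upload.on("httpUploadProgress", (progress) => console.log(progress));

await upload.done();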

Let us know if you need further support.

aBurmeseDev avatar Jan 13 '25 23:01 aBurmeseDev

Yeah, so my doubt is: if I pass a retry config with maxRetries to the S3 client and use Upload() (https://github.com/aws/aws-sdk-js-v3/tree/main/lib/lib-storage/example-code), since it uses multipart upload under the hood, will it retry parts that fail in between? For example, say I am using a stream with Upload(), so Upload() divides the data into parts and uploads them to the bucket; if one of the parts fails, would it retry uploading that part?

gauthampkrishnan avatar Jan 14 '25 13:01 gauthampkrishnan

I'm also seeing this on practically every stream I try to upload from a fetch response, see https://github.com/aws/aws-sdk-js-v3/issues/7048#issuecomment-3150288704. I think priority 3 might be a little low here @aBurmeseDev, especially as no additional debugging info is provided and it happens so consistently 🤔

marcesengel avatar Aug 04 '25 11:08 marcesengel

A somewhat related issue was auto-closed by the GitHub bot: https://github.com/aws/aws-sdk-js-v3/issues/7048

@kuhe can that please be reopened? It's a pretty serious one.

jpike88 avatar Oct 16 '25 07:10 jpike88

#7048 is discussing too many separate issues. Create a new issue for your case.

The separate issues discussed there and here are:

chunk size

"Only the last chunk is allowed to have a size less than 8192 bytes" -> partition your stream in larger chunks or use the SDK's built in stream buffering feature.

new S3Client({
  requestStreamBufferSize: 64 * 1024,
});

non-retryable streaming request

"An error was encountered in a non-retryable streaming request."

This phrase is from a warning message that is logged prior to throwing an error. The actual root error is thrown and you would have to provide that for more information.

"Unable to calculate hash for flowing readable stream".

Don't mix web streams and the Node.js https module. The suggested adapter was:

const nodeStream = Readable.fromWeb(res.body as ReadableWebStream)

If this warns "An error was encountered in a non-retryable streaming request.", again the thrown error must be given for further investigation.
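
For reference, a minimal sketch of that adapter in context (URL, bucket, and key are placeholders; it assumes Node.js 18+ and that the response reports a content-length):

import { Readable } from "node:stream";

const res = await fetch("https://example.com/file.bin");    // placeholder URL
const nodeStream = Readable.fromWeb(res.body);               // adapt the web stream for the Node.js handler

await s3.putObject({
  Bucket: "example-bucket",                                  // placeholder
  Key: "example-key",                                        // placeholder
  Body: nodeStream,
  ContentLength: Number(res.headers.get("content-length")),  // a stream body needs a known length for PutObject
});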

this issue

This still falls under the "non-retryable streaming request" case. What was the thrown error encountered after that warning? That warning is not the error.
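
For anyone trying to capture that information, a minimal sketch (logger is a standard client configuration option; bucket, key, and path are placeholders):

import { S3 } from "@aws-sdk/client-s3";
import { createReadStream } from "node:fs";

const s3 = new S3({ logger: console }); // surfaces SDK warnings, including the non-retryable streaming request message

try {
  await s3.putObject({
    Bucket: "example-bucket",                     // placeholder
    Key: "example-key",                           // placeholder
    Body: createReadStream("./example-file.bin"), // placeholder path
  });
} catch (e) {
  console.error(e); // this thrown error is the root cause; the warning above is only a log message
}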

kuhe avatar Oct 16 '25 18:10 kuhe

This issue has not received a response in 1 week. If you still think there is a problem, please leave a comment to keep the issue from automatically closing.

github-actions[bot] avatar Oct 27 '25 00:10 github-actions[bot]

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs and link to relevant comments in this thread.

github-actions[bot] avatar Nov 15 '25 00:11 github-actions[bot]