forge-api-nodejs-client

uploadResources failing for larger models

Open · amkoehler opened this issue 2 years ago · 0 comments

Hello - we're in the process of updating our code to use uploadResources to upload our model files in chunks. The uploads are consistently failing for larger Revit models; in our testing we've seen this fail for models larger than roughly 200 MB, but that's an estimate.

After uploadResources resolves, it comes back with a MissingOrInvalidParts error, and within the parts there are a lot of Unexpected and Pending messages (see the snippet after the call below for how I'm inspecting them).

Here's how we're calling uploadResources. I tested variations of chunkSize, maxBatches, and useAcceleration, and they all ended up with the same result. We have no issues uploading Autodesk's sample models, which aren't quite this large.

  // objectsApi, buffer, forgeFilePath, and twoLeggedAuthResult come from our earlier setup
  let result;
  try {
    result = await objectsApi.uploadResources(
      'my_model_bucket',
      [
        {
          objectKey: forgeFilePath,
          // ~200 MB model
          data: buffer,
        },
      ],
      {
        // Upload in 5MB chunks
        chunkSize: 5,
        maxBatches: 25,
        minutesExpiration: 9,
        onUploadProgress: progressEvent => {
          console.info(progressEvent);
          // info() is a logging helper defined elsewhere in our app
          info(`APS bucket upload progress: ${progressEvent.progress * 100}%`);
        },
      },
      twoLeggedAuthResult.client,
      twoLeggedAuthResult.credentials,
    );
  } catch (err) {
    console.error(err);
  }
  
  /**
   * result.status: 'error',
   * result.reason: 'MissingOrInvalidParts'
   */
  console.log(result);
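
Since the logged object is fairly large, here's roughly how I'm pulling the per-part messages out of it. The parts field name and the status strings are assumptions based on what shows up in the logged response, not a documented shape:

  // Sketch only: dump the resolved result and pick out the parts that did not
  // complete. "parts", "status", and the status strings are assumptions taken
  // from what the logged object appears to contain.
  console.info(JSON.stringify(result, null, 2));
  const incompleteParts = (result.parts || []).filter(
    part => part.status === 'Unexpected' || part.status === 'Pending',
  );
  console.warn(`${incompleteParts.length} parts did not complete`);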

Any suggestions as to what could be going wrong? I went through the complex example and noticed some SHA-1 verification happening after uploadResources is called, but that doesn't seem related, since the verification only runs after uploadResources resolves.
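
For reference, this is roughly the SHA-1 computation I mean, using Node's built-in crypto module on the same buffer we pass to uploadResources. Comparing it against whatever hash the uploaded object reports is my assumption about what the example is verifying:

  const crypto = require('crypto');

  // Minimal sketch: hash the local buffer so it can be compared with the
  // uploaded object's hash after the upload completes.
  const localSha1 = crypto.createHash('sha1').update(buffer).digest('hex');
  console.info(`local SHA-1: ${localSha1}`);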


amkoehler · Sep 08 '23 16:09