OC publishing fails when using an AWS S3 bucket with a large number of files
Who is the bug affecting?
- Component Creators
What is affected by this bug?
- Component publishing fails
When does this occur?
- When publishing a component to oc
What platform does this occur on?
- This occurs when using AWS S3 for storing components
How do we replicate the issue?
- Try to upload/publish a component with more than 300 files.
Expected behavior (i.e. solution)
- The component should publish successfully to oc.
What version of OC, Node.js and OS are you using?
- [email protected]
- Node 16.X.X
- Linux
Other Comments
- Upon investigating, we found that the oc-s3-storage-adapter putDir method creates a new s3Client for each file, and may be hitting a limit on obtaining a valid token for S3 when using IAM roles.
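To make the problem concrete, here is a simplified sketch (not the real adapter code) contrasting the per-file client pattern with a shared client. The `getClient` stub below is a hypothetical stand-in for the adapter's client factory; the real one builds an aws-sdk S3 instance, which with IAM roles may trigger a fresh credential fetch, and the real `putObject` is asynchronous (the stub is synchronous for brevity).

```javascript
// Counts how many times a "client" is constructed.
let clientsCreated = 0;

// Hypothetical stand-in for the adapter's getClient(): the real one
// instantiates a new aws-sdk S3 client on every call.
const getClient = () => {
  clientsCreated += 1;
  return { putObject: (params) => ({ key: params.Key }) };
};

// Per-file client (the reported behaviour): one client per uploaded file.
const putDirPerFileClient = (files) =>
  files.map((f) => getClient().putObject({ Key: f }));

// Shared client (the proposed fix): one client for the whole folder.
const putDirSharedClient = (files) => {
  const s3Client = getClient();
  return files.map((f) => s3Client.putObject({ Key: f }));
};

const files = Array.from({ length: 300 }, (_, i) => `file-${i}`);

clientsCreated = 0;
putDirPerFileClient(files);
console.log(clientsCreated); // 300 — one client per file

clientsCreated = 0;
putDirSharedClient(files);
console.log(clientsCreated); // 1 — a single client for all files
```

With 300+ files published concurrently via Promise.all, the per-file variant means 300+ client constructions (and potentially 300+ credential lookups) in a burst, which is where the suspected token limit bites.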
Proposed fix
- Use a single s3Client when publishing/uploading a folder
- Please find below a patched version of the proposed solution
@@ -130,22 +138,26 @@
const paths = await getPaths(dirInput);
const packageJsonFile = path_1.default.join(dirInput, 'package.json');
const files = paths.files.filter(file => file !== packageJsonFile);
+ const s3Client = getClient();
const filesResults = await Promise.all(files.map((file) => {
const relativeFile = file.slice(dirInput.length);
const url = (dirOutput + relativeFile).replace(/\\/g, '/');
const serverPattern = /(\\|\/)server\.js/;
const dotFilePattern = /(\\|\/)\..+/;
const privateFilePatterns = [serverPattern, dotFilePattern];
- return putFile(file, url, privateFilePatterns.some(r => r.test(relativeFile)));
+ return putFile(file, url, privateFilePatterns.some(r => r.test(relativeFile)), s3Client);
}));
// Ensuring package.json is uploaded last so we can verify that a component
// was properly uploaded by checking if package.json exists
- const packageJsonFileResult = await putFile(packageJsonFile, `${dirOutput}/package.json`.replace(/\\/g, '/'), false);
+ const packageJsonFileResult = await putFile(packageJsonFile, `${dirOutput}/package.json`.replace(/\\/g, '/'), false, s3Client);
+ console.log("Uploaded package ", dirInput);
return [...filesResults, packageJsonFileResult];
};
- const putFileContent = async (fileContent, fileName, isPrivate) => {
+ const putFileContent = async (fileContent, fileName, isPrivate, s3Client) => {
const fileInfo = (0, oc_storage_adapters_utils_1.getFileInfo)(fileName);
- return getClient().putObject({
+ console.log('Uploading, ', fileName);
+ let s3ClientLocal = s3Client ? s3Client : getClient();
+ return s3ClientLocal.putObject({
Bucket: bucket,
Key: fileName,
Body: fileContent,
@@ -156,9 +168,9 @@
Expires: (0, oc_storage_adapters_utils_1.getNextYear)()
});
};
- const putFile = (filePath, fileName, isPrivate) => {
+ const putFile = (filePath, fileName, isPrivate, s3Client) => {
const stream = fs_extra_1.default.createReadStream(filePath);
- return putFileContent(stream, fileName, isPrivate);
+ return putFileContent(stream, fileName, isPrivate, s3Client);
};
return {
getFile,
Makes sense to me. We should probably remove this "new client for every request" logic on all storage adapters, and just have a single one at the module scope.
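The module-scope approach suggested above could look like this sketch. All names here are hypothetical (`createS3Client` stands in for `new AWS.S3(options)`); it only illustrates the lazy singleton pattern, not the actual adapter wiring:

```javascript
// Hypothetical stand-in for constructing a real aws-sdk S3 client.
const createS3Client = (options) => ({
  putObject: (params) => ({ bucket: options.bucket, key: params.Key }),
});

// Module-level singleton: created once on first use, then reused by
// every putFile/putDir call instead of building a client per request.
let s3Client = null;
const getClient = (options) => {
  if (!s3Client) {
    s3Client = createS3Client(options);
  }
  return s3Client;
};

// Every caller now shares the same instance.
const a = getClient({ bucket: 'my-bucket' });
const b = getClient({ bucket: 'my-bucket' });
console.log(a === b); // true — one client at module scope
```

One caveat with a module-level singleton: if the adapter is ever instantiated twice with different options, the second set would be silently ignored, so tying the client to the adapter instance (as the patch above does per putDir call) is a safer scope.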
Hi @ricardo-devis-agullo, a PR has been raised and is up for review
Hey @ricardo-devis-agullo, any updates on this?
That said, the latest storage adapters also need to be released
Adapters PR merged and published. Upgraded the S3 package in oc 0.49.54. Thanks!