
Sync / Clear bucket before uploading?

Open · andrewmartin opened this issue 7 years ago • 17 comments

I have a really quick question; is there an option to sync the assets up, e.g. clean the S3 bucket before I upload? Just curious how you might handle asset revving to clear out the old assets to keep the bucket clean. Sorry if I missed something obvious, thanks again for your wonderful work on this. Using it in production with great success!

Cheers.

Please complete these steps and check these boxes (by putting an x inside the brackets) before filing your issue:

  • [ x ] I have read and understood this plugin's README
  • [ x ] If filing a bug report, I have included my version of node and s3-plugin-webpack
  • [ x ] If filing a bug report, I have included which OS (including specific OS version) I am using.
  • [ x ] If filing a bug report, I have included a minimal test case that reproduces my issue.
  • [ x ] I understand this is an open-source project staffed by someone with a job and that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
  • [ x ] I understand my issue may be closed if it becomes obvious I didn't actually perform all of these steps or the issue is not with the library itself

Thank you for adhering to this process! This ensures that I can pay attention to issues that are relevant and answer questions faster.

andrewmartin avatar Jun 17 '17 05:06 andrewmartin

Honestly, when releasing for prod you usually want to use a hash on the end of your filenames to cache-bust. I'm fairly certain this will overwrite the old files, though.

MikaAK avatar Jun 17 '17 07:06 MikaAK
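
For context, the hash-based filenames suggested above are typically configured through webpack's output options. A minimal sketch, with an illustrative entry point and output path:

// webpack.config.js — emit hashed bundle names so each release writes new,
// uniquely named keys to S3 and stale browser/CDN caches are bypassed.
module.exports = {
  entry: './src/index.js',
  output: {
    path: __dirname + '/dist',
    filename: '[name].[chunkhash].js' // e.g. main.3f9a1c8b.js
  }
};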

Right, so I am actually using hashing to invalidate caches… Actually, that’s the issue. Since I’m generating unique filenames each time, it’s basically just adding more and more files to my S3 bucket. I can of course go through and manually clear them out from time to time and just rebuild, but just curious if there was an option to make it more automated.

andrewmartin avatar Jun 17 '17 08:06 andrewmartin

Sadly no, though I would totally accept a PR with one, since I could see this being useful! I can of course add it later on too.

MikaAK avatar Jun 17 '17 18:06 MikaAK

Cool! I'll look into it. I know someone has a sync plugin out there.

andrewmartin avatar Jun 17 '17 21:06 andrewmartin

@andrewmartin Out of curiosity, did you ever come up with a solution? Dealing with this exact same issue.

wootencl avatar Nov 17 '17 17:11 wootencl

@wootencl I got sidetracked on it and ended up leaving it for now. Would love to hear if you / anyone came up with a solution...

andrewmartin avatar Nov 17 '17 18:11 andrewmartin

Does anyone have a solution for this issue?

gertjanwytynck avatar Feb 21 '18 18:02 gertjanwytynck

Nothing yet, PRs welcome!

MikaAK avatar Aug 20 '18 00:08 MikaAK

A napkin note to whoever decides to roll up sleeves for this: in order to achieve zero downtime, deletion of the old/obsolete S3 objects would need to happen after the upload of the new ones, not before.

Birowsky avatar Oct 30 '18 10:10 Birowsky
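
To make that ordering concrete, here is a minimal sketch of an after-upload cleanup using the AWS SDK for JavaScript: upload the new build, record the keys that were written, then delete every object under the prefix that is not in that set. The bucket, prefix, region, and uploadedKeys inputs are placeholders and not part of this plugin.

// cleanup.js — delete objects that are not part of the current build.
// Run only *after* the new assets have finished uploading (zero downtime).
const AWS = require('aws-sdk');
const s3 = new AWS.S3({ region: 'us-east-1' }); // region is an assumption

async function deleteStaleObjects(bucket, prefix, uploadedKeys) {
  const keep = new Set(uploadedKeys);

  // NOTE: listObjectsV2 returns at most 1000 keys per call; paginate with
  // ContinuationToken for larger buckets.
  const listed = await s3.listObjectsV2({ Bucket: bucket, Prefix: prefix }).promise();

  const stale = (listed.Contents || [])
    .map(obj => obj.Key)
    .filter(key => !keep.has(key));

  if (stale.length === 0) return;

  await s3.deleteObjects({
    Bucket: bucket,
    Delete: { Objects: stale.map(Key => ({ Key })) }
  }).promise();
}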

Made an attempt on this here and it seems to work just fine.

A napkin note to whoever decides to roll up sleeves for this: in order to achieve zero downtime, deletion of the old/obsolete S3 objects would need to happen after the upload of the new ones, not before.

Spot on. We are using Circle CI, so this is dockerized and just runs after the deployment completes, ensuring no downtime. Let me know if you guys are interested and I can move this code inside the plugin behind a config option (defaulting to false):

s3Options: {
   cleanAfterDeployment: true
}

dejanvasic85 avatar Dec 12 '18 23:12 dejanvasic85
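
For reference, this is roughly where such a flag would sit in a webpack config using this plugin. Note that cleanAfterDeployment is only the option proposed above and does not exist in the published plugin, and the bucket name and credentials below are placeholders.

// webpack.config.js — hypothetical usage of the proposed option.
const S3Plugin = require('webpack-s3-plugin');

module.exports = {
  // ...entry/output as usual...
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'us-east-1',
        cleanAfterDeployment: true // proposed flag (default false); not implemented
      },
      s3UploadOptions: { Bucket: 'my-bucket' } // bucket name is a placeholder
    })
  ]
};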

@dejanvasic85 I would looooove to have this feature in the plugin!

Birowsky avatar Dec 12 '18 23:12 Birowsky

@dejanvasic85 A PR is welcome!

MikaAK avatar Dec 13 '18 06:12 MikaAK

2 years later, still no PR?

Turbotailz avatar Dec 07 '20 04:12 Turbotailz

Any news on this?

coopersamuel avatar Dec 09 '20 01:12 coopersamuel

This would be nice!

bfagundez avatar Dec 09 '20 01:12 bfagundez

2 years later, still no PR?

Maybe a good opportunity to do some OSS work 😉

MikaAK avatar Dec 15 '20 23:12 MikaAK

Hey there, I came across needing to use this tool again. I've got a branch that caters for a sync option, but when I push the branch I get access denied @MikaAK.

dejanvasic85 avatar Apr 02 '22 08:04 dejanvasic85