
Limited to 100TB storage

Open zenjabba opened this issue 9 years ago • 13 comments

acd_cli reports 100TB of storage to FUSE, so once you go above that amount, it reports "out of space".

Can we report 1PB of free space?

zenjabba avatar Aug 02 '16 21:08 zenjabba
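[Editor's note: a minimal sketch of the kind of statvfs-style numbers a FUSE layer returns, and how advertising 1PB as requested above would look. Field names follow the convention fusepy's `Operations.statfs` expects; the `fake_statfs` function and the 1PB figure are illustrative, not acd_cli's actual code.]

```python
# Sketch: statvfs-style fields a FUSE filesystem could return to
# advertise 1 PB of total space instead of 100 TB. Hypothetical
# helper; acd_cli's real statfs implementation may differ.

def fake_statfs(used_bytes, advertised_bytes=10**15, block_size=4096):
    """Return statvfs-style fields advertising `advertised_bytes` total."""
    total_blocks = advertised_bytes // block_size
    used_blocks = used_bytes // block_size
    free_blocks = max(total_blocks - used_blocks, 0)  # clamp, never negative
    return {
        "f_bsize": block_size,    # filesystem block size
        "f_blocks": total_blocks, # total data blocks
        "f_bfree": free_blocks,   # free blocks
        "f_bavail": free_blocks,  # free blocks for unprivileged users
    }

# e.g. with 100 TiB already stored, a 1 PB total still leaves free space
print(fake_statfs(used_bytes=100 * 2**40))
```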

Amazon will ask you questions before you reach that, I would say... It's unlimited (with a fair usage policy). Are you close to that amount of used space already? Just curious whether Amazon has told you anything.

charlymr avatar Aug 03 '16 10:08 charlymr

Yes, I am close to that amount, and Amazon has not questioned the amount stored in my two accounts.


zenjabba avatar Aug 03 '16 11:08 zenjabba

Fairplay 👍

charlymr avatar Aug 03 '16 11:08 charlymr

I think the 100TB figure is quota data coming from Amazon's endpoint: if you look in .cache/endpoint_cache, it lists the 100TB max size.

I think this might be Unlimited* on ACD's part

cyberbalsa avatar Aug 04 '16 20:08 cyberbalsa
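[Editor's note: a hypothetical illustration of inspecting a cached quota value like the one described above. The actual on-disk format of acd_cli's .cache/endpoint_cache is not documented in this thread; the JSON shape and the "quota"/"limit" field names below are assumptions for the sketch.]

```python
import json
import os
import tempfile

# Assumed shape: a JSON blob with a quota limit in bytes (100 TiB here).
# The real endpoint_cache format may be entirely different.
sample = {"quota": {"limit": 100 * 2**40}}

path = os.path.join(tempfile.mkdtemp(), "endpoint_cache")
with open(path, "w") as f:
    json.dump(sample, f)

with open(path) as f:
    data = json.load(f)

limit_tib = data["quota"]["limit"] / 2**40
print(f"advertised quota: {limit_tib:.0f} TiB")
```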

I've noticed this as well. Could you please comment here again @zenjabba if you get past 100TB with the result?

asabla avatar Aug 05 '16 00:08 asabla

So I've looked at other systems, and they also report 100TB as the available storage. NetDrive, for example, reports "100TB of 100TB Free" (screenshot attached), so I don't feel it's a limitation within Amazon.

zenjabba avatar Aug 06 '16 00:08 zenjabba

It appears to be a 100TB soft cap, where you can call Amazon to allow for more space (probably have to give them a good reason given that 100TB is quite a bit of storage). Not certain though since I can't seem to find anyone hitting the 100TB cap. Keep us in the loop!

https://github.com/dularion/streama/issues/63

bryan avatar Aug 09 '16 07:08 bryan

If it's any help, S3QL (another cloud filesystem) reports double the usage with a minimum of 1TB, so you never exceed 50% of the reported space.

roaima avatar Sep 24 '16 15:09 roaima
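[Editor's note: the S3QL trick described above is simple arithmetic; a minimal sketch, with a hypothetical helper name:]

```python
# Report double the current usage, with a 1 TiB floor, so reported
# usage never exceeds 50% of the reported total (the S3QL approach
# described in the comment above).

def padded_total(used_bytes, floor=2**40):
    """Total size to report so that `used_bytes` is at most half of it."""
    return max(2 * used_bytes, floor)

for used in (0, 2**39, 100 * 2**40):  # 0 B, 512 GiB, 100 TiB
    total = padded_total(used)
    print(f"used {used} of reported {total} ({used / total:.0%})")
```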

The problem is that S3QL doesn't support ACD, so it's apples and oranges (at least it didn't last time I looked).

zenjabba avatar Sep 24 '16 15:09 zenjabba

Does the reported disk size really have any practical implications?

yadayada avatar Sep 24 '16 16:09 yadayada

Yes: when you try to upload a file and the disk reports 100% usage, you get an error.


zenjabba avatar Sep 24 '16 17:09 zenjabba

@zenjabba did you try uploading using rclone? I'm just curious to see if it presents the same limitation. I'm currently using acd_cli to mount my ACD and rclone to upload to it.

(Sorry @yadayada for plugging the competition; it's just that I'm more familiar with the rsync syntax -- although you have some interesting options in your upload command that I still ought to try.)

tristaoeast avatar Oct 22 '16 11:10 tristaoeast

So I finally got a chance to get back to this.

ACDFuse 107374182400 -9444732965617890843136 -14025401856 100% /mnt/amazon

is what is reported when it's over 100TB. Can we please just get it to report 1PB of storage so it will never run out?

Thanks

zenjabba avatar Mar 11 '17 01:03 zenjabba
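[Editor's note: the negative numbers in the df output above are what you'd expect once real usage passes a fixed advertised cap. A minimal sketch of the arithmetic, with hypothetical figures; the exact values in the df line depend on block-size conversions and are not reproduced here:]

```python
# With a fixed 100 TiB advertised total, once real usage exceeds the
# cap the computed free space goes negative. Packing that negative
# count into the unsigned 64-bit fields statvfs expects then wraps
# around to a huge bogus value, which df renders as garbage.

TOTAL = 100 * 2**40   # advertised cap: 100 TiB
used = 110 * 2**40    # hypothetical real usage: 110 TiB

free = TOTAL - used
print("free bytes:", free)                   # negative
print("as unsigned 64-bit:", free % 2**64)   # wrapped to a huge value
```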