Gabriel Burca
BTW, I already added a list_parts() method to GlacierVault as part of an earlier commit. It takes the uploadID (aka multipart_id) as its argument. See: https://github.com/uskudnik/amazon-glacier-cmd-interface/blob/21f5005da01a8eb495878fabccc8ed647c92b927/glacier/glaciercorecalls.py#L147
> currently upload sends parts sequentially ... So missing intermediate parts shouldn't happen. That's assuming the user used this tool to create the failed upload.
Strictly speaking, all you really need in order to resume an upload is the original archive/data and the uploadID. Agreed? SimpleDB would only be needed to support higher-level, user-friendly features....
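To make the point concrete, here's a minimal sketch of the resume logic. Everything here is a hypothetical illustration, not code from the tool: given the archive size, the part size, and the byte ranges already received by AWS (which you can recover from list_parts() / the ListParts API using only the uploadID), you can compute exactly which parts still need to be sent. No local bookkeeping required.

```python
def missing_ranges(archive_size, part_size, uploaded):
    """Return the (start, end) inclusive byte ranges that still need
    to be uploaded. `uploaded` is the set of ranges AWS already has,
    e.g. parsed from the 'RangeInBytes' fields returned by ListParts."""
    done = set(uploaded)
    todo = []
    for start in range(0, archive_size, part_size):
        end = min(start + part_size, archive_size) - 1
        if (start, end) not in done:
            todo.append((start, end))
    return todo

# Example: a 10-byte archive uploaded in 4-byte parts, where the
# first and last parts made it before the upload was interrupted.
print(missing_ranges(10, 4, [(0, 3), (8, 9)]))  # → [(4, 7)]
```

A resume would then just seek to each missing range in the original archive and re-send that part under the same uploadID.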
@wvmarle

> Sure, but you have to store the ID somewhere.

No, you don't. In fact, storing it locally (or in SimpleDB) leads to other issues. Let me repeat what I...
See my comment on issue #69 for why the pagination marker is no longer working...
> And I'm considering a function like taking a tree hash of files the user wants to upload, check it against the bookkeeping, and print a warning if this file...
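For reference, the hash Glacier uses is the SHA-256 tree hash over 1 MiB chunks, so the proposed bookkeeping check could compute it locally before uploading. A sketch (the bookkeeping lookup itself is omitted):

```python
import hashlib

CHUNK = 1024 * 1024  # Glacier tree hashes are built over 1 MiB chunks

def tree_hash(data):
    """SHA-256 tree hash as Glacier defines it: hash each 1 MiB chunk,
    then pairwise-combine digests level by level until one remains."""
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)] or [b""]
    level = [hashlib.sha256(c).digest() for c in chunks]
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level), 2):
            if i + 1 < len(level):
                nxt.append(hashlib.sha256(level[i] + level[i + 1]).digest())
            else:
                nxt.append(level[i])  # odd leftover is carried up unchanged
        level = nxt
    return level[0].hex()
```

For archives under 1 MiB this degenerates to a plain SHA-256, so it's cheap to sanity-check. The warning would then be a simple comparison of tree_hash(file_contents) against the hash recorded in the bookkeeping database.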
See: https://github.com/gburca/YouTransfer/blob/master/src/templates/message.html
I think this was fixed in ee7923fb and then further fixed in efa60d75. With that in mind, you should also check that the local storage path (see `settings.json`) is set...
I don't know what (if anything) can be done about that. The file type check is pretty minimal. Can you try changing the `*` in the regex to...
That would require a code change. There's currently no way to filter directories by the number of entries they contain.