zip2john, rar2john don't exit non-0 when given incompatible files
So I was hoping that a .ace archive might work with one of the existing archive2johns.
Neither zip2john nor rar2john outputs anything when given an ace archive file... and both return a 0 exit status.
Seems like this will make error conditions difficult to detect when scripting, as we have to check for the absence of stdout rather than for output on stderr or a non-zero exit code.
Related...
Wondering about the feasibility of contributing an ace2john for ace archives.
I haven't dug into the .ace format yet, but there are at least a couple open source ACE implementations that cover the encryption bit.
- https://github.com/droe/acefile
- https://packages.debian.org/stable/unace
I'd call this a "bug", but these programs are written such that they don't propagate many (perhaps most) potential errors - so let it be a non-trivial "enhancement" to implement that. ;-)
I've just tried running them on various other file types, and I do get some error messages on stderr. So the main problem is the zero exit code. @mdp1 If you're able to trigger no output at all, then we might need to know more about what you're doing - a sample file, maybe (if you have a small one without any private info in it)?
Wondering about the feasibility of contributing an ace2john for ace archives.
That would need to be not only ace2john, but also support in john proper. Please feel free to contribute that, although I don't recall any demand for it.
Thanks for the reply. I can reproduce this with a simple text file... so maybe it is something specific to the build I have.
I installed/built this with homebrew.
$ john --help | head -n1
John the Ripper 1.9.0-jumbo-1 [darwin18.6.0 64-bit x86_64 SSE4.2 AC]
$ echo 'Hello world!' > testfile
$ zip2john testfile
$ echo $?
0
Regarding the contrib, I also think ace is pretty much abandoned...but some people (like me) might have old archives from previous decades they have forgotten the passwords for. We will see if I get around to digging into the implementation. These old archives are low priority at the moment.
Many changes were introduced after jumbo 1.
The zero exit code is still undesirable, but the new zip2john gives us a good hint:
$ john-the-ripper.zip2john testfile; echo $?
Did not find End Of Central Directory.
0
Many changes were introduced after jumbo 1.
Nice to hear there are already some improvements on the stderr front.
It looks like 1.9.0-jumbo-1 is from May 2019.
Will there be a new release at some point or should I just build the bleeding-jumbo@HEAD?
Sorry if I missed this somewhere. The git repo readme still references an INSTALL file which doesn't appear to be there anymore.
Will there be a new release at some point or should I just build the bleeding-jumbo@HEAD?
Both.
The git repo readme still references an INSTALL file which doesn't appear to be there anymore.
It's under doc/ like it always was. Maybe we need to edit README.md to make that explicit.
In part this is an issue of defining how most/all 2john tools should behave (dos, don'ts, optionals). Some might like to run rar2john some/directory/* and get the output for rar archives while simply ignoring other files. One could argue that you should simply use rar2john some/directory/*.rar some/directory/*.RAR instead.
I'm not aware of any 2john tool that exits with error if nothing was found but I'm probably wrong (some of them probably do).
Good point, @magnumripper. It's in fact non-obvious what to do when only some of the files were processed correctly. I suggest that we do exit non-zero in such cases. If we really want, we can choose two different non-zero exit codes for complete vs. partial failure, but that's probably excessive.
BTW, most of the other 2john tools are very quiet in terms of diagnostics/progress messages to stderr, unless -v is given. We might want to limit rar2john's stderr output the same way.
So what would be a sensible exit code for "some files succeeded"? The only one listed in man sysexits that I think would be fairly good is 65, "The input data was incorrect in some way". Or maybe we could simply use 3 (or even 2, but that's listed in several places as "misuse of shell builtins").
On a related note I've long wanted to exit john with something other than 1 (yet not 0) for things like --max-run=N when session aborts after N seconds, and similarly for some other options such as --max-candidates=N.
Of course I'm new to the party, so my opinion shouldn't bear much weight. Nevertheless a few thoughts...
When someone runs rar2john some/directory/*, the expansion is done by the shell; wildcards are not a feature of rar2john.
i.e. rar2john some/directory/* is converted into:
rar2john some/directory/file01 some/directory/file02 some/directory/file03.rar some/directory/file04.jpg some/directory/file05.zip
So rar2john is being fed an explicit list of files (thanks to shell expansion), and that list includes non-rar files.
From a CLI perspective, rar2john doesn't know whether I called it with shell expansion or with an explicit list of files from the user (or a script).
I would expect that feeding rar2john input it is not expecting or cannot handle would be an error condition (specific problems listed on stderr, plus a non-zero exit). If the user's expectation is that the tool will produce output for each input file given... and the tool cannot do that for some of the inputs... that is something to know, right?
If you had a recursive flag (e.g. like grep -R) as part of the 2john CLI API... then you are going beyond shell expansion (rar2john -R some/directory) and need to define the behaviour in the recursive case. Are only files with a certain extension processed? Are all files processed and errors reported? Are all files processed and errors ignored? Something else? Maybe additional flags are needed to customize whatever the default recursive behaviour is.
In the normal, non-recursive case I think it would be preferable to add an --ignore-errors flag if people really want this behaviour, rather than having the default be to silently ignore errors when explicitly fed invalid input files.
stderr output is good for user experience, so a user can troubleshoot what went wrong ("file01 does not appear to be a rar file"). For scripting, however, a non-zero exit is pretty important.
But maybe this comes down to whether you consider rar2john failing to recognize an input file as rar data to be an error condition the user might care about. I can imagine a user having a zip file misnamed as .rar, running rar2john against it in a script, and then having to troubleshoot what is not working and why without any error output or non-zero exits.
In terms of a sensible exit code (@magnumripper posted while I was typing): I'm not sure whether it makes sense to differentiate between all success, some success, and no success. If a user is running this manually, they probably care more about ~stdout~/err than the exit code. In a script, the complexity of handling the "some" case is probably better handled by running *2john on a single file at a time and observing the exit code as success/fail, so the logic for failure/retry/reporting can live outside *2john at the scripting layer. As for the right code for "not 100% success"... I don't have an opinion.
Edit: I forgot that 2john outputs its result(s) to stdout, so all progress/normal logging has to go to stderr. I would also differentiate between stderr progress/debug verbosity and actual error reports due to bad input... I would expect to see errors on stderr without having to use a verbose flag (otherwise you will get lots of "why isn't this working?" queries from people who don't know they need the verbose flag to see errors). But there could be a --silent flag for scripting purposes to shut off all output including errors (a la curl).
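The one-file-at-a-time approach can be sketched like this, assuming the tools are fixed to exit non-zero on unusable input (directory and filenames hypothetical):

```shell
#!/bin/sh
# Invoke rar2john once per file so every failure is attributable
# to a specific input, rather than to the batch as a whole.
for f in some/directory/*; do
    if rar2john "$f" >>all.hashes 2>>errors.log; then
        echo "ok: $f"
    else
        echo "failed: $f" >&2   # retry/report logic would go here
    fi
done
```

This sidesteps the "partial success" exit-code question entirely: each invocation is plain success or failure.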
Yeah, maybe we should just have 0 for "all success" and 1 for "something's amiss" and be done with it. But I'll sleep on it (and any script doing things like rar2john ... && echo AOK || echo Problem would work just the same even if we added a third exit code).
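Right - a plain success/failure check keeps working under any scheme where 0 means "all success", and a script that cares can still branch on the extra code. A sketch, with 2 as a purely hypothetical "partial failure" code:

```shell
#!/bin/sh
# Plain checks keep working whether there are two exit codes or three:
rar2john some/directory/* && echo AOK || echo Problem
# A script that cares about partial failure can branch on the code
# (the value 2 for "partial" is a hypothetical choice here):
rar2john some/directory/*
case $? in
    0) echo "all files processed" ;;
    2) echo "some files failed" >&2 ;;
    *) echo "no files processed" >&2 ;;
esac
```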
I agree errors/warnings should always go to stderr - the -v would turn on further diagnostics such as "Found encrypted file this or that, packed len x, unpacked y" and so on.