
Questions about the IMG file that is generated

Open klturi421 opened this issue 7 years ago • 42 comments

I am currently decrypting the drive and generating a .img file. I know the file will grow as it goes, but will it end up the same size as my drive, or only the size of the data actually on it? I ask because I am attempting to decrypt a 3 TB drive but do not have another 3 TB or larger drive to copy the data to, at least at the moment.

Also, once the .img file has been completed, are there any particular instructions on how to mount the file and view the contents?

klturi421 avatar Jun 24 '18 19:06 klturi421
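
On the mounting question: a standard approach on Linux, independent of reallymine, is to attach the decrypted image to a loop device. A minimal sketch, assuming the image contains an ordinary partition table (paths are examples):

    # attach the image; -P scans it for partitions, --show prints the device
    sudo losetup -fP --show /path/to/decrypted.img   # e.g. /dev/loop0
    # mount the first partition read-only and browse it
    sudo mount -o ro /dev/loop0p1 /mnt
    # detach when done
    sudo umount /mnt && sudo losetup -d /dev/loop0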

It will be the size of the drive.

What OS are you using?

andlabs avatar Jun 24 '18 20:06 andlabs

I’m running Ubuntu 18.04 LTS.

Edited to reflect Ubuntu instead of Linux.

klturi421 avatar Jun 24 '18 20:06 klturi421

He means Ubuntu.

themaddoctor avatar Jun 24 '18 22:06 themaddoctor

@andlabs - due to my currently limited disk space, is there a way to split the img and go through the pieces individually, or does it need to be all in one file?

klturi421 avatar Jun 24 '18 22:06 klturi421

If reallymine supports output to stdout (I don't know if it does), then you could pipe it into something like "split -b 100000000" to make 100MB chunks. But if you break up the image, you will have to put it back together again to mount it. The dmsetup utility can map all the pieces into a virtual complete drive. You'll have to read the manual if you go that route, because I am no expert. Or later, when you have a new disk, you could concatenate the pieces onto the new drive.

themaddoctor avatar Jun 26 '18 17:06 themaddoctor
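
A minimal sketch of that pipeline, assuming reallymine can write the decrypted image to stdout (which, per the comment above, is unconfirmed; check its usage output first):

    # hypothetical: "-" standing for stdout is an assumption, not documented
    sudo ./reallymine decrypt /dev/sda - | split -b 100000000 - decrypted.img.
    # the pieces sort lexically (decrypted.img.aa, .ab, ...), so later,
    # on a large enough drive, they can be reassembled in order with cat:
    cat decrypted.img.* > /media/bigdrive/decrypted.img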

I decided to go ahead and get a larger (4 TB) drive to copy the image to. I began decrypting the original drive and noticed that the size of the img stopped growing around 360 GB. Is it normal for the decryption to stop or slow down at a certain point?

klturi421 avatar Jun 26 '18 18:06 klturi421

I don't think so.

themaddoctor avatar Jun 26 '18 18:06 themaddoctor

I'm currently attempting to decrypt with the binary from issue #38. Is it stable enough for regular use or should I stick to the release from 2016?

klturi421 avatar Jun 26 '18 18:06 klturi421

I don't know. Ask @andlabs

themaddoctor avatar Jun 26 '18 18:06 themaddoctor

You've probably found a bug in it if it's not going past 360 GB. What do the first 4096 bytes look like?

andlabs avatar Jun 26 '18 20:06 andlabs

@andlabs forgive my lack of understanding; as for the first 4096 bytes, are those found by running the dumpfirst command?

klturi421 avatar Jun 26 '18 21:06 klturi421

Yes. Send the output to xxd and go up to 00001000.

andlabs avatar Jun 26 '18 21:06 andlabs

I'm sorry to sound like an idiot, but what would the command be? I've found a few examples that use xxd, but none that match this particular request.

Would something like this work: xxd -p -c 16 kb0.bin > kb0.hex, but with kb0.bin and kb0.hex replaced by outfile.bin and outfile.hex?

klturi421 avatar Jun 26 '18 22:06 klturi421

Yes, but without the -p and -c 16. Then you can open outfile.hex as a text file and copy the beginning here, up to the line that begins with 00001000.

andlabs avatar Jun 26 '18 23:06 andlabs
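
Putting the two commands together, a minimal sketch; the dumpfirst argument order is an assumption (run reallymine with no arguments to check its usage):

    # dump the first sectors of the drive to a file (hypothetical syntax)
    sudo ./reallymine dumpfirst /dev/sda outfile.bin
    # hex-dump it; xxd prints 16 bytes per line, so 4096 bytes is 256 lines
    xxd outfile.bin > outfile.hex
    # the first 257 lines run up to and including the one starting 00001000
    head -n 257 outfile.hex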

After running dumpfirst and xxd, here is what I came up with, up until the line 00001000. Something tells me that from what I'm seeing so far things aren't looking too great, but I may be wrong; I hope I am.

Instead of pasting it, I've uploaded the hex file, given the length it would have added to this comment. I saved it as .txt since the upload does not permit .hex.

outfile.txt

Also, as of this writing: I started running the previous release (2016) at around 2:30 PM, and as of 12:30 AM it has only decrypted roughly 19 GB. The updated version that runs more quickly had completed around 360.3 GB in about an hour. Is it expected to take quite a long time to decrypt the full 3 TB on the previous release?

klturi421 avatar Jun 27 '18 05:06 klturi421

Hm, I'm not noticing any obvious bugs in reallymine there, other than what appears to be some undecrypted zeroes around 4400 or so (but those could be correct)... Will definitely have to investigate further when I can.

And yes, the old binary will be slow :( Sorry

andlabs avatar Jun 27 '18 12:06 andlabs

I re-ran the concurrent binary all last night and woke up to error running decrypt: read /dev/sda: input/output error, with the file size again at 360.3 GB on the nose.

Even though it keeps stopping at 360.3 GB out of 3 TB, is it still possible to attempt mounting the file to see whether any files can be pulled off?

klturi421 avatar Jun 27 '18 17:06 klturi421

Sounds like a bad block on the drive. Try github.com/themaddoctor/linux-mybook-tools; there is a PDF there with instructions for doing it in Linux.

And if you have the JMS538S chip, I would like a copy of your keyblock, please.

@andlabs Not trying to steal your guy, but he asked.

themaddoctor avatar Jun 27 '18 18:06 themaddoctor
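
For grabbing a copy of the keyblock with standard tools: since the key sector sits in the last few MB of the drive (as noted later in this thread), one generic, hypothetical way is to dd the tail of the disk (device name is an example):

    # drive size in bytes, then in MiB
    SIZE=$(sudo blockdev --getsize64 /dev/sda)
    MB=$(( SIZE / 1048576 ))
    # copy the final 4 MiB, which should contain the key sector
    sudo dd if=/dev/sda of=keyblock.bin bs=1M skip=$(( MB - 4 ))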

Not a problem. I was going to suggest running badblocks to see if the drive actually was damaged or not. If it is, you're better off using GNU ddrescue before reallymine.

andlabs avatar Jun 27 '18 19:06 andlabs
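
A sketch of that check and, if needed, the rescue copy (device and paths are examples):

    # read-only scan; any sector numbers printed indicate damage;
    # -b 4096 keeps the block counter from overflowing on large drives
    sudo badblocks -v -b 4096 /dev/sda
    # if there are errors, image the drive with GNU ddrescue first;
    # the map file lets the copy be interrupted and resumed
    sudo ddrescue -d /dev/sda /media/backup/Backup.img /media/backup/Backup.map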

I did not previously have a backup of the disk, so I went ahead and ran ddrescue on the drive to create one. With that in mind, I'm guessing I will need a second drive of 3 TB or larger to decrypt the data to.

I'm at work at the moment but I will run badblocks on it tonight and upload the results then.

@themaddoctor What's interesting is that I started my journey of decrypting this drive with your guide, but ran into a few issues, which is how I ended up on @andlabs's tutorial. The problem I believe I am having is that once I get to the mounting section of your guide, I get an error (going off memory at the moment) along the lines of mount: wrong fs type, bad superblock. I've been able to follow your tutorial, and I have the Symwave chip (non-XTS). Would you prefer I start an issue thread on your repo as well?

Also, so you are both aware: I am fully comfortable with the fact that the data on the drive may be lost forever. I acquired it from my father, who had a WD My Book whose USB board stopped working. It was connected to a Windows computer afterwards, and a quick format was performed before I got to the drive. At this point, I am merely exploring these options to see what, if anything, can be recovered.

klturi421 avatar Jun 27 '18 22:06 klturi421

Well, if it was formatted, even quickly, that explains why you can't mount it. You have to decrypt it and then do data recovery on the decrypted image. Good luck.

themaddoctor avatar Jun 27 '18 23:06 themaddoctor

Once I've got a decrypted image, is there any software that is recommended to use to attempt recovery (Windows or Ubuntu)?

klturi421 avatar Jun 27 '18 23:06 klturi421

Understood. I was just curious and looking to explore every option I can.

I'm wondering: running the non-concurrent binary of reallymine, what is the average time it takes to decrypt? I'm guessing quite a long time? I've been running it for about 30 hours and have only decrypted 61 GB of 3 TB. At the current rate, at least from what I've been able to calculate, it will take ~75 days (~40 GB per 24 hours into 3000 GB ≈ 75 days). Is that right?

klturi421 avatar Jun 30 '18 16:06 klturi421

@andlabs I have a ddrescue .img of the drive and am now going to attempt to decrypt that file, but I'm a little lost on the commands. From what I have read so far about decrypting a file, I will have to use decryptfile while also supplying the DEK and steps. Is there a way to use the concurrent binary to decrypt the file?

klturi421 avatar Jul 04 '18 21:07 klturi421

You can use the standard decrypt command to decrypt those images too.

andlabs avatar Jul 04 '18 21:07 andlabs

When I attempt to use the standard decrypt command, the file size is 0 and stays at 0, but when I run decryptfile, the file size begins to increase. Could I be doing something wrong?

I run the command as follows: sudo ~/reallymine decrypt /media/klturi421/Backup/Backup.img /media/klturi421/Backup/Decrypted.img. I have tried both the concurrent and non-concurrent releases, but neither will increase the file size, and no errors show in the terminal.

Due to disk space issues on my Backup drive, only 2.7 of 3 TB could be copied over. Could this be the reason the decrypt option isn't working on the img?

klturi421 avatar Jul 04 '18 22:07 klturi421

No; the decrypt command is trying to find the key sector. If you have the key manually, you can use decryptfile, but it isn't concurrent yet.

(I should probably write all this from scratch again...)

andlabs avatar Jul 05 '18 04:07 andlabs

Could it be possible that the key sector was not copied over in that last 0.3 TB?

klturi421 avatar Jul 05 '18 04:07 klturi421

Yes, the key sector is in the last few MB.

Get the key sector from the original disk.

themaddoctor avatar Jul 05 '18 05:07 themaddoctor
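
A hypothetical way to "complete" the truncated image with standard tools, copying the drive's last stretch into the same offset of the image (the 16 MiB tail size and paths are examples):

    # drive size in bytes, then in MiB
    SIZE=$(sudo blockdev --getsize64 /dev/sda)
    MB=$(( SIZE / 1048576 ))
    # write the last 16 MiB of the drive into the image at the same offset;
    # conv=notrunc preserves the 2.7 TB already copied
    sudo dd if=/dev/sda of=/media/klturi421/Backup/Backup.img bs=1M \
        skip=$(( MB - 16 )) seek=$(( MB - 16 )) conv=notrunc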

I was able to "complete" the image and am now able to use the decrypt command. After about 10 minutes of running I am at about 20 GB. I expect it should be finished decrypting around 11:30 AM tomorrow.

klturi421 avatar Jul 05 '18 20:07 klturi421