Data Recovery Resources

Have a client who recently had their computer cleaned, and unfortunately it was cleaned with no regard for data. The drive is still with another tech and I intend to attempt recovery myself. I am limited to more basic options and am wanting some input and feedback on what may potentially result in success. Based on the client's description, I would say it sounds like their PC was infected and the last tech just did a nuke and pave. What tools and methods would you all recommend? I have not had to recover lost data in years, and this is a new client.
 
Is this a fishing expedition? As in, just seeing if you can get something? Make sure to discuss paid options with the customer. If you haven't looked around, data recovery prices are not all in the thousands of dollars; they might be happy paying $200-300. Either way, look at @lcoughey's link, as it has plenty of information.
 
Best Chance? Send to Luke @lcoughey

Do not boot to it, and do not let the client use it. As Luke said, it comes down to cost, but also skill. I'd get a bare-metal image of the drive immediately. If they have started using it, or have used it a lot, I'd wash your hands of it if they don't want to use a lab.

Cheap option: TestDisk/PhotoRec. It's okay. Awesome for cherry-picking file types, but it will leave a heck of a mess for a client, and it's not suitable for accounting data and the like where multiple files have to be intact, or for file types without clear header/footer markers.
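If you go that route, run PhotoRec against a copy or image rather than the original drive. A minimal sketch, assuming you already have an image called drive-copy.img and want the carved files dumped into a folder called recovered (both names are just examples):

photorec /log /d recovered drive-copy.img

It still drops you into the interactive menus from there (pick the image, the file system type, and whether to scan the whole disk or just free space), and the output is numbered folders full of generically named files, which is exactly the mess described above.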

The one I preferred: R-Studio. They have a T80+ subscription model now, instead of dropping $800. There is a LOT of skill involved in recovering via such a tool.
 
We use a place that's 45 minutes from us...right next door to us in Rhode Island

I see you're in TN... it's still not that far for shipping.
 
Since I don't have the drive, I can't confirm it is not an SSD, but based on the hardware and the client I would say it's a safe bet that it is a mechanical HDD. I would be looking for a fairly low investment, given how long it has been since I have needed to do anything like this. Right now I am researching so I can prepare and share any costs, which would be passed on to the client. We did discuss the possibility of having it sent off for a deeper recovery option, so I will look into things like what @YeOldeStonecat described, where it might be outsourced. I am also working on proposals to reduce this issue in the future, which is something I have done and continue to do, so as to avoid these problems before they turn costly.
 
If you can get your hands on the drive before shipping it out, you can make a full disk image and have that to play with, to compare your results against a third party's. If you are going to do that, you'll always want to keep an untouched image as well, so that means two images. @MudRock mentioned [url=https://www.r-studio.com/]R-Studio[/url], which is very popular. The good news is you can download and run it as a trial version, which allows pretty much everything but limits file recovery to something like 65 KB.
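For the imaging itself, GNU ddrescue is the usual free choice over plain dd because it keeps a mapfile of what it has read and can skip and retry bad areas instead of hammering them. A minimal sketch, assuming the patient shows up as /dev/sdb and the destination drive is mounted at /mnt/dest (check with lsblk first; both names are examples):

sudo ddrescue -d /dev/sdb /mnt/dest/patient.img /mnt/dest/patient.map

-d uses direct access to the source, and the .map file lets you stop and resume later. Once it finishes, copy patient.img to a second drive and treat one of the two as the untouched master.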
 
Right now I am researching so I can prepare and share any costs, which would be passed on to the client. We did discuss the possibility of having it sent off for a deeper recovery option

When it comes to recovery of files from a healthy drive, there's no such thing as 'deeper recovery', unless we're talking about data that was 'unmapped' from the LBA user space due to, for example, TRIM commands being sent to the drive and the drive acting on them. If we're talking purely logical recovery where no TRIM-like mechanisms are involved, then there's not a whole lot a data recovery company can do that you can't, unless we take very specific scenarios into consideration. Some tools may refer to 'deep recovery' or 'deep scan', but what they mostly mean by that is raw recovery, or signature-based recovery, rather than file-system-based recovery.

In general you want to avoid such 'deep recoveries' because they come with several disadvantages: files are recovered without the original directory structure and without original filenames, and almost by definition recovery of non-contiguous (fragmented) files will fail, while those same files could be recoverable if we, for example, assume NTFS and take the file system metadata into consideration. These deep or raw recoveries should be last-resort attempts. Now, you could argue that when you're dealing with, say, lost photos on a memory card, a raw recovery isn't the end of the world, and I agree.
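To make 'signature based' concrete: a raw scan just walks the image looking for known headers, for example JPEGs start with the bytes FF D8 FF, and carves from there with no idea of filenames, folders, or fragmentation. You can see the same idea with nothing but GNU grep against an image file (drive-copy.img is an example name, and this is only an illustration, not a recovery method):

grep -obUaP '\xff\xd8\xff' drive-copy.img | head

That prints the byte offsets of candidate JPEG headers; a carving tool does essentially this for dozens of file types and then has to guess where each file ends.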

That said, I am not arguing you should never send a drive to a lab, even when you assume it's a purely logical recovery:
- You never know for 100% certain whether the issue is purely logical or some underlying hardware issue is lurking. No tech wants to be in the position where he has to call the client who asked him to recover some deleted files and tell them the drive just failed during the attempt.
- A lab will follow standard protocol; for example, it will always first image/clone a drive using specialized hardware/software, and from that point on only work with the clone. Even if you're not a lab, this is good practice: always clone/image the patient drive, no matter how trivial and minor the issue seems.
- An experienced lab tech will have several software options available to him and will pick the one best suited to the scenario at hand, or simply try multiple and pick the one that gives the best results. Remember, if you work with the clone, you can try over and over without additional risk.
- A lab tech often has peers, friendly labs, whom he knows to be specialists in certain types of scenarios (think complex RAID recovery, video recovery, specific file systems such as ZFS, etc.).
 
- A lab will follow standard protocol; for example, it will always first image/clone a drive using specialized hardware/software, and from that point on only work with the clone. Even if you're not a lab, this is good practice: always clone/image the patient drive, no matter how trivial and minor the issue seems.
I used to always joke, "Clone first, ask questions later." Even if you don't think there is a risk to the client's data, clone anyway. For the few extra minutes of waiting you'll have to do, I will tell you with certainty that you WILL eventually thank yourself for doing so.

Even if the hard drive is in 100% perfect shape, it could still die on the operating table simply by existing. It is one thing to tell a client that the drive died on your desk with no prior indicators; it would be another to tell them that the data died with it.

ALSO: If you do it on your own, make a copy of the image and store it somewhere safe (heck, make two copies). Do not touch that copy unless you bork the original image, and only then do you make another copy from it. Never work from the original source drive, and never work on the only image.
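A cheap way to enforce the "don't touch the master" rule, assuming an image called patient.img on the working drive and a second drive mounted at /mnt/second (example names), is to checksum the master right after imaging and work only on a copy:

sha256sum patient.img > patient.img.sha256
cp patient.img /mnt/second/patient-work.img
sha256sum -c patient.img.sha256

Work only on patient-work.img; if the working copy ever gets mangled, re-copy it from the master, and the checksum check confirms the master itself is still intact.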
 
I don't know whether the drive is healthy right now, so I will bring up possibilities when dealing with uncertain details.

Just to be clear on the cloning option: is this just to ensure the drive and any files currently on it are safe, or does this also safeguard some of the files you can potentially recover? On this particular job I am currently under the impression that the current data and state of the drive are irrelevant, in which case, while cloning is best practice, it would be unnecessary here if it does not capture the data I am after. I would figure not all cloning is equal, and some tools do not capture more than the currently usable data.
 
To be clear, by imaging we're talking about a block-by-block copy. Traditional backup software usually doesn't do that.

At the end of the day the objective is to maximize the probability of data recovery, regardless of what has happened. Since the patient can die on the operating table at any time, especially if it's having "problems", it's very important to grab that image. This is why it's so important, once someone realizes a data recovery is needed, that no more attempts to turn the drive on are made until it's imaged. And note that you really need two images if the drive is in fact having hardware/firmware issues, especially if it's a solid state drive. The reason is that if you only make one image and something goes south, such as executing the wrong command or a hardware problem, you'd have to use the patient again. It's a belt-and-suspenders approach. If one knows what they're doing, the image will be accessed in read-only mode.
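On the read-only point: on Linux you can attach the image as a read-only loop device so nothing can write to it even by accident. A minimal sketch, assuming an image called patient-work.img and a mount point /mnt/case (example names):

sudo losetup --find --show -r -P patient-work.img
sudo mount -o ro /dev/loop0p1 /mnt/case

losetup prints the loop device it picked (assumed here to be /dev/loop0, with -P exposing the partitions as loop0p1, loop0p2, and so on), and -r makes it read-only. Recovery tools can then be pointed at the loop device or the mount without any risk of touching the image.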
 
I figured that was what was meant, but it's best to ensure I am understanding you all correctly. When you do two images, do you actually image the drive twice, or image it once and then back up that image?
 
Image once, make a copy, and work from the copy (on another drive). You could make a backup of your copy too.

Validation of the image, while always encouraged, isn't always the right answer. (As we move into SSD recovery, I think validation could corrupt/invalidate a backup that is experiencing data rot/bit flips. I think this could be a good debate in itself.)

And again, if the drive is potentially failing, the decision to image it prior to sending it out for recovery may be a gray area as well. If you are sending it to a data recovery specialist, I would recommend asking them for their input. One thing that favors imaging before shipping a logical-only job is the chance of the drive being lost in shipment, but with a drive that may have a mechanical failure, imaging it yourself is usually asking for problems.

If you do think you'll start doing some of this stuff on your own, I would recommend picking up a write blocker/USB stabilizer of sorts as well. R-Studio full does actually include a free* one, but there are other options out there too. Especially in a case like this, making very sure Windows (or anything else) doesn't write over any more data is imperative; it takes a bit more skill yet again, but taking this to Linux might be safer too.
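If you don't have a hardware write blocker yet, a stopgap on Linux (and only a stopgap, not a substitute) is to mark the patient read-only at the kernel level before anything mounts it, assuming it shows up as /dev/sdb (example name):

sudo blockdev --setro /dev/sdb
sudo blockdev --getro /dev/sdb    # should print 1

Do this immediately after plugging the drive in, and you may want to set it on the partitions as well (/dev/sdb1, etc.). The flag resets if the drive is re-plugged, which is exactly why a real write blocker is the better answer.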
 
I figured that was what was meant, but it's best to ensure I am understanding you all correctly. When you do two images, do you actually image the drive twice, or image it once and then back up that image?
Image the patient, then make a copy of that image. The objective is to exercise the patient as little as possible. As @MudRock said, if you're sending it out, talk to the data recovery folks about you making a copy "just in case".
 
Let me add something I hope you already know to the excellent advice given above: you need to understand the tool you use to create the backup image. I use Macrium for my day-to-day backups of customer disks before working on them, but the default (for Macrium) is to only back up sectors that are in use; i.e., if there are sectors that formerly held data and are now marked 'available' for writing, Macrium will not include them in the image and you won't be able to recover the customer's data. You need to go into the settings and check the box to make Macrium back up every sector, not just the ones with live data. I think they call this a 'Forensic sector copy'. Good luck!
 
That may sound like a good idea, and about the same as what the specialized cloners like ddrescue and hddsuperclone do, but it isn't. I'd strongly recommend against using a tool like Macrium for data recovery purposes.
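The practical difference: a backup tool reads in file system order and either gives up or retries in place when it hits trouble, while the dedicated cloners grab the easy data first and come back for the hard parts, tracking progress in a mapfile. With ddrescue that's typically two passes over the same mapfile, using the same example device and paths as earlier:

sudo ddrescue -d -n /dev/sdb /mnt/dest/patient.img /mnt/dest/patient.map
sudo ddrescue -d -r3 /dev/sdb /mnt/dest/patient.img /mnt/dest/patient.map

The first pass (-n, no scraping) skips problem areas quickly; the second re-attempts only the sectors the mapfile recorded as bad, with up to three retry passes. hddsuperclone works on the same principle, with more control over timeouts and drive behavior.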
 
And again, if the drive is potentially failing, the decision to image it prior to sending it out for recovery may be a gray area as well.

This was my immediate thought as well. You could argue that imaging the patient drive is what data recovery is all about.

Once you have an image file you're in much better shape; most of a data recovery engineer's effort works towards this goal. At the same time, imaging is about the only interaction you have with the patient drive, and thus also the process where you're actually 'stressing' the drive and where the risk is. Making imaging/cloning possible is, to a large part, what professional tools are about.

So, you decide to outsource a drive to someone to do what he specializes in, for good reasons probably, and the next thing you do is put the data at risk by imaging it with tools that are less capable than what the data recovery guy has. It doesn't make sense to me.
 
So, you decide to outsource a drive to someone to do what he specializes in, for good reasons probably, and the next thing you do is put the data at risk by imaging it with tools that are less capable than what the data recovery guy has. It doesn't make sense to me.
+1. The patient could die because the tech tried to image a highly unstable drive. A DR tech can image just the required data in a sequential LBA manner, to minimize the thrashing that would take place with file backup utilities, and without wasting effort on irrelevant or empty sectors by imaging the whole drive. Disabling background tasks* and repairing firmware issues first also reduces strain on the patient, and that's not possible without DR tools.

* Edit to add: and disabling weak/dead heads.
 
If it had, as you say, a nuke and pave, you won't be able to do it yourself, and it's doubtful anybody else will be successful; you might get a few worthless files back without any structure.
 
Well, if we assume a 1 TB hard drive that was filled to maybe 50-60%, then there's plenty of data left untouched after a clean Windows install.

But it may be far from an optimal recovery, possibly/probably largely the result of a raw recovery or signature scan, with all its side effects such as the lack of file names and folder structure. In this type of scenario I recommend trying more than one tool and seeing which gives the best result. My default go-to tool is DMDE; I think Luke's is UFS Explorer. R-Studio is an option, and there are those who recommend GetDataBack specifically for this type of situation. If the client is, for example, specifically interested in JPEGs, a more specialized carving tool may skip a large portion of corrupt files and duplicates, plus avoid 'inflated' JPEGs.
 