Q&A: Windows forensics

Harlan Carvey, CISSP, is a computer security engineer and book author. He has conducted penetration tests and vulnerability assessments in support of corporate and federal government clients. He has also performed a wide range of incident response activities, and conducts computer forensics research, with specific attention to the Microsoft Windows family of operating systems. In this interview, he discusses Windows forensics, the discipline in general, and his latest book.

How has Windows forensics evolved since the days of Windows XP? What does Windows 7 bring to the table?
Microsoft has a well-established habit of changing things up for forensic analysts… look at how memory analysis changes not only between versions of Windows, but in some cases, between Service Packs. Between XP and Vista, there were changes in how some information is recorded in files on the system, in particular in the Registry and the Event Logs.

As Microsoft “evolves” the user experience and adds complexity and functionality to the operating system and applications, what we’re seeing isn’t necessarily that forensic artifacts are going away, but rather that they’re moving. As such, there’s been a great deal of research in the community to map those artifacts, but the fact remains that a great deal more is needed to understand which interactions lead to the creation or modification of a given artifact.

In your opinion, what are the most important skills that aspiring forensic examiners should be working on?

Aspiring examiners should focus on the core basics of analysis. Too many experienced examiners fall into the trap of filling in gaps in analysis and knowledge with assumption and speculation, even doing so knowingly. From the beginning, examiners need to thoroughly understand the goals of their analysis, what questions need to be answered, and from there look for all information possible to support or refute their findings. Opinions serve a limited purpose in analysis when it comes to exploring other avenues, but replacing facts and analysis with speculation and assumption is just lazy.

Also, aspiring examiners should develop within themselves the desire to stay current in their field, regardless of what training is provided. One of the issues seen within the community is that business models for incident response and computer forensic analysis do not keep up with technology: hiring someone and then providing no means whatsoever for regular, up-to-date training is still, unfortunately, the norm.

Develop your documentation skills now! Technical folks are known for not documenting what they do, but that’s also the biggest obstacle to process improvement. Thoroughly documenting acquisitions, data collection and analysis lets you go back later and pull out information that is useful or critical to a current engagement.

Finally, pick a programming language and learn how to use it to meet your needs. Too often, an examination slows down or develops gaps due to the sheer volume of data, and some ability to program helps you perform a wide variety of tasks, including automating the repetitive ones, which increases speed while reducing mistakes. Also, just learning to program helps your thought processes, teaching you to break larger tasks down into smaller ones in order to achieve an overall goal.
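As a purely illustrative sketch, here in Python, automating even something as simple as collecting file timestamps into a sortable listing takes only a few lines (the pipe-delimited output format is made up for the example):

    import os
    import sys

    def walk_timestamps(root):
        # Emit one "epoch|type|path" line per timestamp so the output
        # can be sorted numerically into a crude file system timeline.
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    st = os.stat(path)
                except OSError:
                    continue  # unreadable or vanished; skip it
                for label, ts in (("M", st.st_mtime),
                                  ("A", st.st_atime),
                                  ("C", st.st_ctime)):
                    print("%d|%s|%s" % (ts, label, path))

    if __name__ == "__main__":
        walk_timestamps(sys.argv[1])

Run against a mounted image and sorted numerically, even a throwaway script like this answers “what changed, and when?” far faster than clicking through a directory tree.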

Which Windows forensics tools would you recommend to our readers?
My favorite “forensics tool” is Cory Altheide. 😉 But seriously, I tend not to recommend commercial tools, as doing so seems to create an over-reliance on these tools, where the reliance should be on the examiner’s ability to understand the goals of the examination, as well as their ability to develop an appropriate analysis plan. The “tool” I recommend is “wet-ware”, or your brain. If you don’t know what “Registry analysis” consists of and what you’re trying to prove or disprove through this activity, then no tool, free or commercial, is going to be of any use. A builder doesn’t decide what a building will look like based on the tools that are available, and throughout history, new tools have been developed because a need was recognized and understood. The same should be true for incident response and forensic analysis – understand the need first, then choose the tool.

What is the basic forensic analysis environment one has to set up in order to learn more about this discipline?
Not a great deal is required in order to learn more about the discipline. A laptop running the basic tools is all you really need, as there are enough free tools and information available to really build up some basic analysis skills. Several sites on the Internet provide some free acquired images, and using any number of free tools, you can acquire your own images. So really, the “basic environment” consists of nothing more than the desire.
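For example, verifying the integrity of an image you’ve acquired is one of those basic skills, and it can be practiced with nothing more than the Python standard library; a minimal sketch, where the image path is simply whatever practice image you have on hand, might look like this:

    import hashlib
    import sys

    def hash_image(path, algo="md5", chunk=1024 * 1024):
        # Hash a potentially very large raw image in fixed-size chunks
        # so the whole file never has to fit in memory.
        h = hashlib.new(algo)
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()

    if __name__ == "__main__":
        print(hash_image(sys.argv[1]))

Comparing that digest before and after analysis demonstrates, and documents, that the evidence wasn’t altered.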

One of the soft skills that can really benefit any examiner is networking, particularly with other folks in the community, as well as just outside the community. Building personal networks gives you an excellent resource for sharing your own experiences, as well as a trusted facility for asking questions. It also gives you practice communicating with others, particularly those who may not be as technical as you are. Many times when triaging an incident or trying to develop a set of goals to direct my analysis, I end up working with non-technical managers, as well as people with technical specialties different from my own. To tie this back to the original question, part of that “basic forensic analysis environment” may include more than just a workstation… it may include a local Starbucks or pub.

What are some of the odd tidbits you’ve encountered on systems you’ve analyzed?
As a corporate analyst, I don’t get a great deal of time to really dig into some of the tangential artifacts I find during an examination. Once the customer’s goals have been thoroughly addressed, I may notify them of some of the ancillary findings, but I don’t often get a chance to really dig into them, as billing needs to be closed out and the data needs to be handled in accordance with the customer’s needs.

Now and again, I do find some interesting artifacts. Once during an engagement, I ran across a piece of malware that actually used the “Image File Execution Options” Registry key as its persistence mechanism; until then, I’d only seen this mentioned on anti-virus web sites, and while I understood how it worked, I’d never actually seen it being used. On another engagement, I put together a timeline of system activity based on a wide range of information sources extracted from the acquired image. In this case, I was aware that one of the pieces of malware placed on the system was in fact a Perl script “compiled” with Perl2Exe and placed on the system as an .exe file. When such a program is run, the embedded Perl modules are extracted from the .exe file to a temporary directory as DLLs, and each time the program was run, a slightly different path was used. While I had only one copy of the .exe file, I had a historical record of each time that program was run, which clearly illustrated the window of compromise.
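For anyone who wants to check a live system for that particular persistence mechanism, a minimal sketch using Python’s standard winreg module is below; the key path is the documented Image File Execution Options location, but the rest of the script is purely illustrative:

    import winreg

    IFEO = (r"SOFTWARE\Microsoft\Windows NT\CurrentVersion"
            r"\Image File Execution Options")

    def find_ifeo_debuggers():
        # Enumerate each per-executable subkey and flag any that define
        # a "Debugger" value; malware can plant its own path there so it
        # gets launched whenever the named program is run.
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, IFEO) as root:
            index = 0
            while True:
                try:
                    name = winreg.EnumKey(root, index)
                except OSError:
                    break  # no more subkeys
                index += 1
                try:
                    with winreg.OpenKey(root, name) as sub:
                        dbg, _ = winreg.QueryValueEx(sub, "Debugger")
                        print("%s -> %s" % (name, dbg))
                except OSError:
                    continue  # no Debugger value here; that's normal

    if __name__ == "__main__":
        find_ifeo_debuggers()

Legitimate entries do exist (debugging tools set them, too), so anything this turns up still has to be investigated rather than treated as an automatic indicator.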

I think that as more attention is given to areas such as memory, Registry, and timeline analysis, and as aspects of the different versions of Windows are explored (e.g., XP Restore Points), we’ll begin to find more “tidbits”. For example, some of my recent exploration into how Windows XP and Vista maintain information about wireless networking seems to indicate that when the system connects to a wireless access point, the MAC address of that WAP is recorded. This “tidbit” needs to be examined further, but it has enormous implications, particularly for law enforcement.
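On Vista, for instance, community research points to the NetworkList “Signatures\Unmanaged” key as one place this data lands; treat the key path and value names in the following Python sketch as reported findings that are still subject to the further examination I mentioned:

    import winreg

    SIGNATURES = (r"SOFTWARE\Microsoft\Windows NT\CurrentVersion"
                  r"\NetworkList\Signatures\Unmanaged")

    def dump_gateway_macs():
        # Each profiled network is reported to carry a DefaultGatewayMac
        # binary value, tied by community research to the MAC address of
        # the gateway/WAP the system connected through.
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, SIGNATURES) as root:
            index = 0
            while True:
                try:
                    name = winreg.EnumKey(root, index)
                except OSError:
                    break  # no more subkeys
                index += 1
                with winreg.OpenKey(root, name) as sub:
                    try:
                        mac, _ = winreg.QueryValueEx(sub, "DefaultGatewayMac")
                        desc, _ = winreg.QueryValueEx(sub, "Description")
                    except OSError:
                        continue  # values absent on this entry
                    print("%s : %s" % (desc, "-".join("%02X" % b for b in mac)))

    if __name__ == "__main__":
        dump_gateway_macs()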

What challenges did you encounter while writing Windows Forensic Analysis DVD Toolkit, Second Edition?
The main challenges were what information to include in the book, and how to go about including it. With the second edition, I recognized that there needed to be less focus on individual aspects of analysis (such as just memory analysis, or just Registry analysis) and more material that clearly illustrated to the reader how all of this information could be used together. That is what led me to create an entirely new chapter of nothing but “war stories” to illustrate how to correlate information from, say, Registry analysis and the file system to paint a clearer picture.

Another huge challenge is what I’m able to provide with respect to detailed information. Most folks don’t seem to realize that receiving book royalties doesn’t make an author rich, and as a one-man “shop”, I have very limited assets. In fact, my “lab” consists primarily of VMware Workstation and some VMs running on the same system I’m using to answer this email. When it comes to hardware-specific artifacts, such as FireWire or Bluetooth device information, what I can provide is rather limited.

The same applies to application-specific artifacts (i.e., Registry settings, log files, and so on). There’s only so much information that can be extracted in a limited environment; for the resulting artifacts to be of value, applications really need to be tested in the manner in which they’re actually used.

Are you satisfied with the community’s response to the book?
I don’t have much in the way of official numbers for book sales at this point, and I think it’s really more of a question of the community being happy with the book. I see the reviews posted on Amazon, and it is pretty clear that the community is responding favorably. I also receive emails directly from some folks, and would love to see those comments posted publicly somewhere.

Will there be a third edition?
We’ll see. I think that there’s a definite need for a third edition, and I think the sales numbers will bear that out, giving the publisher reason to support one. However, the limiting factors I envision right now are twofold:

1. A testing environment based solely on virtual machines is of limited use. Communications mechanisms such as 802.11n and Bluetooth need to be explored in depth; the same is true of applications. This includes not only what can be found through post-mortem analysis but also how to address issues during live response.

2. As the operating systems get more complex and things move around more, the need for credible, verifiable information about operating system and application artifacts is critical. Too much of what we discover in this field is by accident, and is verified only through testing and a modicum of “reverse engineering”. Getting some support from vendors in verifying information would be extremely beneficial to the community.

What are your future plans? Any exciting new developments?
I’d like to focus more research on Windows Vista and 7, but that’s going to be limited by the availability of hardware and software to really dig into some important aspects of both of those OSs. I see RegRipper and that family of tools, including RipXP, as being relevant for quite a while, so I’m planning to continue development in that area, dovetailing that work with the research I’m able to conduct. Also, I’ve been expanding my research into timeline development and analysis, as well as alternative methods of analysis in general.
