More About Volume Shadow Copies

Tuesday, May 8, 2012 Posted by Corey Harrell

CyberSpeak Podcast About Volume Shadow Copies


I recently had the opportunity to talk with Ovie about Volume Shadow Copies (VSCs) on his CyberSpeak podcast. It was a great experience to meet Ovie and see what it’s like behind the scenes. (I’ve never been on a podcast before and I quickly found out how tough it is to explain something technical without visuals.) The CyberSpeak episode May 7 Volume Shadow Copies is online and in it we talk about examining VSCs. In the interview I mentioned several things about VSCs, and I want to elaborate on a few of them here. Specifically, I want to discuss running the RegRipper plugins to identify volumes with VSCs, using the Sift to access VSCs, comparing a user profile across VSCs, and narrowing down the VSC comparison reports with grep.

Determining Volumes with VSCs and What Files Are Excluded from VSCs


One of my initial steps on an examination is to profile a system so I can get a better idea about what I’m facing. The information I look at includes: basic operating system info, user accounts, installed software, networking information, and data storage locations. I do this by running RegRipper in a batch script to generate a custom report containing the information I want. I blogged about this previously in the post Obtaining Information about the Operating System and I even released my RegRipper batch script (general-info.bat). I made some changes to the batch script; specifically, I added the VSC plugins spp_clients.pl and filesnottosnapshot.pl. The spp_clients.pl plugin obtains the volumes monitored by the Volume Shadow Copy service, which is an indication of what volumes may have VSCs available. The filesnottosnapshot.pl plugin gets a list of files/folders that are not included in the VSCs (snapshots). The information these plugins provide is extremely valuable to know early in an examination since it impacts how I may do things.
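For anyone who wants to run just these two plugins outside of a batch script, they can be launched individually with RegRipper's command line tool. The hive paths below are hypothetical, and each plugin's header notes which hive it expects:

```
rip.pl -r /cases/registry/SOFTWARE -p spp_clients
rip.pl -r /cases/registry/SYSTEM -p filesnottosnapshot
```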

While I’m talking about RegRipper, Harlan released RegRipper version 2.5 in his post RegRipper: Update, Road Map and further explained how to use the new RegRipper to extract info from VSCs in the excellent post Approximating Program Execution via VSC Analysis with RegRipper. RegRipper is an awesome tool and is one of the few tools I use on every single case. The new update lets RegRipper run directly against VSCs, making it even better. That’s like putting bacon on top of bacon.

Using the Sift to Access VSCs


There are different ways to access VSCs stored within an image. Two potential ways are using Encase with the PDE module or the VHD method. Some time ago Gerald Parsons contacted me about another way to access VSCs; he refers to it as the iSCSI Initiator Method. The method uses a combination of the Windows 7 iSCSI Initiator and the Sift workstation. I encouraged Gerald to do a write-up about the method but he was unable to due to time constraints. However, he said I could share the approach and his work with others. In this section of my post I’m only a ghost writer for Gerald Parsons, conveying the detailed information he provided me including his screenshots. I only made one minor tweak, which is to provide additional information about how to access a raw image besides the E01 format.

To use the iSCSI Initiator Method requires a virtual machine running an iSCSI service (I used the Sift workstation inside VMware) and the host operating system running Windows 7. The method involves the following steps:

Sift Workstation Steps

1. Provide access to image in raw format
2. Enable the SIFT iSCSI service
3. Edit the iSCSI configuration file
4. Restart the iscsitarget service

Windows 7 Host Steps

5. Search for iSCSI to locate the iSCSI Initiator program
6. Launch the iSCSI Initiator
7. Enter the Sift IP Address and connect to image
8. Examine VSCs

Sift Workstation Steps


1. Provide access to image in raw format

A raw image needs to be available within the Sift workstation. If the forensic image is already in the raw format and is not split then nothing else needs to be done. However, if the image is a split raw image or is in the E01 format then one of the following commands needs to be used so a single raw image is available.

Split raw image:

sudo affuse path-to-image mount_point

E01 format:

sudo mount_ewf.py path-to-image mount_point
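As a worked example, exposing an E01 image as a single raw file might look like this (the image path and mount point are hypothetical):

```
mkdir /mnt/ewf_mount
sudo mount_ewf.py /cases/system.E01 /mnt/ewf_mount
ls /mnt/ewf_mount    # the single raw image file appears here
```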

2. Enable the SIFT iSCSI service

By default, in Sift 2.1 the iSCSI service is turned off so it needs to be turned on. The false value in the /etc/default/iscsitarget configuration file needs to be changed to true. The command below uses the Gedit text editor to accomplish this.

sudo gedit /etc/default/iscsitarget

(Change “false” to “true”)
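Assuming the variable name used by the iscsitarget package on Sift 2.x, the file should contain this line after the edit:

```
ISCSITARGET_ENABLE=true
```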


3. Edit the iSCSI configuration file

The iSCSI configuration file needs to be edited so it points to your raw image. Edit the /etc/ietd.conf configuration file by performing the following (the first command opens the config file in the text editor Gedit):

sudo gedit /etc/ietd.conf

Comment out the following line by adding the # symbol in front of it:

Target iqn.2001-04.com.example:storage.disk2.sys1.xyz

Add the following two lines (the date portion, 2011-04, can be whatever you want, but make sure the image path points to your raw image):

Target iqn.2011-04.sift:storage.disk
Lun 0 Path=/media/path-to-raw-image,Type=fileio,IOMode=ro
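Putting the two edits from this step together, the relevant section of /etc/ietd.conf should look roughly like this (the image path below is hypothetical):

```
#Target iqn.2001-04.com.example:storage.disk2.sys1.xyz
Target iqn.2011-04.sift:storage.disk
Lun 0 Path=/mnt/ewf_mount/system.raw,Type=fileio,IOMode=ro
```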


4. Restart the iscsitarget service

Restart the iSCSI service with the following command:

sudo service iscsitarget restart


Windows 7 Host Steps


5. Search for iSCSI to locate the iSCSI Initiator program

Search for the Windows 7 built-in iSCSI Initiator program


6. Launch the iSCSI Initiator

Run the iSCSI Initiator program

7. Enter the Sift IP Address and connect to image

The Sift workstation will need a valid IP address and the Windows 7 host must be able to connect to the Sift using it. Enter the Sift’s IP address then select Quick Connect.


A status window should appear showing a successful connection.


8. Examine VSCs

Windows automatically mounts the forensic image’s volumes to the host after a successful iSCSI connection to the Sift. In my testing it took about 30 seconds for the volumes to appear once the connection was established. The picture below shows Gerald’s host system with two volumes from the forensic image mounted.


If there are any VSCs on the mounted volumes then they can be examined with your method of choice (cough cough Ripping VSCs). Gerald provided additional information about how he leverages Dave Hull’s post Plotting photo location data with Bing and Cheeky4n6Monkey’s post Diving in to Perl with GeoTags and GoogleMaps to extract metadata from the images in all the VSCs to create maps. He extracts the metadata by running the programs from the Sift against the VSCs.
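As an illustration of one method of choice, once the image's volume is mounted on the host the standard Windows commands for listing and linking VSCs apply (the drive letter and shadow copy index below are hypothetical):

```
vssadmin list shadows /for=F:
mklink /d C:\vsc1 \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1\
```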

Another cool thing about the iSCSI Initiator Method (besides being another free solution to access VSCs) is the ability to access the Sift iSCSI service from multiple computers. In my test I connected a second system on my network to the Sift iSCSI service while my Windows 7 host system was connected to it. I was able to browse the image’s volumes and access the VSCs at the same time from my host and the other system on the network. Really cool…. When finished examining the volumes and VSCs then you can disconnect the iSCSI connection (in my testing it took about a minute to completely disconnect).


Comparing User Profile Across VSCs


I won’t repeat everything I said in the CyberSpeak podcast about my process to examine VSCs and how I focus on the user profile of interest. Focusing on the user profile of interest within VSCs is very powerful because it can quickly identify interesting files and highlight which files/folders a user accessed. Comparing a user profile or any folder across VSCs is pretty simple to do with my vsc-parser script and I wanted to explain how to do this.

The vsc-parser is written to compare the differences between entire VSCs. In some instances this may be needed. However, if I’m interested in what specific users were doing on a computer then the better option is to only compare the user profiles across VSCs since it’s faster and provides me with everything I need to know. You can do this by making two edits to the batch script that does the comparison. Locate the batch file named file-info-vsc.bat inside the vsc-parser folder as shown below.


Open the file with a text editor and find the function named :files-diff. The function executes diff.exe to identify the differences between VSCs. There are two lines (lines 122 and 129) that need to be modified so the file path reflects the user profile. As can be seen in the picture below the script is written to use the root of the mounted image (%mount-point%:\) and VSCs (c:\vsc%%f and c:\vsc!f!).


These paths need to be changed so they reflect the user profile location. For example, let's say we are interested in the user profile named harrell. Both lines just need to be changed to point to the harrell user profile. The screenshot below now shows the updated script.
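As a rough sketch of the change (treat these paths as illustrative; on an XP system the profile would be under Documents and Settings instead of Users):

```
before:  "%mount-point%:\"               c:\vsc%%f                 c:\vsc!f!
after:   "%mount-point%:\Users\harrell"  c:\vsc%%f\Users\harrell   c:\vsc!f!\Users\harrell
```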


When the script executes diff.exe, the comparison reports are placed into the Output folder. The picture below shows the reports for comparing the harrell user profile across 25 VSCs.


Reducing the VSCs Comparison Reports


When comparing a folder such as a user profile across VSCs there will be numerous differences that are not relevant to your case. One example could be the activity associated with Internet browsing. The picture below illustrates this by showing the report comparing VSC12 to VSC11.


The report showing the differences between VSC12 and VSC11 had 720 lines. Looking at the report you can see there are a lot of lines that are not important. A quick way to remove them is to use grep.exe with the -v switch to only display non-matching lines. I wanted to remove the lines in my report involving the Internet activity. The folders I wanted to get rid of were: Temporary Internet Files, Cookies, Internet Explorer, and History.IE5. I also wanted to get rid of the activity involving the AppData\LocalLow\CryptnetUrlCache folder. The command below shows how I stacked my grep commands to remove these lines and saved the output into a text file named reduced_files-diff_vsc12-2-vsc11.txt.

grep.exe -v "Temporary Internet Files" files-diff_vsc12-2-vsc11.txt | grep.exe -v Cookies | grep.exe -v "Internet Explorer" | grep.exe -v History.IE5 | grep.exe -v CryptnetUrlCache > reduced_files-diff_vsc12-2-vsc11.txt
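The same filtering can also be collapsed into a single grep call with an extended regex. The sketch below stands in a few made-up report lines for the real diff output so the effect is visible end to end:

```shell
# Sample lines standing in for the real files-diff_vsc12-2-vsc11.txt report:
printf '%s\n' \
  'Only in c:\vsc12\Users\harrell\Cookies: user@ads[1].txt' \
  'Only in c:\vsc12\Users\harrell\Local Settings\History.IE5: index.dat' \
  'Only in c:\vsc12\Users\harrell\Desktop: Invoice-#233-Staples-Office-Supplies.doc' \
  > files-diff_vsc12-2-vsc11.txt

# One grep -Ev with alternation replaces the five stacked grep -v calls:
grep -Ev 'Temporary Internet Files|Cookies|Internet Explorer|History\.IE5|CryptnetUrlCache' \
  files-diff_vsc12-2-vsc11.txt > reduced_files-diff_vsc12-2-vsc11.txt

cat reduced_files-diff_vsc12-2-vsc11.txt
# -> only the Invoice line survives
```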

I reduced the report from 720 lines to 35. It’s good practice to look at the report again to make sure no obvious lines were missed before running the same command against the other VSC comparison reports. Stacking grep commands to reduce the amount of data to look at makes it easier to spot items of potential interest such as documents or Windows link files. It’s pretty easy to see that the harrell user account was accessing a Word document template, an image named staples, and a document named Invoice-#233-Staples-Office-Supplies in the reduced_files-diff_vsc12-2-vsc11.txt report shown below.


I compare user profiles across VSCs because it’s a quick way to identify data of interest inside VSCs, regardless of whether that data is images, documents, user activity artifacts, email files, or anything else that may be stored inside a user profile or that a user account accessed.


Practical Malware Analysis Book Review

Thursday, May 3, 2012 Posted by Corey Harrell
There are times when I come across malware on systems. It happens whether I’m helping someone with computer troubles, processing a DFIR case, or providing assistance on a security incident. It seems as if malware is frequently lurking beneath the surface. Occasionally I thought it might be helpful to know not only what the malware on those systems was up to but also what the malware was incapable of doing. Practical Malware Analysis breaks down the art of analyzing malware so you can better understand how it works and what its capabilities are. PMA is an excellent book and I highly recommend it for the following reasons: understanding malware better, training, and extending test capabilities.

Understanding Malware Better


A very telling quote from the book’s opening is “when analyzing suspected malware, your goal will typically be to determine exactly what a particular suspect binary can do, how to detect it on your network, and how to measure and contain its damage”. Practical Malware Analysis shows how to meet that goal by outlining a process to follow and the tools to use. Part 1 covers basic analysis, demonstrating how to better understand a program’s functionality using basic static and dynamic analysis. Part 2 builds on that by diving deeper into static analysis through the malware’s assembly code. Part 3 continues with an advanced dynamic analysis technique: debugging. The book is written in a way that makes it fairly easy to follow along and understand the content about the analysis techniques. The later sections in the book (Part 4 Malware Functionality, Part 5 Anti-Reverse-Engineering, and Part 6 Special Topics) provide a wealth of information about malware and what someone may encounter during their analysis.

I don’t foresee myself becoming a malware reverse engineer. This wasn’t what I had in mind when I started reading PMA. My intentions were to learn the techniques in PMA so I could be better at my DFIR job. To quickly get intelligence when I’m examining an infected system to help explain what occurred. To be able to rule out malware located on systems from being accused of the reason why certain actions happened on a system. PMA went beyond my expectations and I can honestly say I’m better at my job because I read it.

Training


Practical Malware Analysis follows the No Starch practical approach, which is to reinforce content by providing data the reader can analyze as they follow along. The book provides a wealth of information about analyzing malware then follows it up with about 57 labs. The authors indicated they wrote custom programs for the book, which means there are a lot of samples to practice the malware analysis techniques on. The labs are designed so the reader has to answer specific questions by analyzing a sample, and afterwards the solutions can be referenced to see the answers. A cool thing about the solutions is that there are short and long versions. The short versions only provide the answers while the long versions walk the reader through the analysis, demonstrating how the answers were obtained. The combination of the content, labs, samples, and solutions makes PMA a great self-training resource.

PMA contains so much information it’s one of those books where people can keep going back to review specific chapters. I can see myself going back over numerous chapters and redoing the labs as a way to train myself on malware analysis techniques. PMA is not only a great reference to have available when faced with malware but it’s even a greater training resource to have regular access to.

Extending Test Capabilities


The process and techniques described in PMA can be used for other analysis besides understanding malware. A friend of mine who was also reading the book (when I was working my way through it) had to take a look at a program someone in his organization was considering using. Part of his research into the program was to treat it like malware, and he used some of the techniques described in PMA. The information he learned about the program by incorporating malware analysis techniques into his software testing process was very enlightening. I borrowed his idea and started using some PMA techniques as part of my process when evaluating software or software components. I already used it on one project and it helped us identify the networking information we were looking for. The process and tools discussed in the book helped my friend and myself extend our software testing capabilities, so it stands to reason it could do the same for others.

Five Star Review


PMA is another book that should be within reaching distance in anyone’s DFIR shop. I went ahead and purchased PMA hoping the book would improve my knowledge and skills when faced with malware. What I ended up with was knowledge, a process and tools I can use to analyze any program I encounter. PMA gets a five star review (5 out of 5).

One area I thought could be improved with PMA was providing more real life examples. It would have been helpful if the authors shared more of their real life experiences about analyzing malware or how the information obtained from malware analysis helped when responding to an incident. I think sharing past experiences is a great way to provide more context since it lets people see how someone else approached something.

Cleaning Out the Linkz Hopper

Wednesday, April 25, 2012 Posted by Corey Harrell
Volume Shadow Copies has been my main focus on the blog for the past few months. I took the time needed to share my research because I wanted to be thorough so others could use the information. As a result, the interesting linkz I’ve been coming across have piled up in my hopper. In this Linkz post I’m cleaning out the hopper. There are linkz about: free DFIR e-magazines, volume shadow copies, triage, timeline analysis, malware analysis, malware examinations, Java exploits, and an interesting piece on what you would do without your tools. Phew …. Let’s roll

Into The Boxes has Returned


Into The Boxes is an e-magazine discussing topics related to Digital Forensics and Incident Response. When the magazine was first released a few years ago I saw instant value in something like this for the community. A resource that not only provides excellent technical articles about DFIR but also complements what is already out there in the community. I really enjoyed the first two editions but a third issue was never released…. That is until now. The ITB project is back up and running as outlined in the post Into The Boxes: Call for Collaboration 0x02 – Second Try.

It looks like ITB won’t be the only free DFIR magazine on the block. Lee Whitfield is starting up another free magazine project called Forensic 4cast Magazine. His magazine will also be discussing topics related to Digital Forensic and Incident Response.

It’s great to see projects like these but they will only be successful with community support such as feedback and, more importantly, writing articles. Without support, efforts like these will go to where great ideas go to die. I’m willing to step up to the plate to be a regular contributor of original content. I’ll be writing for ITB and my first article discusses how to find out how a system was infected after I.T. tried to clean the infection. Cleaning a system makes it harder to answer the question of how but it doesn’t make it impossible. Stay tuned to see what artifacts are left on a cleaned system in an upcoming ITB edition.

RegRipper Plugins Maintenance Perl Script


This link is cool for a few reasons. Some time ago Cheeky4n6Monkey sent me an email introducing himself and asking if I had any project ideas. I knew who Cheeky was even before his introductory email because I’ve been following his outstanding blog. I thought this was really cool; he is looking to improve his DFIR skills by reaching out and helping others. He isn’t taking a passive approach waiting for someone to contact him; he is doing the complete opposite. I went over my idea hopper and there was one thing that had been on my to-do list for some time. At times I wanted to review the RegRipper profiles to update the plugins listed. However, I didn’t want to manually review every plugin to determine what the profile was missing. A better approach would be to flag each plugin not listed, which would reduce the number of plugins I had to manually review. I mentioned the idea to Cheeky and he ran with it. Actually he went warp speed with the idea because he completed the script within just a few days. To learn more about his script and how to use it check out the post Creating a RegRipper Plugins Maintenance Perl Script.

VSC Toolset


The one thing I like about the DFIR community is the people who willingly share information. Sharing information not only educates us all thus making us better at our jobs but it provides opportunities for others to build onto their work. Case in point, I didn’t start from scratch with my Ripping VSCs research since I looked at and built on the work done by Troy Larson, Richard Drinkwater, QCCIS, and Harlan. I was hoping others would take the little research I did and take it another step forward. That is exactly what Jason Hale from Digital Forensics Stream did. Jason put together the VSC Toolset: A GUI Tool for Shadow Copies and even added additional functionality as outlined in the post VSC Toolset Update. The VSC Toolset makes it extremely easy for anyone to rip VSCs and to add additional functionality to the tool. Seriously, it only takes one line in a batch file to extend the tool. Jason lowered the bar for anyone wanting to examine VSCs using this technique.

Triage Script


When I put together the Tr3Secure Data Collection script I was killing two birds with one stone. First and foremost, the script had to work when responding to security incidents. Secondly, the script had to work for training purposes. I built the script using two different books so people could reference them if they had any questions about the tools or the tools’ output. As such, the one limitation with the Tr3Secure Data Collection script is it doesn’t work remotely against systems. Michael Ahrendt (from Student of Security) released his Automated Triage Utility and has since updated his program. One capability the Automated Triage Utility has is being able to run against remote systems. To see how one organization benefited from Michael’s work check out Ken Johnson’s (from Random Thoughts of Forensic) post Tools in the Toolbox – Triage. If you are looking for triage scripts to collect data remotely then I wouldn’t overlook Kludge 3.0. The feedback about Kludge in the Win4n6 Yahoo group has been very positive.

HMFT – Yet Another $MFT extractor


Speaking about triage, Adam over at Hexacorn recently released his HMFT tool in the post HMFT – Yet Another $MFT extractor. I was testing out the tool and it grabbed an MFT off a live Windows 7 32-bit Ultimate system within a few seconds. One area where I think HMFT will be helpful is in triage scripts. Having the ability to grab a MFT could provide useful filesystem information, including the ability to see activity on a system around a specific time of interest. I plan on updating the Tr3Secure Data Collection script to incorporate HMFT.

Strings for Malware Analysis


While I’m talking about Adam, I might as well mention another tool he released. Some time ago he released HAPI – API extractor. The tool will identify all the Windows APIs present in a file’s strings. I’ve been working my way through Practical Malware Analysis (expect a full review soon) and one of the steps during static analysis is reviewing a file’s strings. Identifying the Windows APIs in strings may give a quick indication about the malware’s functionality, and HAPI makes it so much easier to find the APIs. I added the tool to my toolbox and it will be one of the tools I run whenever I’m doing static analysis against malware.
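A rough manual version of that strings review can be done with standard tools. The snippet below builds a tiny stand-in "binary" (since no real sample is included here) and pulls API-looking names out of it; the file name and API list are purely illustrative:

```shell
# Build a stand-in binary with API names embedded between null bytes:
printf 'MZ\000\220CreateFileA\000URLDownloadToFileA\000junk\000RegSetValueExA\000' > suspect.exe

# Extract printable strings and keep the ones that look like Windows APIs:
strings -a suspect.exe | grep -E 'CreateFile|RegSetValue|URLDownload|WinExec' | sort -u
```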

Need for Analysis on Infected Systems


Harlan recently discussed the need to perform analysis on infected systems as a means to gather actionable intelligence. His first post where this was mentioned was The Need for Analysis in Intelligence-Driven Defense while the second one was Updates and Links. Harlan made a lot of great points in both posts besides the need to analyze infected systems, and they are both definitely worth the read. I’ve heard discussions among digital forensic practitioners about performing analysis on infected systems to determine how the infection occurred. A few responses included: it’s too hard, too time consuming, or most of the time you can’t tell how the infection occurred. People see the value in the information learned by performing an examination but there is no follow through by actually doing the exam. It makes me wonder if one of the roadblocks is that people aren’t really sure what they should be looking for since they don’t know what the Attack Vector Artifacts look like.

NTFS INDX Files


Some time ago William Ballenthin released his INDXParse script that can be used to examine NTFS INDX files. To get a clearer picture about the forensic significance of INDX files you can check out Chad Tilbury’s post NTFS $I30 Index Attributes: Evidence of Deleted and Overwritten Files in addition to the information provided by William Ballenthin. INDXParse comes with an option to use a bodyfile as the output (-b switch) and this can be used to add the parsed information to a timeline. Don’t forget that next week William Ballenthin is presenting about his INDX files research in a DFIROnline special edition.

Colorized Timeline Template


Rob Lee created a timeline template to automate colorizing a timeline when imported into Excel. His explanation about the template can be found in his post Digital Forensic SIFTing: Colorized Super Timeline Template for Log2timeline Output Files. Template aside, the one thing I like about the information Rob shared is the color coding scheme to group similar artifacts. To name a few: red for program execution, orange for browser usage, or yellow for physical location. Using color in timelines is a great idea and makes it easier to see what was occurring on a system with a quick glance.

Checklist to See If A System's Time Was Altered


Rounding out the posts about time is Lee Whitfield’s slide deck Rock Around the Clock. In the presentation, Lee talks about numerous artifacts to check to help determine if the time on the system was altered. After reading over his slides, I found the information he provided makes a great checklist one could follow if a system’s time comes into question. The next time I need to verify if someone changed the system clock I’ll follow these steps as outlined by Lee. I copied and pasted my personal checklist, so if any information listed below didn’t come from Lee’s slide deck then I picked it up from somewhere else.

        - NTFS MFT entry number
                * New files are usually created in sequence. Order files by creation time then by identifier. Small discrepancies are normal but large ones require further investigation

        - Technology Advancement
                * Office, PDF, Exif images, and other items' metadata show program used to create it. Did the program exist at that time?

        - Windows Event Logs
                * Put the event log records in order, then review the date/time stamps that are out of order
                * XP: Event ID 520 in the security log, "the system time was changed" (off by default). Vista/7: Event ID 1 in the system log, "the system time has changed to ...", and Event ID 4616 in the security log, "the system time was changed"

        - NTFS Journal
                * Located in the $J stream of $UsnJrnl and may hold a few hours or days of data. Entries are stored sequentially

        - Link files
                * In XP each link file has a sequence number (fileobjectid). Sort by creation date then review the sequence numbers

        - Restore Points
                * XP restore points named sequentially. Sort by creation date then review RP names for out of sequence

        - Volume Shadow Copies
                * VSC GUIDs are similarly named for specific times
                * Sort by creation date and then review the VSC names to identify ones out of place

        - Web pages (forums, blogs, or news/sports sites)
                * Cached web pages may have date/time

        - Email header

        - Thumbnails
               * XP has one repository for each folder and Vista/7 have one for all folders. Both store items sequentially.
               * Sort by file offset order then review for out of place dates
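The MFT entry number check at the top of the list can be sketched with a TSK bodyfile (fls -m output), where the inode field holds the MFT entry number and field 11 holds the creation time. The three sample lines below are made up to show an out-of-sequence entry:

```shell
# Fake bodyfile lines: MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime
printf '%s\n' \
  '0|/Users/a.txt|1201|r/rrwxrwxrwx|0|0|12|0|0|0|1335000000' \
  '0|/Users/odd.txt|90514|r/rrwxrwxrwx|0|0|12|0|0|0|1335000050' \
  '0|/Users/b.txt|1202|r/rrwxrwxrwx|0|0|12|0|0|0|1335000100' \
  > bodyfile.txt

# Sort by creation time, then print crtime, MFT entry number, and name;
# the 90514 jump between 1201 and 1202 is the kind of gap worth a closer look:
sort -t'|' -k11 -n bodyfile.txt | awk -F'|' '{print $11, $3, $2}'
```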

Attackers Are Beating Java Like a Red Headed Stepchild


I don’t have much narration about Java exploits since I plan on blogging about a few case experiences involving it. I had these links under my exploits category and wanted to get rid of them so I can start fresh. Towards the end of last year a new Java vulnerability was being targeted and numerous attacks started going after it. DarkReading touched on this in the article The Dark Side Of Java and Brian Krebs did as well in the post New Java Attack Rolled Into Exploit Kits. The one interesting thing about the new Java attack from the DFIR perspective is it looks the same on a system as other Java exploits going after different vulnerabilities. It’s still good to be informed about what methods the attackers are using. Another link about Java was over at the Zscaler Threatlab blog. There’s an excellent write-up showing how a Java Drive-by Attack looks from the packet capture perspective.

What Can You Do Without Your Tools


The Security Shoggoth blog's post Tools and News provided some food for thought. The post goes into more depth on the author’s tweet: Want to find out how good someone is? Take away all their tools and say, "Now do it.". When I first got started in DFIR I wanted to know the commercial tool I had available inside and out. I learned as much as I could about the tool except learning how to write enscripts. Then one day I thought to myself, could I do forensics for another shop if they didn’t have Encase, and the answer was unfortunately no. I think there are a lot of people in our field who fall into the one-commercial-tool boat. They can do wonders with their one tool but if they don’t have access to it or if the tool can’t do something then they get stuck. I made the decision to improve my knowledge and skills so I could do my job regardless of the tools I had available. The change didn’t happen overnight and it took dedication to learn how to do my job using various tools for each activity. Try to answer two of the questions the author mentioned in his post; if you are unable to fully answer them then at least you know an area needing improvement.

Imagine for a moment that you didn't have the tool(s) you use most in your job - how would you perform your job? What alternatives are available to you and how familiar are you with them?

Improvise Adapt Overcome

Tuesday, April 10, 2012 Posted by Corey Harrell

Everybody has a story about how they became involved in DFIR. Showing the different avenues people took to reach the same point can be helpful to others trying to break into the field. I’ve been thinking about my journey and the path that led me to become the forensicator I am today. This is my story …

My story doesn’t start with me getting picked up by another DFIR team, being shown the reins by an experienced forensicator, or being educated in a digital forensic focused curriculum. My story starts many years ago when I took the oath and became a United States Marine. The Marines instilled into me the motto: improvise, adapt, and overcome. When I was in the Marines, I didn’t get the newest equipment, the latest tools, or other fancy gadgets. Things happen and it was not always the best of circumstances but I had to make do with what I had by improvising, adapting, and overcoming. This motto was taught to me when I first entered the Corps. Gradually it became a part of who I was; it became second nature when I was faced with any kind of adversity. Reflecting back on my journey I can easily see I ended up in DFIR by improvising, adapting, and overcoming the various situations I found myself in. Before I discuss those situations I think it’s necessary to define what exactly the Marines’ motto means:



Improvise: leverage the knowledge and resources available. You need to be creative to solve the situation you are going through.

Adapt: adjust to whatever situation you are facing. Whether it’s things not going as planned, a lack of resources, issues with employment, or just adversity while doing your job - whatever happens, you need to make adjustments and adapt to the situation at hand.

Overcome: prevail over the situation. With each situation conquered you come out more knowledgeable and in a better position to handle future adversity.

Did I Take the Wrong Job?


I was first exposed to the information security field in my undergraduate coursework and the field captivated my interest. However, at the time security jobs in my area were scarce so I opted to go into I.T. One of my first jobs after I graduated did not offer the most ideal conditions, and I picked up on this on my first day. A few hours were spent showing me the building locations throughout the city, introducing me to a few people, and pointing out my desk. That was it; there was no guidance on what was expected of me, no explanation of the network, no training, etc. In addition, hardly any resources were provided for us to do our jobs. To illustrate, we needed some basic equipment (cabling, crimpers, connectors, …) so I did some research and identified the most cost effective equipment, which came in around $300. My purchase request was denied, so I narrowed the equipment down to the bare minimum for a cost of about $70. This was still denied since it was $70 too much. This lack of support went across the board for everything in our office. You were asked to do so many things, but virtually no support was provided to make you successful. As I mentioned before, these were not the most ideal working conditions.

I adapted to the environment by dedicating my own resources to increasing my skillset and knowledge. I didn’t have access to a budget so I learned how to use free and open source software to get the job done. I couldn’t rely on any outside help so I used my problem solving skills to find my own answers to problems or to come up with my own solutions. Within a short period of time I went from questioning my decision to take the job to managing the entire Windows network. I had the flexibility to try what I wanted on the network, and I even used the position to increase my security skills by learning how to secure a Windows network. In the end the job became one of the best places I ever worked and my knowledge grew by leaps and bounds.

Landed My First InfoSec Gig


The way I improvised, adapted, and overcame the issues I faced at a previous employer helped me land my first information security position. I joined a network security unit within an organization’s auditing department. My initial expectation was to bring my technical expertise to the table to help perform security assessments against other New York State agencies. My first week on the job I encountered my first difficulty. The other technical person I was supposed to work with resigned, and his last week was my first week. My other co-worker was an auditor, so I didn’t have a technical person to bring me up to speed on what I needed to do. Adapting to this situation was easier because of the resources my organization provided me. I had at my disposal: books, Internet, a test network, servers, clients, great supervisors, access to previously completed work, and time. In addition to these resources, I drew on my years of experience in IT and the information security knowledge I gained in my Windows admin days. Over time I increased my knowledge about information security (at management and technical levels) and I honed my skills in performing security assessments. On my first engagement, where I helped come up with the testing methodology against an organization, we were highly successful. Within an extremely short period of time we had full control over their network and the data stored on it.

Welcome to DFIR


As I said, I’m in a security unit within an auditing department. One activity other units in my department perform is conducting fraud audits. As a result, at times auditors need assistance not only with extracting electronic information from networks but also with validating if and how a fraud is occurring. I was tasked with setting up a digital forensic process to support these auditors even though I didn’t have any prior experience. I accepted the challenge, but I didn’t take it lightly because I understood the need to do forensics properly. I first drew on the evidence-handling experience I gained when I managed the video cameras not only mounted in vehicles but scattered throughout the city. I also reached out to a friend who was a LE forensicator, in addition to using the other resources I had available (training, books, Internet, test network, and time). I overcame the challenge of setting up a digital forensic process from scratch. I established a process that went from supporting just my department to numerous departments within my organization, a process capable of handling cases ranging from fraud to investigations to a sprinkle of security incidents.

Improvise – Adapt – Overcome


The Marines instilled in me how to overcome adversity in any type of situation. This mentality stayed with me as I moved on to other things in life, and it was a contributing factor in how I ended up working in DFIR. Whenever you are faced with adversity, just remember Gunny Highway’s words: improvise, adapt, and overcome.


Forensic4cast Awards


Forensic4Cast released the 2012 award nominees. I was honored to see my name listed among the nominees (blog of the year and examiner of the year). I am in outstanding company with Melia Kelley (Girl, Unallocated) and Eric Huber (A Fistful of Dongles), both of whom write outstanding blogs. For Examiner of the Year I’m accompanied by Kristinn Gudjonsson (log2timeline literally changed how I approach timelines) and Cindy Murphy, whose everyday efforts are improving our field. Both of these individuals are very deserving of this award. It’s humbling to see my work reflected in the Forensic4Cast awards, especially since it was only about four years ago when my supervisor’s simple request launched me into the DFIR community. I want to say thank you to those who nominated me, and to encourage anyone who hasn’t voted for the nominees to do so. People have put in a lot of their own time and resources to improve our community and they deserve to be recognized for their efforts.

Tale as Old as Time: Don’t Talk To Strangers

Sunday, April 1, 2012 Posted by Corey Harrell 3 comments
I was enjoying my Saturday afternoon doing various things around the house when my phone started ringing; the caller ID showed it was from out of the area. I usually ignore these types of calls, but I answered this time because I didn’t want the ringing to wake my boys up from their nap. Dealing with a telemarketer is a lot easier than two sleep-deprived kids.

Initially when I answered there were a few seconds of silence, and then the line started ringing. My thought was “wait a minute, who is calling whom here?” A female voice with a heavy accent picked up the phone; I immediately got flashbacks from my days dealing with foreign call centers when I worked in technical support. Then our conversation started:

Me: “Hello”
Female Stranger: “Is this Corey Harrell?”
Me: “Yes … who’s calling?”
Female Stranger: “This is Christina from Microsoft Software Maintenance Department calling about an issue with your computer. Viruses can be installed on computers without you knowing about it.”
Me: “What company are you with again?”
Female Stranger said something that sounded like “Esolvint”
Me in a very concerned tone: “Are you saying people can infect my computer without me even knowing it?”
Female Stranger: “Yes and your computer is infected.”

I knew immediately this was a telephone technical support scam, but I stayed on the line and pretended I knew nothing because I wanted to get first-hand experience about how these criminals operate. Conversation continued:

Female Stranger: “Are you at your computer?”
Me: “Yes”
Female Stranger: “Can you click the Start button then Run”
Me: “Okay …. The Start button then what? Something called Run”
Female Stranger: “What do you see?"
Me: “A box”
Female Stranger: “What kind of box”
Me: “A box that says Open With”
Female Stranger: “What do you see in the Open With path?”
Me: “Nothing” (At this point I had to withhold what I saw because then she might be on to me.)
Female Stranger: “You need to open the Event viewer to see your computer is infected”
Female Stranger: “Can you type in e-v-e-n-t-v-w-r”
Me: “I just typed in e-v-e-n-t-v-w-r”
Female Stranger: “Can you spell what is showing in the Open with path”
Me: “Eventvwr”
Female Stranger: “Can you spell what is showing in the Open with path”

The Female Stranger was taking too long to get to her point. I knew she was trying to get me to locate an error…any kind of error on my computer…to convince me my computer was infected, and that from there she would walk me through steps to either give her remote access to my computer, actually infect my computer with a real virus, or get my credit card information. I ran out of patience and changed the tone of the conversation.

Me: “Why are you trying to get me to access the Windows event viewer if you are saying I’m infected? The only thing in the Event viewer showing my computer was infected would be from an antivirus program but my computer doesn’t have any installed. The event viewer won’t show that my computer is infected”
Female Stranger sticking to the script: “You need to access the event viewer ….”
Me (as I rudely cut her off): “You can stop following your script now”
Female Stranger: complete silence
Me: “I know your scam and I know you are trying to get me to either infect my computer or give you remote access to my computer….”


She then hung up. I believe she knew I was on to her. It’s unfortunate since I wish she had heard everything I had to say about how I feel about people like her who try to take advantage of others. My guess is she wouldn’t care and just moved onto the next potential victim. Could that victim be you?


I’m sharing this cautionary tale so others remember the tale as old as time…”Don’t Talk To Strangers.” Especially when it comes to your private information….especially in the cyber world. Companies will not call you about some issue with your computer. Technical support will not contact you out of the blue knowing your computer is infected (unless it’s your help desk at work). Heck … even your neighborhood Geek won’t call you knowing there is something wrong with your computer.


If someone does then it’s a scam. Plain and simple some criminal is trying to trick you into giving them something. It might be to get you to infect your computer, give them access to your computer, or provide them with your credit card information. The next time you pick up a phone and someone on the other end says there is an issue with your computer let your spidey sense kick in and HANG UP.


Information about this type of scam is discussed in more detail at:


* Microsoft’s article Avoid Tech Support Phone Scams


* Sophos’ article Canadians Increasingly Defrauded by Fake Tech Support Phone Calls


* The Guardian’s article Virus Phone Scam Being Run from Call Centers in India



Updated links courtesy of Claus from Grand Stream Dreams:

Troy Hunt's Scamming the scammers – catching the virus call centre scammers red-handed

Troy Hunt's Anatomy of a virus call centre scam


I reposted my Everyday Cyber Security Facebook page article about my experience to reach a broader audience and warn others. The writing style is drastically different than what my blog readers are accustomed to. My wife even edits the articles to make sure they are understandable and useful to the average person.

Volume Shadow Copy Timeline

Sunday, March 25, 2012 Posted by Corey Harrell 2 comments
Windows 7 has various artifacts available to help provide context about files on a system. In previous posts I illustrated how the information contained in jump lists, link files, and Word documents helped explain how a specific document was created. The first post was Microsoft Word Jump List Tidbit, where I touched on how Microsoft Word jump lists contain more information than just the documents accessed, because there were also references to templates and images. I expanded on the information available in Word jump lists in my presentation Ripping VSCs – Tracking User Activity. In addition to jump list information I included data parsed from link files, documents’ metadata, and the documents’ content. The end result was that these three artifacts were able to show, at a high level, how a Word document inside a Volume Shadow Copy (VSC) was created. System timelines are a great technique to see how something came about on a system, but I didn’t create one for my fake fraud case study. That is, until now.

Timelines are a valuable technique to help better understand the data we see on a system. The ways timelines can be used are limitless, but the one commonality is providing context around an artifact or file. In my fake fraud case I outlined the information I extracted from VSC 12 to show how a document was created. Here’s a quick summary of the user’s actions: the document was created with the BlueBackground_Finance_Charge.dotx template, Microsoft Word accessed a Staples icon, and the document was saved. Despite the wealth of information extracted about the document, there were still some unanswered questions. Where did the Staples image come from? What else was the user doing when the document was being created? These are just two questions a timeline can help answer.

The Document of Interest


Creating VSC Timelines


Ripping VSCs is a useful technique to examine VSCs but I don’t foresee using it for timeline creation. Timelines can contain a wealth of information from one image or VSC, so extracting data across all VSCs to incorporate into a timeline would be way too much information. The approach I take with timelines is to initially include only the artifacts that will help me accomplish my goals. If I see anything interesting while working my timeline I can always add other artifacts, but starting out I prefer to limit the amount of data I need to look at. (For more about how I approach timelines check out the post Building Timelines – Thought Process Behind It.) I wanted to know more about the fraudulent document I located in VSC 12 so I narrowed my timeline data to just that VSC. I created the timeline using the following five steps:

        1. Access VSCs
        2. Setup Custom Log2timeline Plug-in Files
        3. Create Timeline with Artifacts Information
        4. Create Bodyfile with Filesystem Metadata
        5. Add Filesystem Metadata to Timeline

Access VSCs


In previous posts I went into detail about how to access VSCs and I even provided references about how others access them (one post was Ripping Volume Shadow Copies – Introduction). I won’t rehash the same information, but I didn’t want to omit this step. I confirmed my VSC of interest was still numbered 12 and then created a symbolic link named C:\vsc12 pointing to the VSC.

Setup Custom Log2timeline Plug-in Files


Log2timeline has the ability to use plug-in files so numerous plug-ins can run at the same time. I usually create custom plug-in files since I can specify the exact artifacts I want in my timeline. I set up one plug-in file to parse the artifacts located inside a specific user profile while a second plug-in file parses artifacts located throughout the system. I discussed in more depth how to create custom plug-in files in the post Building Timelines – Tools Usage. However, a quick way to create a custom file is to just copy and edit one of the built-in plug-in files. For my timeline I did the following on my Windows system to set up my two custom plug-in files.

        - Browsed to the folder C:\Perl\lib\Log2t\input. This is the folder where log2timeline stores the input modules including plug-in files.

        - Made two copies of the win7.lst plug-in file. I renamed one file to win7_user.lst and the other to win7_system.lst (the files can be named anything you want).

        - Modified the win7_user.lst to only contain iehistory and win_link to parse Internet Explorer browser history and Windows link files respectively.

        - Modified the win7_system.lst to only contain the following: oxml, prefetch, and recycler. These plug-ins parse Microsoft Office 2007 metadata, prefetch files, and the recycle bin.
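As a quick sketch of the steps above, the two custom files can also be built from a shell, since a log2timeline .lst plug-in file is simply a plain-text list of input module names, one per line (run this from the input module folder, e.g. C:\Perl\lib\Log2t\input, adjusting for your install):

```shell
# Sketch: build the two custom plug-in files described above.
# A .lst plug-in file is just a plain-text list of input module
# names, one per line.
printf 'iehistory\nwin_link\n' > win7_user.lst
printf 'oxml\nprefetch\nrecycler\n' > win7_system.lst

# Sanity check the contents of both files
cat win7_user.lst win7_system.lst
```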

Create Timeline with Artifacts Information


The main reason why I use custom plug-in files is to limit the number of log2timeline commands I need to run. Had I skipped the previous step, I would have had to run five commands instead of the following two:

        - log2timeline.pl -f win7_user -r -v -w timeline.csv -Z UTC C:/vsc12/Users/harrell

        - log2timeline.pl -f win7_system -r -v -w timeline.csv -Z UTC C:/vsc12

The first command ran the custom plug-in file win7_user (-f switch) to recursively (-r switch) parse the IE browser history and link files inside the harrell user profile. The Users folder inside VSC 12 had three different user profiles, so pointing log2timeline at just the one let me avoid adding unnecessary data from the other user profiles. The second command ran the win7_system plug-in file to recursively parse Office 2007 metadata, prefetch files, and recycle bins inside VSC 12. Both log2timeline commands stored the output in the file timeline.csv with timestamps in UTC.

Create Bodyfile with Filesystem Metadata


At this point my timeline was created and it contained information from select artifacts inside VSC 12. The last item to add to the timeline is data from the filesystem. Rob Lee discussed in his post Shadow Timelines And Other VolumeShadowCopy Digital Forensics Techniques with the Sleuthkit on Windows how to use the Sleuthkit (fls.exe) to create bodyfiles from VSCs. I used the method discussed in his post to execute fls.exe directly against VSC 12 as shown below.

        - fls -r -m C: \\.\HarddiskVolumeShadowCopy12 >> bodyfile

The command made fls.exe recursively (-r switch) search VSC 12 for filesystem information and the output was redirected to a text file named bodyfile in mactime (-m switch) format.
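For readers unfamiliar with the mactime format, each bodyfile line is a pipe-delimited record: MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime, with the four timestamps in Unix epoch seconds. A rough awk sketch against a fabricated record (the invoice.docx path below is made up for illustration) shows how the fields line up:

```shell
# Fabricated bodyfile record for illustration. Field layout:
# MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime
printf '0|C:/Users/harrell/Documents/invoice.docx|1234-128-1|r/rrwxrwxrwx|0|0|12288|1332633600|1332637200|1332637200|1332630000\n' > bodyfile

# Pull out just the path ($2), size ($7), and modification time ($9)
awk -F'|' '{ printf "%s size=%s mtime=%s\n", $2, $7, $9 }' bodyfile
# -> C:/Users/harrell/Documents/invoice.docx size=12288 mtime=1332637200
```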

Add Filesystem Metadata to Timeline


The timeline generated by Log2timeline is in csv format while the sleuthkit bodyfile is in mactime format. These two file formats are not compatible so I opted to convert the mactime bodyfile into the Log2timeline csv format. I did the conversion with the following command:

        - log2timeline.pl -f mactime -w timeline.csv -Z UTC bodyfile

Reviewing the Timeline


The timeline I created included the following information: filesystem metadata, Office documents’ metadata, IE browser history, prefetch files, link files, and recycle bin information. I manually included the information inside Microsoft Word’s jump list since I didn’t have time to put together a script to automate it. The timeline provided more context about the fraudulent document I located, as can be seen in the summary below.

1. Microsoft Word was opened to create the Invoice-#233-staples-Office_Supplies.docx (Office metadata)

2. BlueBackground_Finance_Charge.dotx Word template was created on the system (filesystem)

3. User account accessed the template (link files)

4. Microsoft Word accessed the template (jump lists)

5. User performed a Google search for staple (web history)

6. User visited Staples.com (web history)

7. User accessed the staples.png located in C:/Drivers/video/images/ (link files)

8. The staples.png image was created in the images folder (filesystem)

9. Microsoft Word accessed the staples.png image (jump lists)

10. User continued accessing numerous web pages on Staples.com

11. Microsoft Word document Invoice-#233-staples-Office_Supplies.docx was created on the system (office metadata and filesystem)

12. User accessed the Invoice-#233-staples-Office_Supplies.docx document (link files and jump lists)


Here are the screenshots showing the activity I summarized above.
Second Look at Prefetch Files

Monday, March 19, 2012 Posted by Corey Harrell 1 comments
The one thing I like about sharing is when someone opens your eyes to additional information in an artifact you frequently encounter. Harlan has been posting about prefetch files and the information he shared changed how I look at this artifact. Harlan’s first post, Prefetch Analysis, Revisited, discussed how the artifact contains strings, such as file names and full paths to modules, that were either used or accessed by the executable. He also discussed how the data not only provides information about what occurred on the system but can also be used in data reduction techniques. One data reduction technique referenced was searching the file paths for words such as temp. Harlan’s second post was Prefetch Analysis, Revisited...Again..., in which he expanded on what information is inside prefetch files. He broke down what was inside a prefetch file from one of my test systems where I ran Metasploit against a Java vulnerability. His analysis provided more context for what I found on the system and validated some of my findings by showing Java did in fact access the logs I identified. Needless to say, his two posts opened my eyes to additional information inside prefetch files. It was information I didn’t see the first time through, but now I’m taking a second look to see what I find and to test how one of Harlan's data reduction techniques would have made things easier for me.

Validating Findings

I did a lot of posts about Java exploit artifacts but Harlan did an outstanding job breaking down what was inside one of those Java prefetch files. I still have images from other exploit artifact testing so I took a look at prefetch files from an Adobe exploit and Windows Help Center exploit. The Internet Explorer prefetch files in both images didn’t contain any references to the attack artifacts but the exploited applications’ prefetch files did.

The CVE-2010-2883 (PDF CoolType) vulnerability is present in cooltype.dll, affecting certain Adobe Reader and Acrobat versions. My previous analysis identified the following: the system had a vulnerable Adobe Reader version, a PDF exploit appeared on the system, the PDF exploit was accessed, and Adobe Reader executed. The strings in the ACRORD32.EXE-3A1F13AE.pf prefetch file helped to validate the attack because they show that Adobe Reader did in fact access cooltype.dll, as shown below.

\DEVICE\HARDDISKVOLUME1\PROGRAM FILES\ADOBE\READER 9.0\READER\COOLTYPE.DLL

The prefetch file from the Windows Help Center URL Validation vulnerability system showed something similar to the cooltype.dll exploit. The Seclists Full disclosure author mentioned that Windows Media Player could be used in an attack against the Help Center vulnerability. The strings in the HELPCTR.EXE-3862B6F5.pf prefetch file showed the application did access a Windows Media Player folder during the exploit.

\DEVICE\HARDDISKVOLUME1\DOCUMENTS AND SETTINGS\ADMINISTRATOR\LOCAL SETTINGS\APPLICATION DATA\MICROSOFT\MEDIA PLAYER\

Finding Malware Faster

Prefetch files provided more information about the exploit artifacts left on a system. By itself this is valuable enough, but another point Harlan mentioned was using the strings inside prefetch files for data reduction. One data reduction technique is to filter on file paths. To demonstrate the technique and how effective it is at locating malware, I ran strings across the prefetch folder in the image from the post Examining IRS Notification Letter SPAM. (Note: strings is not the best tool to analyze prefetch files and I’m only using it to illustrate how data is reduced.) I first ran the following command, which resulted in 7,905 lines.

strings.exe -o irs-spam-email\prefetch\*.pf

I wanted to reduce the data by only showing the lines containing the word temp to see if anything launched from a temp folder. To accomplish this I ran grep against the strings output, which reduced my data to 84 lines (the grep -w switch matches on whole words and -i ignores case).

strings.exe -o irs-spam-email\prefetch\*.pf | grep -w -i temp

The number of lines went from 7,905 down to 84 which made it fairly easy for me to spot the following interesting lines.

\DEVICE\HARDDISKVOLUME1\DOCUME~1\ADMINI~1\LOCALS~1\TEMP\TEMPORARY DIRECTORY 1 FOR IRS%20DOCUMENT[1].ZIP\IRS DOCUMENT.EXE

\DEVICE\HARDDISKVOLUME1\DOCUME~1\ADMINI~1\LOCALS~1\TEMP\PUSK3.EXE

Using one filtering technique enabled me to quickly spot interesting executables, in addition to possibly finding the initial infection vector (a malicious zip file). This information was obtained by running only one command against the files inside a prefetch folder. In hindsight, my original analysis of prefetch files was fairly limited (executable paths, run counts, and filenames), but going forward I'll look at this artifact and the information it contains in a different light.
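To show the reduction idea end to end, here is a self-contained sketch. The strings_out.txt lines below are fabricated stand-ins for real strings output across a prefetch folder, and the grep switches are the same whole-word, case-insensitive ones used above:

```shell
# Fabricated lines standing in for strings.exe output across *.pf files
cat > strings_out.txt <<'EOF'
\DEVICE\HARDDISKVOLUME1\WINDOWS\SYSTEM32\NTDLL.DLL
\DEVICE\HARDDISKVOLUME1\DOCUME~1\ADMINI~1\LOCALS~1\TEMP\PUSK3.EXE
\DEVICE\HARDDISKVOLUME1\PROGRAM FILES\ADOBE\READER 9.0\READER\COOLTYPE.DLL
EOF

# -w matches temp as a whole word, -i ignores case; only the
# suspicious TEMP-folder executable survives the filter
grep -w -i temp strings_out.txt
```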