Tuesday, February 12, 2013

Automating OpenIOC with Splunk

This year a colleague of mine, @trakzon, and I spoke at MIRCon about integrating MIR, OpenIOC, and other network data sources, such as Bro and Palo Alto Networks firewall logs, with Splunk. We never got around to publishing the code until now. Here is the code to automate searching OpenIOC-format IOCs through Splunk's API:
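As a rough sketch of the approach (my illustration here, not the released tool itself): pull the Content values out of an OpenIOC file and OR them into a single Splunk search. The XML namespace is the real OpenIOC 1.0 schema; the function names and the `index=*` choice are just for illustration.

```python
# Sketch: extract IOC terms from an OpenIOC file and build one Splunk search.
# The namespace is the real OpenIOC 1.0 schema; everything else is illustrative.
import xml.etree.ElementTree as ET

OPENIOC_NS = "{http://schemas.mandiant.com/2010/ioc}"

def terms_from_ioc(xml_text):
    """Return every IndicatorItem Content value (MD5s, IPs, domains) in the IOC."""
    root = ET.fromstring(xml_text)
    return [c.text.strip() for c in root.iter(OPENIOC_NS + "Content") if c.text]

def splunk_query(terms, index="*"):
    """OR the extracted terms into one Splunk search string."""
    ors = " OR ".join('"%s"' % t for t in terms)
    return "search index=%s %s" % (index, ors)
```

The resulting search string would then be submitted to Splunk's REST search endpoint (`/services/search/jobs`) with your credentials.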

As always we welcome feedback and features. 

Tuesday, February 5, 2013

Auto-Generating OpenIOCs

Wow, it has been two years since my last post, and this post will not be mind-blowing. For the past couple of years I have been busy building an incident response team and SOC, so I have not had a lot of time to write much worth posting. I finally got some free time this last week and created a quick and dirty script to create OpenIOC-format IOCs from unstructured data. This was a pain point in my team's day-to-day operations, since we use OpenIOC-format IOCs for just about every part of our automation to look for known indicators, be it in Splunk or MIR. The script is not complicated but saves a ton of time when you have to import a long list of IOCs (MD5s, IPs, or domain names). To execute the script all you need is a basic install of Python, no other libraries.

I'll walk through an example of its use below. The first step is to find a source of IOCs. In a perfect world all IOCs would already be in a structured format, but most of the time they are not. In this example I will be creating IOCs from a FireEye blog post. If you want to play along, here is the site: http://blog.fireeye.com/research/2013/02/operation-beebus.html Just copy the contents of the blog post to a file. Once you have the contents copied over, execute the script as below, where "blobofIOCs" is the file with the copied content:
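The core of the approach is simple regex extraction over the pasted text. The sketch below is my own illustration of that technique, not the actual ioc_creator.py code; the patterns (especially the deliberately narrow TLD list) are simplified.

```python
# Illustrative sketch of regex-based IOC extraction from unstructured text.
# This is not the actual ioc_creator.py code, just the general technique.
import re

MD5_RE = re.compile(r"\b[a-fA-F0-9]{32}\b")
IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
# Narrow TLD list to keep false positives down in this sketch.
DOMAIN_RE = re.compile(
    r"\b[a-z0-9-]+(?:\.[a-z0-9-]+)*\.(?:com|net|org|info|biz)\b", re.I)

def extract_iocs(blob):
    """Return candidate MD5s, IPs, and domains found in a blob of text."""
    return {
        "md5": sorted(set(MD5_RE.findall(blob))),
        "ip": sorted(set(IP_RE.findall(blob))),
        "domain": sorted(set(DOMAIN_RE.findall(blob))),
    }
```

Each candidate list would then be wrapped in OpenIOC IndicatorItem elements and written out as an .ioc file.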

This will create an OpenIOC format IOC in the same directory:

From here you aren't done. The script does most of the work, but you still have to edit some of the properties and check for bad IOCs. You can load the IOC up in Mandiant's IOC Editor, located here:

When parsed it will look similar to the figure below:

"ioc_creator.py" will generate domain, IP, MD5 hash, registry, and file path IOCs. Be sure to change the name, add groups, and change the description, as it defaults to "BulkImport". The IOC will still load into MIR if you don't change this, but you want this data for context if it ever hits. The IOC also contains some bad auto-generated entries: the File Full Path contains "\Windows" term is a bad IOC, as is the single RegistryPath contains term. Delete these bad IOCs. "ioc_creator.py" defaults the IOC term logic to the following for each IOC term type:
  • FileItem/FullPath – contains
  • RegistryPath – contains
  • Network/DNS – contains
  • PortItem/remoteIP – is
  • FileItem/Md5sum – is
The auto-generation will also catch some terms with very high false-positive rates, so make sure to use IOC lint and double-check what has been auto-generated for you. In the end, my IOC was cleaned up to look like the following figure:
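To illustrate those defaults, here is a minimal sketch (again my illustration, not the script's actual code) that maps each term type from the list above to its default condition and emits an OpenIOC-style IndicatorItem; the Context attributes are simplified.

```python
# Sketch: map IOC term types to default conditions and emit an OpenIOC-style
# IndicatorItem. Simplified illustration, not the actual ioc_creator.py code.
import xml.etree.ElementTree as ET

DEFAULT_CONDITION = {
    "FileItem/FullPath": "contains",
    "RegistryPath": "contains",
    "Network/DNS": "contains",
    "PortItem/remoteIP": "is",
    "FileItem/Md5sum": "is",
}

def indicator_item(search_term, value):
    """Build one IndicatorItem using the default condition for its term type."""
    item = ET.Element("IndicatorItem",
                      condition=DEFAULT_CONDITION.get(search_term, "is"))
    ET.SubElement(item, "Context", search=search_term, type="mir")
    content = ET.SubElement(item, "Content", type="string")
    content.text = value
    return item
```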

Please let me know of any bugs or features needed. The script can be found here:

Monday, February 14, 2011

Good Forensics, RE, Pen Testing, Network RE Training at Hole in the Wall Prices

If your budget for training is tight this year, you might want to check out a training event I will be helping out with called Tracer FIRE III. The link for the site is below:


I will be co-teaching the forensics class, and the other instructors are experts in their areas as well. The class I will be teaching is focused on live intrusion forensics, so you can leave your write blockers at home.

Monday, October 18, 2010

Forensic Crash Dump Analysis

I attended and spoke at Mandiant's MIRCon last week. It was a really good conference, not even counting that it was free. I have uploaded my slides from my talk, and they can be downloaded here:

Forensic Crash Dump Analysis

I probably skipped over a lot of the details and did not leave people enough time to write down all the good registry and WinDbg tidbits. I am looking forward to the conference next year. I am working on getting the MIR scripts and shell scripts released and will post them on this blog when they are available.

Monday, August 16, 2010

Good Malware Blogs and Job Posting

I just got done with my morning blog-roll reading and wanted to link to a couple of good entries. The first is by Zynamics, talking about creating better malware signatures:


It specifically talks about creating signatures with their product, VxClass. If you have not had a chance to use or see this product, I suggest you do. The second blog entry worth re-posting is one from Nick Harbour at Mandiant. His post talks about finding command-and-control functions in malware, specifically focusing on the COM point of view. Here is a link to the post:


Finally, I would like to advertise a couple of positions that are open at my current workplace. My team is looking for forensicators, incident responders, red teamers, and malware analysts. If you are interested in the job, apply at the link below:


Tuesday, August 10, 2010

$MFT Parsing EnScript

There is a considerable amount of forensic goodness in the $MFT on NTFS-partitioned disks. What is a $MFT? Well, the $MFT is the master file table on NTFS partitions: a kind of database that keeps track of every file on the partition, including its location and metadata about the file. I am not going to delve into the depths of the NTFS format because it has already been explained in numerous books, like File System Forensic Analysis by Brian Carrier. What I am going to do is quickly summarize the "goodness" available in the $MFT and how you can extract this data with an EnScript I have authored.

The $MFT contains an entry for every file and directory on a partition, including itself. Important metadata within a $MFT record include the name of the file, the inode number, the standard information attribute, the filename attributes, and the data attribute. The size of a $MFT record is 1024 bytes. Below is an example of a $MFT record:

A $MFT record begins with the ASCII signature "FILE" (commonly seen as "FILE0" in a hex editor). A $MFT record can contain the contents of a file if the file is small enough to fit into the space allotted for the data attribute (a "resident" file). Below is a $MFT record broken down into what I think are its important parts.
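As a quick illustration of that layout, here is a minimal Python sketch (not the EnScript) that checks a record buffer. The offsets follow the standard NTFS FILE record header: the signature sits at offset 0, and a two-byte little-endian value at offset 0x14 gives the offset of the first attribute.

```python
# Sketch: sanity-check a 1024-byte $MFT record buffer. Offsets follow the
# standard NTFS FILE record header layout.
import struct

MFT_RECORD_SIZE = 1024

def is_mft_record(buf):
    """A valid record starts with the ASCII signature 'FILE'."""
    return len(buf) >= MFT_RECORD_SIZE and buf[:4] == b"FILE"

def first_attribute_offset(buf):
    """Two-byte little-endian offset to the first attribute, stored at 0x14."""
    return struct.unpack_from("<H", buf, 0x14)[0]
```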

The standard information attribute (SIA) contains the file times most people are used to seeing on the file system, such as created, last written, last accessed, and last modified. These times can easily be changed by an attacker, which used to be the "worst" forensic problem ever. Not anymore, because now you know about the filename attribute (FNA, i.e. $FILE_NAME). The FNA stores dates associated with the file's name and parent directory, and these dates cannot be altered using the Windows API calls that can alter the SIA. The metadata kept in the filename attribute consists of the filename creation date, filename modified date, filename last written, and filename last accessed. Comparing the SIA to the FNA can detect the dreaded timestomping, but be aware of how often SIA/FNA mismatches occur on non-compromised or legitimately altered systems, so stick to time windows when doing this type of analysis. To find all these times you are going to have to parse them by hand. Wait, no you don't, there's a script for that below:


The EnScript I created, inspired by Keith Gould, will parse the important information mentioned above for you and provide a tab-separated text file you can open in Excel or parse with your favorite awk command. Don't worry, I am not forgetting about the $MFT slack section mentioned above in the important parts; I will go over it in the next post.
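For readers without EnCase, the same SIA-versus-FNA comparison is easy to sketch in Python (my illustration, not the EnScript): convert the 64-bit FILETIME values and flag entries whose $STANDARD_INFORMATION creation time predates the $FILE_NAME creation time.

```python
# Sketch: FILETIME conversion plus a naive SIA-vs-FNA timestomp check.
# Illustrative only; real analysis should use time windows as noted above.
from datetime import datetime, timedelta, timezone

WINDOWS_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_dt(ft):
    """Convert a 64-bit FILETIME (100 ns ticks since 1601-01-01 UTC)."""
    return WINDOWS_EPOCH + timedelta(microseconds=ft // 10)

def looks_timestomped(si_created_ft, fn_created_ft):
    """Flag records whose SIA creation time predates the FNA creation time."""
    return filetime_to_dt(si_created_ft) < filetime_to_dt(fn_created_ft)
```

A hit here is only a lead, not proof; as noted above, correlate against a time window before calling it timestomping.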

Wednesday, June 30, 2010

Page Files

I have not posted in a long time... seems to be a bit of a pattern, but I should be posting more in the next couple of months with some new EnScripts. Before those posts, though, I wanted to point people to a great blog post on why acquiring page files at the same time as memory is a more complex problem than you would think.