Super Timeline Using ELK Stack


The ELK stack is a collection of three components: Elasticsearch, Logstash, and Kibana.

  • Logstash – Responsible for processing incoming data. It takes input from different sources, applies transformations, and stores the results in Elasticsearch or other outputs.
  • Elasticsearch – A NoSQL datastore and search engine based on Apache Lucene.
  • Kibana – A web-based interface used to search and visualize logs.

Together, these three form the ELK stack, which is now widely used in threat hunting and big-data security analytics for log analytics and visualization. The ELK stack is an open-source alternative to Splunk.

ELK can also be used to perform analytics on, and build a timeline of, a forensic image.

For this demo, I am using an EnCase (E01) image of a Windows XP drive taken from ForensicKB.

 

MOUNT the Image

In FTK Imager,

  • Select Image Mounting.
  • Select the Image file.
  • Change Mounting method to Block Device / Read Only.
  • Choose the drive letter.
  • Hit Mount.

Here I have mounted the WinXP2.E01 image as drive E:.

 

Convert the IMAGE into a PLASO file

Plaso is the Python-based backend for the log2timeline tool.

Plaso allows us to extract event information from the files in the image and create a Plaso-based super-timeline file.

This file can then be imported into ELK to perform analytics.

You can download plaso 1.5.1 from here.

Log2Timeline will then parse the entire image and create a file named XP.plaso.
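Assuming plaso 1.5.x is installed and its tools are on the PATH, the parse step might look like the sketch below; the output filename XP.plaso and the drive letter E: come from this demo, and `log2timeline.py -h` shows the full set of options.

```shell
# Sketch: parse the mounted evidence drive into a plaso storage file.
# E:\ is the drive letter assigned by FTK Imager above.
log2timeline.py XP.plaso E:\
```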

Detailed help is available on its GitHub page.


Parse the PLASO file into ELK

I transferred the file to my ELK VM.

Before I run psort, I need to ensure that Python is installed along with the pyelasticsearch library.
We can install the library using pip.
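On the ELK VM, the dependency install is a one-liner with pip; pyelasticsearch is the client library that plaso 1.5.x's Elasticsearch output used, and newer plaso versions may need a different package, so check the plaso documentation for your release.

```shell
# Install the Elasticsearch client library used by psort's elastic output
pip install pyelasticsearch
```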

psort is a tool that lets us post-process Plaso files and perform sorting and filtering on them.
We can see detailed help by using -h or --help.

To see a list of supported output modules, use the -o list parameter.
To specify the timezone, use the -z TIMEZONE parameter.
To view the analysis plugins, use the --analysis list switch.
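Putting those switches together, exporting the timeline into Elasticsearch might look like the sketch below. The index name xp_timeline is a placeholder of my own choosing, and the exact elastic output flags can vary between plaso releases, so verify them with `psort.py -h` first.

```shell
# List the available output modules and analysis plugins
psort.py -o list
psort.py --analysis list

# Sketch: export the timeline to a local Elasticsearch instance,
# normalizing timestamps to UTC. Flag names may differ by plaso version.
psort.py -z UTC -o elastic --index_name xp_timeline XP.plaso
```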

 

Using KIBANA for Analytics

Load the Kibana web interface (localhost:5601) and create an index pattern matching the index_name specified above. Map the time-field name to datetime.
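Before creating the index pattern, it can help to confirm that psort actually wrote the index; assuming Elasticsearch is listening on its default port 9200, a quick check is:

```shell
# List all Elasticsearch indices; the index_name passed to psort
# should appear in the output
curl -s 'http://localhost:9200/_cat/indices?v'
```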

If we expand an event in Kibana, we can see different fields such as:

  • file_entry_type
  • filename
  • is_allocated
  • sha256_hash

On filtering for reset5, we can see there are some .exe and .dat files.

The SHA256 value can be submitted to VirusTotal.
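For scripted lookups, the hash can also be checked against VirusTotal's public v2 file report API; in this sketch, VT_API_KEY and HASH are placeholders for your own API key and the sha256_hash value taken from the Kibana event.

```shell
# Hypothetical placeholders: set these to your own values first
VT_API_KEY="your-api-key"
HASH="sha256-from-kibana"

# Query the VirusTotal v2 file report endpoint for the hash
curl -s "https://www.virustotal.com/vtapi/v2/file/report?apikey=${VT_API_KEY}&resource=${HASH}"
```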

 

Command Line summary
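As a recap, the end-to-end workflow on the command line was roughly as follows; the drive letter, index name, and output filename are the values used in this demo and should be adjusted to your environment.

```shell
# 1. Parse the mounted image into a plaso storage file (on the analysis box)
log2timeline.py XP.plaso E:\

# 2. On the ELK VM, install the Elasticsearch client library for psort
pip install pyelasticsearch

# 3. Export the sorted timeline into Elasticsearch
psort.py -z UTC -o elastic --index_name xp_timeline XP.plaso
```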


About the author

Lionel Faleiro

Photographer | Ex-Univ Professor | SysAdmin | Trainer, Security Analyst & DFIR Enthusiast |
