Tuesday, February 16, 2016

PowerStash: Accelerating the Hunt with Elastic

Ever since I was a PowerShell Padawan, I've been told that proper output comes in the form of an object. But what do you do with those objects? You could write them to disk as comma-separated values with the built-in Export-Csv cmdlet, or better yet as XML via Export-Clixml. However, when hunting in an enterprise network and collecting potentially millions of forensic artifacts, the scalability of CSV and XML diminishes quickly. What you really need, Chancho, is a pair of stretchy pants.

You've probably heard of the ELK stack: Elastic (formerly Elasticsearch), Logstash, and Kibana. Elastic is the NoSQL search engine, Logstash handles data insertion, and Kibana provides the web-based UI. As the title of this post suggests, I've taken a PowerShell approach to replacing Logstash's functionality. While Logstash is great, it does have limitations. For example, to collect all of the event logs from a Windows domain you'd need to install a log forwarder on each machine to ship its logs to Logstash. Thankfully, PowerShell's built-in cmdlets for retrieving data, coupled with Elastic's open-source nature, have made the marriage of these two capabilities fairly straightforward. Not only that, but with PowerShell we can get so much more than event logs. Let's get started.

The first, and most important, thing is to configure an Elastic instance capable of handling the amount of data we're going to throw at it. There are plenty of write-ups on how to tune Elastic, so I won't write one here; I took my pointers directly from Brad Lhotsky's outline for a write-heavy configuration. Next we need to write some code... wait, I already did that part (or at least started it) for you. Realistically, I've written a light binding for Elasticsearch.Net, but it is more than capable of stashing all the things. Please feel free to contribute to the project and expand the binding I've started.

Elasticsearch.Net provides access to all of Elastic's APIs, primarily in the form of a client that knows how to interact with an Elastic instance. All that really needs to be done is to instantiate a client that matches your Elastic configuration. There are a couple of ways to do this with PowerStash via New-ElasticClient. Let's start simple.
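A minimal sketch of the simple case is below; the -Uri parameter name is my assumption, so check the module's help for New-ElasticClient's actual signature.

    # Simple case: point a client at a single node.
    # NOTE: the -Uri parameter name is an assumption; see Get-Help New-ElasticClient.
    $client = New-ElasticClient -Uri 'http://localhost:9200'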
For a more complex configuration, start by creating a connection pool, which feeds into a connection configuration, and finally into a fully configured client.
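Here's a hedged sketch of that flow; the helper cmdlet and parameter names below (New-ElasticConnectionPool, New-ElasticConnectionConfiguration, -Node, -ConnectionPool, -Configuration) are hypothetical stand-ins for whatever the module actually exposes, shown only to illustrate the pool-to-configuration-to-client chain.

    # Hypothetical cmdlet/parameter names, for illustration only.
    $pool   = New-ElasticConnectionPool -Node 'http://es-node1:9200', 'http://es-node2:9200'
    $config = New-ElasticConnectionConfiguration -ConnectionPool $pool
    $client = New-ElasticClient -Configuration $config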

To index a lot of information quickly, Elastic's Bulk API is paramount; PowerStash uses it exclusively since it can index a single object/document or many at once. To use the Bulk API, objects being indexed into Elastic need to be converted into the proper format using New-BulkIndexRequest. Conversion is a pretty short order, thanks to PowerShell's ConvertTo-Json cmdlet, as illustrated below.
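Conceptually, each object becomes an action/metadata line followed by its source as compact JSON, with the whole body newline-terminated. The snippet below is a rough illustration of that shape, not PowerStash's exact output; the index and type names are placeholders.

    # Sample documents; any objects with an Id property will do.
    $documents = 1..3 | ForEach-Object {
        [pscustomobject]@{ Id = [Guid]::NewGuid().ToString(); Message = "test $_" }
    }

    # Rough illustration of a Bulk API body; 'hunt' and 'ntlogevent' are placeholders.
    $bulkBody = foreach ($doc in $documents) {
        # Action line tells Elastic where the document goes...
        '{"index":{"_index":"hunt","_type":"ntlogevent","_id":"' + $doc.Id + '"}}'
        # ...followed by the document itself as compact JSON.
        $doc | ConvertTo-Json -Compress -Depth 4
    }
    $bulkBody = ($bulkBody -join "`n") + "`n"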

Weaving these pieces together results in a scalable replacement for writing objects to disk with Export-Csv/Clixml... Export-Elastic, which I've written to work similarly to the aforementioned exporters.
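Usage ends up looking a lot like the built-in exporters. The parameter names in this sketch (-Client, -Index) are assumptions on my part.

    # Hedged usage example; parameter names are assumptions.
    Get-WmiObject -Class Win32_NTLogEvent -Filter "LogFile='Security'" |
        Export-Elastic -Client $client -Index 'hunt'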

For objects to be indexed correctly they need to have a few specific properties. Elastic expects specific date/time formats, so if your objects have any [DateTime] properties, they need to be converted. Additionally, every object being indexed needs to have a UUID "Id" property; in Windows, that's a GUID. To let PowerStash handle the indexing, every object also needs a DateCreated property and a TypeName. I've applied these changes to the Win32_NTLogEvent object below.
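A rough sketch of those decorations, assuming WMI collection and not reflecting the project's exact implementation, might look like this:

    # Illustrative only: add the Id, DateCreated, and TypeName properties described above.
    Get-WmiObject -Class Win32_NTLogEvent -Filter "LogFile='System'" | ForEach-Object {
        # Convert the WMI DMTF timestamp to [DateTime], then to an ISO 8601 string Elastic accepts.
        $written = [System.Management.ManagementDateTimeConverter]::ToDateTime($_.TimeWritten)
        $_ |
            Add-Member -NotePropertyName Id          -NotePropertyValue ([Guid]::NewGuid().ToString()) -PassThru |
            Add-Member -NotePropertyName DateCreated -NotePropertyValue $written.ToUniversalTime().ToString('yyyy-MM-ddTHH:mm:ssZ') -PassThru |
            Add-Member -NotePropertyName TypeName    -NotePropertyValue 'Win32_NTLogEvent' -PassThru
    }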

One of the awesome things about Elastic is its type inference, but in testing I found that it's best to create an index template so Elastic knows explicitly what your objects' types are. An index template is a simple JSON representation that needs to be sent to Elastic, as follows.
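One way to push a template is with Invoke-RestMethod against the _template endpoint; the mapping below is a trimmed example of my own, not the project's actual template.

    # Illustrative template; field names and types are examples only.
    $template = @{
        template = 'hunt-*'
        mappings = @{
            ntlogevent = @{
                properties = @{
                    DateCreated = @{ type = 'date' }
                    EventCode   = @{ type = 'long' }
                    Message     = @{ type = 'string' }
                }
            }
        }
    }
    Invoke-RestMethod -Method Put -Uri 'http://localhost:9200/_template/hunt' `
        -Body ($template | ConvertTo-Json -Depth 5) -ContentType 'application/json'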

The only thing left to do is ramp up the data collection for stress-testing. I've gotten pretty good throughput, ~18k objects per minute, by running the collection in a separate runspace and exporting objects to Elastic on the main thread as data is returned. I've provided the functionality to do this via Invoke-Powerstash. I'd love for you to test this for yourself and report back how well it works, or where it fails. Combining PowerStash with a tool like CimSweep has the potential to bring enterprise hunting to a manageable level.
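If you want to kick the tires, a hypothetical invocation might look something like the following; the parameter names are guesses on my part, so check the module's help for the real ones.

    # Hypothetical parameters, for illustration only.
    Invoke-Powerstash -ComputerName (Get-Content .\hosts.txt) -ElasticUri 'http://localhost:9200'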
