Acceleration of Bloodhound - automation of Neo4j queries - TEAL Technology Consulting GmbH

Acceleration of Bloodhound – automation of Neo4j queries

This month we would like to give something back to the community. As you may have read in our November blog, we use numerous open-source tools for our Active Directory assessment. A prominent example is BloodHound, with the Neo4j graph database in the background. BloodHound is a tool that allows an attacker to “scan” Active Directory with plain user rights and then evaluate the collected data offline. It stores all group, user and computer objects, but above all also session data (who is logged in where) and permissions. Special queries can then reveal potential targets and attack paths.

In addition to the visualization of attack paths in BloodHound itself, there are numerous queries that can only be run directly against Neo4j, because the result is not a path but a list. There are already numerous pages on the web with interesting queries, e.g. from @Haus3c or from the BloodHound authors (@wald0 and @CptJesus) themselves. The latter have also written a good article on how to write your own Cypher queries (Cypher is the name of Neo4j’s query language).
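For example, a typical list-style query against the standard BloodHound schema finds Kerberoastable users (the property names below are from BloodHound’s documented node schema):

```cypher
// Result is a list of users, not an attack path,
// so this is best run directly against Neo4j
MATCH (u:User {hasspn:true})
RETURN u.name, u.serviceprincipalnames
```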

In the context of our AD assessment we always execute the same basic set of queries together with customer specific queries and hand the result over to the customer in CSV format as part of the assessment documentation. We have automated exactly this step and publish the script together with this blog article 😊.

The script works as follows:

The script reads one or more Cypher queries and their titles from a JSON input file, submits the queries one by one to the Neo4j API using the stored credentials, and writes each result to a CSV file in the same directory.
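An input file might look something like this (the exact field names are an assumption for illustration — check the published script for the schema it actually expects):

```json
[
  {
    "title": "Kerberoastable users",
    "query": "MATCH (u:User {hasspn:true}) RETURN u.name"
  },
  {
    "title": "Computers with outdated operating systems",
    "query": "MATCH (c:Computer) WHERE c.operatingsystem =~ '.*(2000|2003|2008|XP).*' RETURN c.name, c.operatingsystem"
  }
]
```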

The API URL, input file, username, password and output directory can be passed as parameters.
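To illustrate the flow, here is a minimal sketch in Python against Neo4j’s transactional HTTP endpoint (the published script is written in PowerShell; the endpoint path, JSON field names and file layout below are assumptions for illustration, not the script’s actual code):

```python
import base64
import csv
import json
import urllib.request

def result_to_rows(result):
    """Flatten one transactional-API result into a header row plus data rows."""
    columns = result["columns"]
    rows = [record["row"] for record in result["data"]]
    return [columns] + rows

def run_query(api_url, query, user, password):
    """POST a single Cypher statement with Basic auth and return the first result."""
    payload = json.dumps({"statements": [{"statement": query}]}).encode()
    req = urllib.request.Request(api_url, data=payload,
                                 headers={"Content-Type": "application/json"})
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"][0]

def export(api_url, query_file, user, password, out_dir):
    """Run every query in the input file and write one CSV per query title."""
    with open(query_file) as fh:
        entries = json.load(fh)
    for entry in entries:
        result = run_query(api_url, entry["query"], user, password)
        with open(f"{out_dir}/{entry['title']}.csv", "w", newline="") as out:
            csv.writer(out).writerows(result_to_rows(result))
```

A call would then look like `export("http://localhost:7474/db/data/transaction/commit", "queries.json", "neo4j", "password", ".")` — the default endpoint path shown is the one used by the Neo4j 3.x HTTP API.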

Please keep one thing in mind: if a query runs too long and you abort the PowerShell script, Neo4j will keep processing the query internally. To actually abort it, you have to restart the Neo4j service.

And this is what it looks like:

[Screenshot: script and input file]

[Screenshot: query file]

[Screenshot: command without parameters]

[Screenshot: script and input file]

[Screenshot: the result file]

[Screenshot: the result]

You can find the script in our GitHub repo.

Have fun with it, and feel free to contact us with any questions. If we have piqued your interest in an AD assessment, you are of course also welcome to get in touch 😊.
