
MITRE DASH-board🏄🏽‍♂️

Update the MITRE ATT&CK® dashboard provided by HELK.

TL;DR

This package loads the most recent data from the MITRE ATT&CK® framework and converts it to parsable CSV files.

In a second step, the data is parsed by Logstash and imported into your Elasticsearch instance. Repeat this step every three months to keep your CTI up to date!
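A quarterly refresh could be scheduled, for example, with a cron entry along these lines (a hypothetical sketch with a placeholder path; note that attack_parse prompts for credentials interactively, so a fully unattended run may need extra wiring):

  # hypothetical: fetch fresh ATT&CK data at 03:00 on the 1st of every third month
  0 3 1 */3 * attack_fetch --output_dir /opt/mitre-dash/output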

Additionally, you can upload the metadata of the HELK dashboard to your Kibana endpoint. This step is only necessary once, and only if the dashboard wasn't imported previously.

[Screenshots: helkDashAll and helkDashApt dashboard views]

Usage

Package installation

pip install mitre-dash

Get and process the latest MITRE data

attack_fetch

Optional flags:

  • --matrix_name to specify the matrix. Options are enterprise-attack, mobile-attack and ics-attack.
  • --include_subtechniques to include sub-techniques. This increases the data size roughly tenfold (about 20 GB).
  • --include_detection to include detection methods. This is very verbose, will increase log size, and might break Logstash parsing due to special characters.
  • --include_descriptions to include descriptions of techniques, software, groups, etc. Same as above ⬆️.
  • --output_dir to specify the output directory. Default is ./output.
  • --help or -h to get help.
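For example, to fetch the enterprise matrix with sub-techniques into a custom directory (using only the flags documented above; the directory name is just an example):

  attack_fetch --matrix_name enterprise-attack --include_subtechniques --output_dir ./attack_data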

Import dashboard to Kibana

In total you will import:

  • 1 index pattern
  • 2 dashboards
  • a bunch of custom visualizations

Run:

attack_dash_up

Next, you will be prompted to enter your Kibana host address and credentials. We use the same script as HELK to import the metadata.

⚠️ This step is only necessary once, and only if the dashboard wasn't imported previously!
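If you are unsure whether the dashboard already exists, you can query Kibana's saved objects API before importing (an optional sanity check, not part of this package; the title pattern is a guess):

  # list dashboards whose title starts with "MITRE" (title pattern is an assumption)
  curl -u <user>:<password> "http://<kibana-host>:5601/api/saved_objects/_find?type=dashboard&search_fields=title&search=MITRE*"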

Parse data with Logstash

Now upload the new data to your Elasticsearch instance:

attack_parse

Next, you will be prompted to enter your Elasticsearch host address and credentials.

This will create a 'logstash' folder in your current directory with:

  • logstash.conf to parse the CSV data (a sketch follows below).
  • docker-compose.yml to run Logstash as a containerized service.
  • .env to hold the provided credentials.
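A minimal sketch of what the generated logstash.conf might resemble, assuming a TCP input on port 32173 (the port the nc command below targets); the CSV column names, index name, and environment variable names are hypothetical placeholders, and the file produced by attack_parse may differ:

  # input: receive raw CSV lines over TCP (port matches the nc command below)
  input {
    tcp {
      port => 32173
    }
  }

  # filter: split each line into named fields (column names are hypothetical)
  filter {
    csv {
      separator => ","
      columns => ["technique_id", "technique", "tactic", "platform"]
    }
  }

  # output: index into Elasticsearch, credentials injected from the generated .env
  output {
    elasticsearch {
      hosts => ["${ELASTICSEARCH_HOST}"]
      user => "${ELASTICSEARCH_USER}"
      password => "${ELASTICSEARCH_PASSWORD}"
      index => "mitre-attack"
    }
  }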

Finally, two parallel processes will be started:

  • docker-compose up to run logstash.
  • nc <logstash-host> 32173 -q 11 < output/<attack-matrix>.csv to send the data to logstash.

The second process is repeated for every attack matrix CSV file in the output folder.
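Conceptually, that second process behaves like the following shell loop (a sketch for illustration; the package runs this for you):

  # stream each matrix CSV into the Logstash TCP input, waiting 11s after EOF
  for f in output/*.csv; do
    nc <logstash-host> 32173 -q 11 < "$f"
  done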

⚠️ Prepare for a long ride. Get some popcorn while 13M logs are ingested 🍿
