
Update ATT&CK data for the HELK Kibana dashboard.

Project description

MITRE DASH-board🏄🏽‍♂️

Update the MITRE dashboard provided by HELK.

TLDR

This package loads the most recent data from the MITRE ATT&CK® framework and converts it to parsable CSV files.

In a second step, the data can be parsed by Logstash and imported into your Elasticsearch instance. Repeat this step every three months to keep your CTI up to date!
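
For example, a quarterly refresh boils down to two commands (a minimal sketch; attack_parse prompts for credentials, so run it interactively — see Usage below for the flags):

    attack_fetch --matrix_name enterprise-attack
    attack_parse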

Additionally, you can upload the metadata of the HELK dashboard to your Kibana endpoint. This step is only necessary once, and only if the dashboard wasn't imported previously.

(Screenshots: helkDashAll, helkDashApt)

Usage

Package installation

pip install mitre-dash

Get and process the latest Mitre data

attack_fetch

Optional flags (a combined example follows the list):

  • --matrix_name to specify the matrix. Options are enterprise-attack, mobile-attack and ics-attack.
  • --include_subtechniques to include sub-techniques. This results in a roughly tenfold increase in data size (~20 GB).
  • --include_detection to include detection methods. This is very verbose, will increase log size and might break Logstash parsing due to special characters.
  • --include_descriptions to include descriptions of techniques, software, groups, etc. Same caveats as above ⬆️.
  • --output_dir to specify the output directory. Default is ./output.
  • --help or -h to get help.
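
For example, to fetch the enterprise matrix including sub-techniques into a custom folder (all flags as documented above; ./attack_csv is just an illustrative directory name):

    attack_fetch --matrix_name enterprise-attack --include_subtechniques --output_dir ./attack_csv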

Import dashboard to Kibana

In total you will import:

  • 1 index pattern
  • 2 dashboards
  • a bunch of custom visualizations

Run:

attack_dash_up

Next, you will be prompted to enter your Kibana host address and credentials. The import uses the same script as HELK to load the metadata.

⚠️ This step is only necessary once, and only if the dashboard wasn't imported previously!
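
If you want to verify the import afterwards, Kibana's saved objects API can list the dashboards (a hypothetical check that is not part of this package; substitute your own host, port and credentials):

    curl -s -u <user>:<password> "http://<kibana-host>:5601/api/saved_objects/_find?type=dashboard"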

Parse data with logstash

Now upload the new data to your Elasticsearch instance:

attack_parse

Next, you will be prompted to enter your Elasticsearch host address and credentials.

This will create a 'logstash' folder in your current directory with:

  • logstash.conf to parse the CSV data (a sketch of this file follows the list).
  • docker-compose.yml to run Logstash as a contained service.
  • .env to hold the credentials you provided.
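
The generated logstash.conf will look roughly like this (an illustrative sketch pieced together from the behaviour described on this page, not the exact file the tool writes; the variable names ES_HOST, ES_USER and ES_PASSWORD are assumptions):

    input {
      tcp {
        port => 32173              # matches the port nc sends to below
      }
    }

    filter {
      csv {
        separator => ","           # split each CSV line into fields
      }
    }

    output {
      elasticsearch {
        hosts    => ["${ES_HOST}"]       # host and credentials assumed to come from the generated .env
        user     => "${ES_USER}"
        password => "${ES_PASSWORD}"
      }
    }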

Finally, two parallel processes will be started:

  • docker-compose up to run logstash.
  • nc <logstash-host> 32173 -q 11 < output/<attack-matrix>.csv to send the data to logstash.

The second process is repeated for every ATT&CK matrix CSV file in the output folder.
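
In shell terms, the two steps are roughly equivalent to the following (a sketch of what the tool automates for you; it assumes you run it from the directory containing the logstash and output folders):

    docker-compose -f logstash/docker-compose.yml up -d   # start Logstash in the background
    for f in output/*.csv; do
      nc <logstash-host> 32173 -q 11 < "$f"               # stream each matrix CSV to Logstash
    done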

⚠️ Prepare for a long ride. Get some popcorn while 13M logs are ingested 🍿

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

attack_dashboard-0.11.11.tar.gz (19.5 kB)


Built Distribution

attack_dashboard-0.11.11-py3-none-any.whl (37.1 kB)


File details

Details for the file attack_dashboard-0.11.11.tar.gz.

File metadata

  • Download URL: attack_dashboard-0.11.11.tar.gz
  • Upload date:
  • Size: 19.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.2 CPython/3.11.3 Linux/5.15.90.1-microsoft-standard-WSL2

File hashes

Hashes for attack_dashboard-0.11.11.tar.gz:

  • SHA256: e24fa811d5f5d36bf4cbf6b22c5cb5fabeec24b239a6b10e20ed41c564b30da3
  • MD5: 8af8ed87f20648f7f395511ac15ef744
  • BLAKE2b-256: ee46e5e82e620f9f615d52df066ac82d72937ccf8b648a7020a1d164fdcf24c4

More details on using hashes are available in the pip documentation.

File details

Details for the file attack_dashboard-0.11.11-py3-none-any.whl.

File metadata

  • Download URL: attack_dashboard-0.11.11-py3-none-any.whl
  • Upload date:
  • Size: 37.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.2 CPython/3.11.3 Linux/5.15.90.1-microsoft-standard-WSL2

File hashes

Hashes for attack_dashboard-0.11.11-py3-none-any.whl:

  • SHA256: 6d4c710f9fbec654f6a572bb74227bebc646d3adfb361cadea99c49b3e28fee7
  • MD5: 0cea8ba2490cd9045d714c8b3127778c
  • BLAKE2b-256: f982ac8ec7683926ea9e3b44e1fa267a08a9242d75d3263e08be0c2a84fa4407

More details on using hashes are available in the pip documentation.
