Update ATT&CK data for the HELK Kibana dashboard.
Project description
MITRE DASH-board 🏄🏽‍♂️
Update the MITRE dashboard provided by HELK.
TL;DR
This package loads the most recent data from the MITRE ATT&CK® framework and converts it to parsable CSV files.
In a second step the data can be parsed by Logstash and imported into your Elasticsearch instance. This step should be repeated every three months to keep your CTI up to date!
Additionally, you can upload the metadata of the HELK dashboard to your Kibana endpoint. This step is only necessary once, and only if the dashboard wasn't imported previously.
Usage
Package installation
pip install mitre-dash
Get and process the latest MITRE data
attack_fetch
Optional flags:
- `--matrix_name` to specify the matrix. Options are `enterprise-attack`, `mobile-attack` and `ics-attack`.
- `--include_subtechniques` to include sub-techniques. This will result in a factor 10 increase in data size (20 GB).
- `--include_detection` to include detection methods. This is very verbose, will increase log size and might break Logstash parsing due to special characters.
- `--include_descriptions` to include descriptions of techniques, software, groups, etc. Same as above ⬆️.
- `--output_dir` to specify the output directory. Default is `./output`.
- `--help` or `-h` to get help.
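Under the hood, converting ATT&CK STIX objects into flat CSV rows looks roughly like the following sketch. This is illustrative only, not the package's actual code; the helper name and the CSV columns (`technique_id`, `name`, `tactic`) are assumptions:

```python
import csv
import io

def techniques_to_csv(stix_objects):
    """Flatten STIX attack-pattern objects into CSV text (illustrative sketch)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["technique_id", "name", "tactic"])
    writer.writeheader()
    for obj in stix_objects:
        if obj.get("type") != "attack-pattern":
            continue
        # ATT&CK stores the technique ID (e.g. T1566) in external_references
        ext = next((r for r in obj.get("external_references", [])
                    if r.get("source_name") == "mitre-attack"), {})
        # one row per tactic (kill chain phase) the technique belongs to
        for phase in obj.get("kill_chain_phases", [{}]):
            writer.writerow({
                "technique_id": ext.get("external_id", ""),
                "name": obj.get("name", ""),
                "tactic": phase.get("phase_name", ""),
            })
    return buf.getvalue()
```

A run such as `attack_fetch --matrix_name enterprise-attack` would then write one such CSV per matrix into the output directory.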
Import dashboard to Kibana
In total you will import:
- 1 index pattern
- 2 dashboards
- a bunch of custom visualizations
Run:
attack_dash_up
Next you will be prompted to enter your Kibana host address and credentials. We use the same script as HELK to import the metadata.
⚠️ This step is only necessary once, and only if the dashboard wasn't imported previously!
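For reference, importing saved objects into Kibana goes through its Saved Objects import API (`POST /api/saved_objects/_import`, which requires the `kbn-xsrf` header). The sketch below only builds such a request; it is an assumption about what the import script does, not the actual HELK code:

```python
import base64
import urllib.request

def build_kibana_import_request(kibana_url, username, password, ndjson_bytes):
    """Build (but don't send) a multipart request to Kibana's saved-objects import API."""
    boundary = "dashupload"  # arbitrary multipart boundary
    body = (
        f"--{boundary}\r\n"
        'Content-Disposition: form-data; name="file"; filename="dashboard.ndjson"\r\n'
        "Content-Type: application/ndjson\r\n\r\n"
    ).encode() + ndjson_bytes + f"\r\n--{boundary}--\r\n".encode()
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        f"{kibana_url}/api/saved_objects/_import?overwrite=true",
        data=body,
        headers={
            "kbn-xsrf": "true",  # Kibana rejects API calls without this header
            "Authorization": f"Basic {token}",
            "Content-Type": f"multipart/form-data; boundary={boundary}",
        },
        method="POST",
    )
```

Sending the request with `urllib.request.urlopen` (or `curl`) against your Kibana endpoint would perform the actual import.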
Parse data with logstash
Now upload the new data to your Elasticsearch instance:
attack_parse
Next you will be prompted to enter your Elasticsearch host address and credentials.
This will create a 'logstash' folder in your current directory with:
- `logstash.conf` to parse the CSV data.
- `docker-compose.yml` to run Logstash as a contained service.
- `.env` to supply the provided credentials.
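The generated `logstash.conf` will differ in its details, but a minimal CSV-over-TCP pipeline in Logstash configuration generally looks like the sketch below (the column names and index name are illustrative; port 32173 is the port the parse step sends to):

```
input {
  tcp {
    port => 32173
  }
}
filter {
  csv {
    separator => ","
    columns => ["technique_id", "name", "tactic"]  # illustrative columns
  }
}
output {
  elasticsearch {
    hosts    => ["${ELASTICSEARCH_HOST}"]
    user     => "${ELASTICSEARCH_USER}"
    password => "${ELASTICSEARCH_PASSWORD}"
    index    => "mitre-attack"
  }
}
```

The `${...}` placeholders are filled from the environment, which is what the generated `.env` file is for.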
Finally, two parallel processes will be started:
- `docker-compose up` to run Logstash.
- `nc <logstash-host> 32173 -q 11 < output/<attack-matrix>.csv` to send the data to Logstash.
The second process is repeated for every ATT&CK matrix CSV file in the output folder.
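The `nc` step is just a TCP stream of the CSV file to Logstash's `tcp` input. A minimal sketch of the same thing in Python (the helper name is illustrative):

```python
import socket

def send_csv(host, port, csv_text):
    """Stream CSV text to a TCP listener (e.g. a Logstash tcp input), then close."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(csv_text.encode())
```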
⚠️ Prepare for a long ride. Get some popcorn while 13M logs are ingested 🍿
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file attack_dashboard-0.11.11.tar.gz.
File metadata
- Download URL: attack_dashboard-0.11.11.tar.gz
- Upload date:
- Size: 19.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.4.2 CPython/3.11.3 Linux/5.15.90.1-microsoft-standard-WSL2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `e24fa811d5f5d36bf4cbf6b22c5cb5fabeec24b239a6b10e20ed41c564b30da3` |
| MD5 | `8af8ed87f20648f7f395511ac15ef744` |
| BLAKE2b-256 | `ee46e5e82e620f9f615d52df066ac82d72937ccf8b648a7020a1d164fdcf24c4` |
File details
Details for the file attack_dashboard-0.11.11-py3-none-any.whl.
File metadata
- Download URL: attack_dashboard-0.11.11-py3-none-any.whl
- Upload date:
- Size: 37.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.4.2 CPython/3.11.3 Linux/5.15.90.1-microsoft-standard-WSL2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `6d4c710f9fbec654f6a572bb74227bebc646d3adfb361cadea99c49b3e28fee7` |
| MD5 | `0cea8ba2490cd9045d714c8b3127778c` |
| BLAKE2b-256 | `f982ac8ec7683926ea9e3b44e1fa267a08a9242d75d3263e08be0c2a84fa4407` |