Advanced Drone Flight Log Entity Recognizer

Project description

ADFLER: Advanced Drone Flight Log Entity Recognizer

ADFLER is an open-source tool for performing named entity recognition on drone flight log data in support of forensic investigations. It uses a fine-tuned ALBERT model to identify entities (such as Event and NonEvent) in log messages and constructs a forensic timeline from them.

Features

  • Forensic Timeline Construction: Merges and sorts logs from Android and iOS devices.
  • Entity Recognition: Uses an ALBERT model to highlight key entities in log messages.
  • Forensic Report Generation: Generates a PDF report with statistics and the highlighted timeline.

Installation

Prerequisites

  1. Python 3.7+
  2. wkhtmltopdf: Required for report generation.
    • Windows: Download and install from wkhtmltopdf.org. Default path: C:\Program Files\wkhtmltopdf\bin\wkhtmltopdf.exe.
    • Linux: sudo apt-get install wkhtmltopdf. Default path: /usr/bin/wkhtmltopdf.
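Before running a report, it can help to confirm that wkhtmltopdf is actually reachable. The following is a minimal sketch (the helper name `find_wkhtmltopdf` is ours, not part of ADFLER) that checks the PATH and then the default install locations listed above:

```python
import os
import shutil

def find_wkhtmltopdf():
    """Return a path to the wkhtmltopdf binary, or None if it is not installed."""
    # Check the PATH first, then fall back to the default install locations.
    found = shutil.which("wkhtmltopdf")
    if found:
        return found
    for candidate in (
        r"C:\Program Files\wkhtmltopdf\bin\wkhtmltopdf.exe",  # Windows default
        "/usr/bin/wkhtmltopdf",                               # Linux default
    ):
        if os.path.exists(candidate):
            return candidate
    return None

print(find_wkhtmltopdf())
```

If this prints `None`, install wkhtmltopdf (or note your custom path for the `wkhtml_path` entry in config.json, described below).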

Install ADFLER

You can install ADFLER from PyPI or from source.

From PyPI:

pip install adfler

From Source:

git clone https://github.com/DroneNLP/ADFLER.git
cd ADFLER
pip install .

Model

ADFLER uses a fine-tuned ALBERT model (swardiantara/ADFLER-albert-base-v2). By default, the tool will automatically download and cache the model from Hugging Face on its first run.

If you prefer a locally downloaded model, or are working in an offline environment, point the --model argument at a local directory containing pytorch_model.bin.

Usage

ADFLER provides a Command Line Interface (CLI).

Structure:

adfler [arguments]

Arguments

  • --evidence <path>: Path to the directory containing input flight logs (must have android and/or ios subfolders).
  • --output <path>: Path to the directory where results will be saved.
  • --model <path>: Path to the directory containing the ALBERT model (pytorch_model.bin).
  • --config <path>: Path to a custom config.json file.

Examples

1. Run Pipeline (Recommended):

adfler --evidence "./flight_logs" --output "./results/case_001"

2. Run Pipeline with Custom Model:

adfler --evidence "./flight_logs" --output "./results/case_001" --model "./local_model_dir"

Output Structure

Running adfler will create the following structure in your output directory:

output_dir/
├── raw_list.json             # List of detected log files
├── forensic_timeline.csv     # Merged and sorted timeline of all logs
├── ner_result.json           # Timeline with detected entities (JSON format)
├── statistics.json           # Counts of detected entity types
├── forensic_report_.html     # Intermediate HTML report
├── forensic_report_.pdf      # Final PDF Forensic Report
└── parsed/                   # Intermediate parsed CSVs
    ├── android/
    └── ios/
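The JSON outputs are straightforward to post-process in downstream tooling. The sketch below summarizes entity counts from statistics.json; the exact schema is our assumption (a flat mapping from entity label to count, using the Event/NonEvent labels mentioned above), and the sample file is written on the fly for illustration:

```python
import json
from pathlib import Path

# Hypothetical sample of statistics.json; the label-to-count schema is an
# assumption, not documented ADFLER output.
sample = {"Event": 120, "NonEvent": 800}
Path("statistics.json").write_text(json.dumps(sample))

# Load the statistics and print each label's share of the timeline.
stats = json.loads(Path("statistics.json").read_text())
total = sum(stats.values())
for label, count in sorted(stats.items(), key=lambda kv: -kv[1]):
    print(f"{label}: {count} ({count / total:.1%})")
```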

Configuration

You can optionally use a config.json file instead of passing arguments on the command line.

Example config.json:

{
    "source_evidence": "./flight_logs",
    "output_dir": "./results/test_run",
    "model_dir": "./model",
    "wkhtml_path": {
        "windows": "C:\\Program Files\\wkhtmltopdf\\bin\\wkhtmltopdf.exe",
        "linux": "/usr/bin/wkhtmltopdf"
    },
    "use_cuda": true
}

Run with config:

adfler --config my_config.json
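For scripted or repeated runs, the config file can also be generated programmatically. This is a sketch using only the keys from the example above; the paths are placeholders to adapt to your case:

```python
import json

# Mirror the example config.json above; adjust paths for your environment.
config = {
    "source_evidence": "./flight_logs",
    "output_dir": "./results/test_run",
    "model_dir": "./model",
    "wkhtml_path": {
        "windows": r"C:\Program Files\wkhtmltopdf\bin\wkhtmltopdf.exe",
        "linux": "/usr/bin/wkhtmltopdf",
    },
    "use_cuda": True,
}

with open("my_config.json", "w") as f:
    json.dump(config, f, indent=4)
```

You can then run `adfler --config my_config.json` as shown above.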

License

MIT
