
Scope is an open source cloud forensics tool for AWS. It can rapidly obtain logs and create super timelines for analysis.

Project description

Scope - Cloud Forensics Tool

Scope is an open source tool for collecting and analyzing cloud logs for forensic investigations. Scope currently supports AWS CloudTrail logs with plans to extend to Azure and GCP in the future.

Features

  • AWS CloudTrail Collection: Retrieve logs from S3 buckets or via the Management Events API
  • Normalized Timeline: Convert cloud logs into a standardized timeline format
  • Multiple Export Formats: Export timelines as CSV or JSON
  • Resource Discovery: Identify available CloudTrail trails in your AWS account
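The "Normalized Timeline" feature above boils down to flattening raw CloudTrail events into uniform rows. Scope's actual schema isn't documented here, so the field names below are illustrative only, but the general shape of such a normalization step looks like this:

```python
def normalize_event(event: dict) -> dict:
    """Flatten a raw CloudTrail event into a timeline row.

    Field names are illustrative, not Scope's actual output schema.
    """
    identity = event.get("userIdentity", {})
    return {
        "timestamp": event.get("eventTime", ""),
        "event_name": event.get("eventName", ""),
        "event_source": event.get("eventSource", ""),
        "region": event.get("awsRegion", ""),
        "actor": identity.get("arn") or identity.get("type", "unknown"),
        "source_ip": event.get("sourceIPAddress", ""),
    }

sample = {
    "eventTime": "2023-04-15T12:34:56Z",
    "eventName": "ConsoleLogin",
    "eventSource": "signin.amazonaws.com",
    "awsRegion": "us-east-1",
    "userIdentity": {"type": "IAMUser", "arn": "arn:aws:iam::123456789012:user/alice"},
    "sourceIPAddress": "203.0.113.7",
}
row = normalize_event(sample)
```

Rows in this flat shape serialize directly to either CSV or JSON, which is what makes the multiple export formats cheap to support.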

Installation

Using pip (Recommended)

pip install scope-forensics

From Source

# Clone the repository
git clone https://github.com/scope-forensics/scope.git
cd scope

# Install the package
pip install .

# For development (editable mode)
pip install -e .

Usage

Basic Commands

# Display help information
scope --help

# List available commands
# List the available AWS subcommands
scope aws --help

AWS Authentication

Scope supports multiple authentication methods:

  1. Interactive configuration:

    # Configure AWS credentials interactively
    scope aws configure
    
    # Configure for a specific profile
    scope aws configure --profile my-profile
    
  2. Command-line arguments:

    scope aws --access-key YOUR_ACCESS_KEY --secret-key YOUR_SECRET_KEY --region us-east-1 discover
    
  3. Environment variables:

    # Windows
    set AWS_ACCESS_KEY_ID=your_access_key
    set AWS_SECRET_ACCESS_KEY=your_secret_key
    set AWS_DEFAULT_REGION=us-east-1
    
    # macOS/Linux
    export AWS_ACCESS_KEY_ID=your_access_key
    export AWS_SECRET_ACCESS_KEY=your_secret_key
    export AWS_DEFAULT_REGION=us-east-1
    
  4. AWS credentials file (~/.aws/credentials)

  5. IAM role (if running on an EC2 instance with an IAM role)

Setting Up AWS Permissions

To use Scope effectively, you'll need an AWS user with appropriate permissions. Here's how to create one:

  1. Sign in to the AWS Management Console and open the IAM console.

  2. Create a new policy:

    • Go to "Policies" and click "Create policy"
    • Use the JSON editor and paste the following policy:
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "cloudtrail:LookupEvents",
                    "cloudtrail:DescribeTrails",
                    "s3:GetObject",
                    "s3:ListBucket",
                    "s3:GetBucketLocation"
                ],
                "Resource": "*"
            }
        ]
    }
    
    • Name the policy "ScopeForensicsPolicy" and create it
  3. Create a new user:

    • Go to "Users" and click "Add users"
    • Enter a username (e.g., "scope-forensics")
    • Select "Access key - Programmatic access"
    • Click "Next: Permissions"
    • Select "Attach existing policies directly"
    • Search for and select the "ScopeForensicsPolicy" you created
    • Complete the user creation process
  4. Save the credentials:

    • Download or copy the Access Key ID and Secret Access Key
    • Use these credentials with the scope aws configure command

Note: Consider using more restrictive permissions by limiting the "Resource" section to specific S3 buckets and CloudTrail trails.
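As a sketch of that tighter policy: the CloudTrail read actions generally do not support resource-level restrictions, so they stay on `"*"`, but the S3 statements can be scoped to your specific log bucket (the bucket name below is a placeholder):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "cloudtrail:LookupEvents",
                "cloudtrail:DescribeTrails"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": [
                "arn:aws:s3:::your-cloudtrail-bucket",
                "arn:aws:s3:::your-cloudtrail-bucket/*"
            ]
        }
    ]
}
```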

Discover CloudTrail Trails

To list all available CloudTrail trails in your AWS account:

scope aws discover

This command will display information about each trail, including its name, S3 bucket location, and whether it logs management events.
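This discovery step maps naturally onto the CloudTrail DescribeTrails API. As a minimal sketch (the live boto3 call is omitted and a sample response is hard-coded), condensing such a response into the fields reported above might look like:

```python
def summarize_trails(response: dict) -> list[dict]:
    """Condense a DescribeTrails-style response into per-trail summary rows."""
    return [
        {
            "name": t.get("Name"),
            "s3_bucket": t.get("S3BucketName"),
            "multi_region": t.get("IsMultiRegionTrail", False),
        }
        for t in response.get("trailList", [])
    ]

# Hard-coded sample in the shape boto3's cloudtrail.describe_trails() returns
sample = {
    "trailList": [
        {
            "Name": "management-events",
            "S3BucketName": "acme-cloudtrail-logs",
            "IsMultiRegionTrail": True,
        }
    ]
}
summary = summarize_trails(sample)
```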

Explore S3 Bucket Structure

To explore the structure of an S3 bucket and automatically detect CloudTrail logs:

scope aws explore-bucket --bucket your-cloudtrail-bucket

This command will:

  1. List top-level prefixes in the bucket
  2. Automatically detect potential CloudTrail log paths
  3. Provide a ready-to-use command for collecting logs from the detected paths

Collect Management Events

To collect CloudTrail management events:

scope aws management --days 7 --output-file timeline.csv --format csv

Available parameters:

  • --days: Number of days to look back (default: 7)
  • --output-file: Path to save the timeline (required)
  • --format: Choose between 'csv' or 'json' (default: csv)
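The --days parameter implies a query window ending now. Computing that window is straightforward; a sketch of what the tool presumably does internally before calling the LookupEvents API:

```python
from datetime import datetime, timedelta, timezone

def lookback_window(days: int = 7) -> tuple[datetime, datetime]:
    """Return (start, end) UTC datetimes for a --days style lookback."""
    end = datetime.now(timezone.utc)
    return end - timedelta(days=days), end
```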

Collect from S3

To collect CloudTrail logs stored in an S3 bucket:

scope aws s3 --bucket your-cloudtrail-bucket --output-file timeline.csv

The command will automatically:

  1. Discover the CloudTrail log structure in your bucket
  2. Identify all available regions
  3. Collect logs from all regions for the specified time period

For more control, you can specify additional parameters:

scope aws s3 --bucket your-cloudtrail-bucket --prefix AWSLogs/123456789012/CloudTrail/ --regions us-east-1 us-west-2 --start-date 2023-04-15 --end-date 2023-04-22 --output-dir ./raw_logs --output-file timeline.csv --format json

Available parameters:

  • --bucket: S3 bucket containing CloudTrail logs (required)
  • --prefix: S3 prefix to filter logs (optional)
  • --regions: Specific regions to collect from (space-separated list)
  • --start-date: Start date in YYYY-MM-DD format (default: 7 days ago)
  • --end-date: End date in YYYY-MM-DD format (default: today)
  • --output-dir: Directory to save raw logs (optional)
  • --output-file: Path to save the timeline (required)
  • --format: Choose between 'csv' or 'json' (default: csv)
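Because CloudTrail partitions its S3 keys by region and date (`<prefix>/<region>/YYYY/MM/DD/`), the region and date-range parameters above translate into a set of day-level prefixes to list. A sketch of that enumeration:

```python
from datetime import date, timedelta

def daily_prefixes(base: str, regions: list[str], start: date, end: date) -> list[str]:
    """Enumerate CloudTrail day prefixes <base><region>/YYYY/MM/DD/ over a date range."""
    out = []
    day = start
    while day <= end:
        for region in regions:
            out.append(f"{base}{region}/{day:%Y/%m/%d}/")
        day += timedelta(days=1)
    return out
```

Listing only these narrow prefixes is much cheaper than scanning the whole bucket, which is why supplying --regions and a date range speeds up collection.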

Collect from Local Files

To process CloudTrail logs that have already been downloaded to your local machine:

scope aws local --directory /path/to/logs --output-file timeline.csv

For recursive processing of all subdirectories:

scope aws local --directory /path/to/logs --recursive --output-file timeline.csv --format json

Note for Windows users: When specifying file paths, use one of these formats:

  • Forward slashes: C:/Users/username/Desktop/CloudTrail
  • Escaped backslashes: C:\\Users\\username\\Desktop\\CloudTrail
  • Quoted paths: "C:\Users\username\Desktop\CloudTrail"
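The three notations above all name the same directory once parsed. Python's pathlib, which a tool like this would typically use for path handling, treats them identically:

```python
from pathlib import PureWindowsPath

# Forward slashes and (escaped) backslashes parse to the same Windows path
a = PureWindowsPath("C:/Users/username/Desktop/CloudTrail")
b = PureWindowsPath("C:\\Users\\username\\Desktop\\CloudTrail")
assert a == b
```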

Available parameters:

  • --directory: Directory containing CloudTrail logs (required)
  • --recursive: Process subdirectories recursively
  • --output-file: Path to save the timeline (required)
  • --format: Choose between 'csv' or 'json' (default: csv)

This command will:

  1. Find all CloudTrail log files (.json or .json.gz) in the specified directory
  2. Parse and normalize the events
  3. Create a standardized timeline in the specified format
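The three steps above can be sketched as follows. This is not Scope's implementation, just a minimal version of the same pipeline: find `.json`/`.json.gz` files, pull out each file's `Records` array (CloudTrail's delivery format), and sort by `eventTime`:

```python
import gzip
import json
from pathlib import Path

def load_events(directory: str, recursive: bool = False) -> list[dict]:
    """Parse CloudTrail log files (.json / .json.gz) and return events sorted by eventTime."""
    root = Path(directory)
    events = []
    for path in root.glob("**/*" if recursive else "*"):
        if not path.is_file():
            continue
        if path.name.endswith(".json.gz"):
            with gzip.open(path, "rt", encoding="utf-8") as f:
                events.extend(json.load(f).get("Records", []))
        elif path.suffix == ".json":
            with path.open(encoding="utf-8") as f:
                events.extend(json.load(f).get("Records", []))
    # ISO 8601 timestamps sort correctly as strings
    return sorted(events, key=lambda e: e.get("eventTime", ""))
```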

Exporting Timelines

By default, Scope exports timelines to the specified output file. You can choose between CSV and JSON formats.
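Serializing the flat timeline rows in either format is simple with the standard library; a sketch mirroring the --format choice (again illustrative, not Scope's actual code):

```python
import csv
import io
import json

def export_timeline(rows: list[dict], fmt: str = "csv") -> str:
    """Serialize timeline rows as CSV or JSON, mirroring the --format option."""
    if fmt == "json":
        return json.dumps(rows, indent=2)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```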
