Modern network discovery and fact collection toolkit with secure credential management and structured logging

Simple Facts - Network Discovery & Fact Collection Toolkit

Automated network inventory toolkit that syncs your actual network data to NetBox or Nautobot IPAM in minutes. Discover devices, collect facts, and eliminate manual documentation drift. Built on Netmiko, TextFSM, PyNetBox, and PyNautobot, with Prefect for workflow orchestration.

Features

  • Network Discovery: Automatic device discovery with SSH port scanning
  • Device Identification: Auto-detect device types using Netmiko SSHDetect
  • Fact Collection: Collect structured device information using TextFSM
  • NetBox Integration: Complete IPAM/DCIM synchronization with VRF support
  • Nautobot Integration: 1:1 equivalent of the NetBox sync via the official pynautobot SDK (Nautobot 2.x/3.x: locations, namespaces, statuses, IP↔Interface assignments)
  • Prefect Workflows: Modern workflow orchestration with automated pipelines and a web UI
  • Secure Credentials: OS-native keyring storage with .env file support
  • High Performance: Parallel processing with configurable workers
  • Structured Logging: Professional logging with Loguru
  • SSH Tunnel Support: Test mode for containerlab and remote environments
  • Flexible Output: Custom output directories with --output-dir option
  • Code Quality: Ruff linting and formatting for clean, maintainable code
  • CI/CD Pipeline: GitLab CI/CD with lint, test, security, and build stages
  • Modern Architecture: Built with uv, Loguru, and industry standards

Architecture

Code Building Blocks

sfacts/
├── core/
│   ├── scanner.py        # Network discovery & port scanning
│   ├── collector.py      # Fact collection & TextFSM parsing
│   ├── netbox/           # NetBox sync modules
│   ├── nautobot/         # Nautobot sync modules (pynautobot)
│   └── utils/            # Auth, logging, helpers
├── tasks/
│   ├── collect.py        # Fact collection task
│   ├── netbox.py         # NetBox sync task
│   └── nautobot.py       # Nautobot sync task
├── flows/
│   └── modular_flows.py  # Prefect flow definitions
└── cli/
    └── main.py           # CLI entry point

Prefect Orchestration

  • serve.py        - Deploy Prefect flows
  • deployments.py  - Flow deployment configuration
  • Web UI at localhost:4200 for workflow management

CLI Commands

Command                  Purpose                              Example
sfacts discover          Network device discovery             sfacts discover --subnet 192.168.1.0/24
sfacts facts             Enhanced fact collection             sfacts facts --input discovery_results.json
sfacts export            Export to Nornir inventory           sfacts export --output-dir inventory
sfacts netbox sync       Sync facts to NetBox IPAM/DCIM       sfacts netbox sync --site-name "Production"
sfacts netbox test       Test NetBox API connection           sfacts netbox test
sfacts netbox status     Show NetBox instance statistics      sfacts netbox status
sfacts netbox purge      Delete NetBox data (DESTRUCTIVE)     sfacts netbox purge --dry-run
sfacts netbox validate   Validate facts for NetBox sync       sfacts netbox validate
sfacts nautobot sync     Sync facts to Nautobot IPAM/DCIM     sfacts nautobot sync --site-name "Production"
sfacts nautobot test     Test Nautobot API connection         sfacts nautobot test
sfacts nautobot status   Show Nautobot instance statistics    sfacts nautobot status
sfacts nautobot purge    Delete Nautobot data (DESTRUCTIVE)   sfacts nautobot purge --dry-run

Discovery Modes

  • Subnet Discovery: Scan network ranges for SSH-enabled devices
  • SSH Tunnel Mode: Connect through SSH tunnels for testing (containerlab, etc.)

Installation

This project uses uv for fast Python package management. The project requires Python 3.9 or higher (pynautobot and Prefect both require 3.9+).

Install uv (if not already installed)

curl -LsSf https://astral.sh/uv/install.sh | sh

Install dependencies

uv sync

Alternative: Using pip (legacy)

pip install -r requirements.txt

Quick Start

1. Install Dependencies

uv sync

2. Set up credentials (choose one method):

Option A: Environment Variables

export NETOPS_USERNAME=your_username
export NETOPS_PASSWORD=your_password

Option B: Secure Keyring Storage

# Store credentials securely in OS keyring
uv run python -c "
from sfacts.utils.auth import CredentialManager
cm = CredentialManager()
cm.store_credentials('your_username', 'your_password')
"

3. Configure NetBox Integration (Optional)

Create a .env file for NetBox integration:

# Copy template and configure
cp .env.template .env

# Edit .env with your NetBox details
NETBOX_URL=https://your-netbox-instance.com
NETBOX_TOKEN=your-api-token-here
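Under the hood, values like these are read with python-dotenv. As a rough stdlib-only illustration of what loading a .env file amounts to (the real loader handles quoting and more edge cases):

```python
# Stdlib-only sketch of .env loading (sfacts itself uses python-dotenv).
# Lines are KEY=VALUE; blanks and comments are skipped; existing env wins.
import os


def load_env_file(path: str) -> None:
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault: variables already exported in the shell take priority
            os.environ.setdefault(key.strip(), value.strip())
```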

4. Discover Network Devices

Subnet Discovery

# Discover devices in a network range
uv run sfacts discover --subnet 192.168.1.0/24

# Save results to specific directory
uv run sfacts discover --subnet 192.168.1.0/24 --output-dir ./network_scans

# High-performance scanning with custom output
uv run sfacts discover --subnet 10.0.0.0/16 --workers 50 --timeout 15 --output-dir ./results

SSH Tunnel Mode (for testing/containerlab)

# Connect through SSH tunnels
uv run sfacts discover --tunnel-host localhost --tunnel-ports 2101-2116

5. Collect Device Facts

# Collect facts from discovered devices
uv run sfacts facts --input disco_results_*.json

# Save facts to specific directory
uv run sfacts facts --input disco_results.json --output-dir ./network_facts

# Direct subnet fact collection with custom output
uv run sfacts facts --subnet 192.168.1.0/24 --workers 50 --output-dir ./results

6. NetBox Integration (Optional)

Test NetBox Connection

# Test API connectivity
uv run sfacts netbox test

# Check NetBox instance status and data statistics
uv run sfacts netbox status

Synchronize to NetBox IPAM/DCIM

# Dry-run to preview changes
uv run sfacts netbox sync --site-name "Production" --dry-run

# Sync device facts to NetBox
uv run sfacts netbox sync --site-name "Production"

# Validate facts data before sync
uv run sfacts netbox validate

NetBox Data Management

# Preview what would be deleted (SAFE)
uv run sfacts netbox purge --dry-run

# Delete all NetBox data (DESTRUCTIVE - requires confirmation)
uv run sfacts netbox purge

# Delete specific site data only
uv run sfacts netbox purge --site-name "Lab Environment" --dry-run

6b. Nautobot Integration (alternative to NetBox)

The Nautobot backend mirrors the NetBox surface area 1:1 and uses the official pynautobot SDK end-to-end. Tested against Nautobot 3.x.

Set credentials via environment (or .env):

export NAUTOBOT_URL=http://localhost:8080
export NAUTOBOT_TOKEN=your-api-token

Run a local Nautobot for development

A turnkey Nautobot 3.1 lab is provided under labs/nautobot/:

docker compose -f labs/nautobot/docker-compose.yml up -d
./labs/nautobot/bootstrap.sh         # creates admin/admin + API token
uv run sfacts nautobot test

Synchronize to Nautobot

# Test connection / inspect statistics
uv run sfacts nautobot test
uv run sfacts nautobot status

# Dry-run preview
uv run sfacts nautobot sync --site-name "Production" --dry-run

# Real sync (standard mode)
uv run sfacts nautobot sync --site-name "Production"

# Bulk mode (10-50x faster, optional --adaptive batching)
uv run sfacts nautobot sync --site-name "Production" --bulk

Nautobot data management

# Preview what would be deleted
uv run sfacts nautobot purge --dry-run

# Scoped purge (one Location and its dependents)
uv run sfacts nautobot purge --site-name "Lab Environment"

# Full purge (DESTRUCTIVE; also removes the auto-created LocationType)
uv run sfacts nautobot purge

Schema notes (Nautobot 3.x):

  • Sites are modeled as Locations under an auto-provisioned LocationType named Site
  • Device roles live in the generic extras.roles registry
  • Prefixes and IPs live in a Namespace (default: Global)
  • IP↔Interface assignments use the ipam.ip_address_to_interface join model
  • MAC addresses are stored directly on Interface.mac_address (no separate MAC repository)
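As a hedged illustration of those mappings, the payloads a Nautobot 3.x sync sends could be shaped like this. The helper functions are hypothetical, not sfacts' code; field names follow the schema notes above:

```python
# Hypothetical payload builders mirroring the Nautobot 3.x schema notes.
# Illustrative shapes only, not sfacts' actual sync code.
def location_payload(site_name: str) -> dict:
    # A "site" becomes a Location under the auto-provisioned "Site" LocationType
    return {
        "name": site_name,
        "location_type": {"name": "Site"},
        "status": {"name": "Active"},
    }


def ip_to_interface_payload(ip_id: str, interface_id: str) -> dict:
    # IP-to-interface assignment goes through the ip_address_to_interface model
    return {"ip_address": ip_id, "interface": interface_id}
```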

7. Prefect Workflow Orchestration (Optional)

Install Prefect workflows

uv sync --extra workflows

Start Prefect server and deployments

# Terminal 1: Start Prefect server
prefect server start

# Terminal 2: Start deployments
uv run python serve.py

Access Prefect UI

Seven deployments available:

Pipeline:

  1. discover-and-collect - Main entry point: discover + collect facts as subflows (one click)
  2. discover-only - Discovery only (re-runs, topology changes)
  3. collect-only - Fact collection only (uses latest devices-json artifact)
  4. sync-to-netbox - Push facts to NetBox IPAM/DCIM. Mandatory preflight + postflight with before/after diff
  5. sync-to-nautobot - Push facts to Nautobot IPAM/DCIM. Mandatory preflight + postflight with before/after diff

Admin:

  6. purge-netbox - Destructive NetBox cleanup; dry_run=True by default. Mandatory preflight + postflight
  7. purge-nautobot - Destructive Nautobot cleanup; dry_run=True by default. Mandatory preflight + postflight

Example workflow:

# Run from Prefect UI (http://localhost:4200):
# 1. discover-and-collect with target: "192.168.121.0/24"
#    → produces devices-json + facts-json artifacts
# 2. sync-to-netbox (automatically uses latest facts)

Benefits:

  • Single-click pipeline (discover + collect combined)
  • Zero manual JSON copying between steps
  • Before/after NetBox object-count diff on every sync and purge
  • Preflight abort if NetBox is unreachable: no blind writes
  • Individual steps (discover-only, collect-only) for targeted re-runs

See Prefect Modular Workflows for complete documentation.

Documentation

Output Structure

output/
├── disco_results_YYYYMMDD_HHMMSS.json        # Network discovery results
├── facts_results_YYYYMMDD_HHMMSS.json        # Device facts with TextFSM parsing
└── report/
    └── network_report_YYYYMMDD_HHMMSS.md     # Comprehensive network report

Environment Variables

The following environment variables are supported:

Device Authentication:

  • NETOPS_USERNAME - SSH username for device authentication
  • NETOPS_PASSWORD - SSH password for device authentication
  • NETOPS_SECRET - Enable secret for privileged access (optional)

NetBox Integration:

  • NETBOX_URL - NetBox instance URL (e.g., https://netbox.company.com)
  • NETBOX_TOKEN - NetBox API token for authentication
  • NETBOX_VERIFY_SSL - SSL certificate verification (default: true)
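NETBOX_VERIFY_SSL arrives as a string, so it has to be interpreted as a boolean. One plausible parsing, assuming the documented default of true (the exact spellings sfacts accepts may differ):

```python
# Plausible interpretation of a boolean env flag like NETBOX_VERIFY_SSL.
# The default and accepted spellings here are assumptions, not sfacts' code.
import os


def verify_ssl(default: bool = True) -> bool:
    value = os.getenv("NETBOX_VERIFY_SSL")
    if value is None:
        return default
    # Treat common "off" spellings as False; anything else as True
    return value.strip().lower() not in {"0", "false", "no", "off"}
```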

Development

Code Quality Tools

This project uses modern Python development tools:

# Install development dependencies
uv sync --dev

# Run linting and formatting
uv run ruff check sfacts/ tests/
uv run ruff check --fix sfacts/ tests/
uv run ruff format sfacts/ tests/

# Type checking
uv run mypy sfacts/

# Security scanning
uv run bandit -r sfacts/

# Run tests with coverage
uv run pytest

CI/CD Pipeline

The project includes a GitLab CI/CD pipeline (.gitlab-ci.yml) with four stages:

  1. Lint - Ruff linting, formatting checks, MyPy type checking
  2. Test - Unit tests with coverage reporting
  3. Security - Dependency vulnerability scanning, Bandit security analysis, code quality metrics
  4. Build - Python package creation (main branch and tags)

See CI/CD Pipeline for full setup details.

Key Technologies

  • uv - Fast Python package management
  • Prefect - Modern workflow orchestration and scheduling
  • Loguru - Structured logging with colors and context
  • Keyring - Secure OS-native credential storage
  • python-dotenv - Environment variable management from .env files
  • Netmiko - Multi-vendor SSH device connections
  • TextFSM - Structured parsing with ntc-templates
  • pynetbox - NetBox API client for IPAM/DCIM integration
  • Rich - Beautiful progress bars and terminal output

Additional Features

Nornir Inventory Export

Export collected facts to Nornir inventory format for integration with Nornir-based automation:

# Export latest facts to Nornir inventory
uv run sfacts export

# Export specific facts file
uv run sfacts export --input-file facts_results_20250124.json

# Include credentials in defaults.yaml
uv run sfacts export --include-credentials

See inventory/README.md for details on using exported inventory files.
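For orientation, a host entry in Nornir's SimpleInventory format might look like the following; the hostname and platform values are illustrative, and the actual export contents depend on your collected facts:

```yaml
# hosts.yaml: illustrative entry, not actual export output
sw1:
  hostname: 192.168.1.10
  platform: cisco_ios
  groups: []
```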

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

License

This project is licensed under the MIT License - see the LICENSE file for details.


Built and maintained by NETODATA: network automation, done right.
