Modern network discovery and fact collection toolkit with secure credential management and structured logging
Simple Facts - Network Discovery & Fact Collection Toolkit
Automated network inventory toolkit that syncs your actual network data to NetBox or Nautobot IPAM in minutes. Discover devices, collect facts, and eliminate manual documentation drift. Built on Netmiko, TextFSM, PyNetBox, and PyNautobot. Recent releases add Prefect workflow orchestration.
Features
- Network Discovery: Automatic device discovery with SSH port scanning
- Device Identification: Auto-detect device types using Netmiko SSHDetect
- Fact Collection: Collect structured device information using TextFSM
- NetBox Integration: Complete IPAM/DCIM synchronization with VRF support
- Nautobot Integration: 1:1 equivalent of the NetBox sync via the official `pynautobot` SDK (Nautobot 2.x/3.x: locations, namespaces, statuses, IP-to-interface assignments)
- Prefect Workflows: Modern workflow orchestration with automated pipelines and beautiful UI
- Secure Credentials: OS-native keyring storage with `.env` file support
- High Performance: Parallel processing with configurable workers
- Structured Logging: Professional logging with Loguru
- SSH Tunnel Support: Test mode for containerlab and remote environments
- Flexible Output: Custom output directories with the `--output-dir` option
- Code Quality: Ruff linting and formatting for clean, maintainable code
- CI/CD Pipeline: GitLab CI/CD with lint, test, security, and build stages
- Modern Architecture: Built with uv, Loguru, and industry standards
Architecture
Code Building Blocks

```
sfacts/
├── core/
│   ├── scanner.py        - Network discovery & port scanning
│   ├── collector.py      - Fact collection & TextFSM parsing
│   ├── netbox/           - NetBox sync modules
│   ├── nautobot/         - Nautobot sync modules (pynautobot)
│   └── utils/            - Auth, logging, helpers
├── tasks/
│   ├── collect.py        - Fact collection task
│   ├── netbox.py         - NetBox sync task
│   └── nautobot.py       - Nautobot sync task
├── flows/
│   └── modular_flows.py  - Prefect flow definitions
└── cli/
    └── main.py           - CLI entry point
```

Prefect Orchestration

```
serve.py        - Deploy Prefect flows
deployments.py  - Flow deployment configuration
UI @ localhost:4200 for workflow management
```
CLI Commands
| Command | Purpose | Example |
|---|---|---|
| `sfacts discover` | Network device discovery | `sfacts discover --subnet 192.168.1.0/24` |
| `sfacts facts` | Enhanced fact collection | `sfacts facts --input discovery_results.json` |
| `sfacts export` | Export to Nornir inventory | `sfacts export --output-dir inventory` |
| `sfacts netbox sync` | Sync facts to NetBox IPAM/DCIM | `sfacts netbox sync --site-name "Production"` |
| `sfacts netbox test` | Test NetBox API connection | `sfacts netbox test` |
| `sfacts netbox status` | Show NetBox instance statistics | `sfacts netbox status` |
| `sfacts netbox purge` | Delete NetBox data (DESTRUCTIVE) | `sfacts netbox purge --dry-run` |
| `sfacts netbox validate` | Validate facts for NetBox sync | `sfacts netbox validate` |
| `sfacts nautobot sync` | Sync facts to Nautobot IPAM/DCIM | `sfacts nautobot sync --site-name "Production"` |
| `sfacts nautobot test` | Test Nautobot API connection | `sfacts nautobot test` |
| `sfacts nautobot status` | Show Nautobot instance statistics | `sfacts nautobot status` |
| `sfacts nautobot purge` | Delete Nautobot data (DESTRUCTIVE) | `sfacts nautobot purge --dry-run` |
Discovery Modes
- Subnet Discovery: Scan network ranges for SSH-enabled devices
- SSH Tunnel Mode: Connect through SSH tunnels for testing (containerlab, etc.)
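At its core, subnet discovery means attempting TCP connections to port 22 across a range, in parallel. The sketch below illustrates the idea with plain sockets and a thread pool; the function names are illustrative, and the actual `scanner.py` implementation may differ:

```python
import socket
from concurrent.futures import ThreadPoolExecutor
from ipaddress import ip_network

def ssh_port_open(host: str, port: int = 22, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def discover(subnet: str, workers: int = 50) -> list:
    """Scan every host in the subnet for an open SSH port, in parallel."""
    hosts = [str(h) for h in ip_network(subnet).hosts()]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(ssh_port_open, hosts)
    return [h for h, is_open in zip(hosts, results) if is_open]
```

The `--workers` and `--timeout` CLI options map directly onto the thread-pool size and per-connection timeout in this model.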
Installation
This project uses uv for fast Python package management. The project requires Python 3.9 or higher (pynautobot and Prefect both require 3.9+).
Install uv (if not already installed):

```shell
curl -LsSf https://astral.sh/uv/install.sh | sh
```

Install dependencies:

```shell
uv sync
```

Alternative: using pip (legacy):

```shell
pip install -r requirements.txt
```
Quick Start
1. Install Dependencies

```shell
uv sync
```

2. Set up credentials (choose one method):

Option A: Environment Variables

```shell
export NETOPS_USERNAME=your_username
export NETOPS_PASSWORD=your_password
```

Option B: Secure Keyring Storage

```shell
# Store credentials securely in the OS keyring
uv run python -c "
from sfacts.utils.auth import CredentialManager
cm = CredentialManager()
cm.store_credentials('your_username', 'your_password')
"
```
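Credential lookup typically prefers explicit environment variables and falls back to the OS keyring. A sketch of that precedence; the service and entry names passed to `keyring` here are assumptions, not necessarily what `CredentialManager` uses internally:

```python
import os
from typing import Optional, Tuple

def resolve_credentials() -> Optional[Tuple[str, str]]:
    """Return (username, password): env vars first, then the OS keyring."""
    user = os.environ.get("NETOPS_USERNAME")
    password = os.environ.get("NETOPS_PASSWORD")
    if not (user and password):
        try:
            import keyring  # optional dependency; uses the OS-native backend
            # NOTE: service/entry names below are illustrative assumptions
            user = user or keyring.get_password("sfacts", "username")
            password = password or keyring.get_password("sfacts", "password")
        except ImportError:
            pass
    if user and password:
        return user, password
    return None
```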
3. Configure NetBox Integration (Optional)
Create a `.env` file for NetBox integration:

```shell
# Copy template and configure
cp .env.template .env
```

Edit `.env` with your NetBox details:

```
NETBOX_URL=https://your-netbox-instance.com
NETBOX_TOKEN=your-api-token-here
```
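The toolkit loads these values with python-dotenv, but conceptually a `.env` file is just `KEY=VALUE` lines. A minimal stand-in parser, ignoring the quoting and `export` prefixes that python-dotenv also handles:

```python
def parse_dotenv(text: str) -> dict:
    """Parse simple KEY=VALUE lines; skip blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # Strip surrounding quotes that .env files commonly use
        env[key.strip()] = value.strip().strip('"').strip("'")
    return env
```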
4. Discover Network Devices
Subnet Discovery

```shell
# Discover devices in a network range
uv run sfacts discover --subnet 192.168.1.0/24

# Save results to a specific directory
uv run sfacts discover --subnet 192.168.1.0/24 --output-dir ./network_scans

# High-performance scanning with custom output
uv run sfacts discover --subnet 10.0.0.0/16 --workers 50 --timeout 15 --output-dir ./results
```

SSH Tunnel Mode (for testing/containerlab)

```shell
# Connect through SSH tunnels
uv run sfacts discover --tunnel-host localhost --tunnel-ports 2101-2116
```
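A `--tunnel-ports` value like `2101-2116` expands to one scan target per forwarded port. A sketch of that expansion (a hypothetical helper, not necessarily the CLI's own parsing):

```python
def expand_port_range(spec: str) -> list:
    """Expand a range spec like '2101-2116' (or a single '2222') into ports."""
    if "-" in spec:
        start, end = (int(part) for part in spec.split("-", 1))
        return list(range(start, end + 1))  # inclusive on both ends
    return [int(spec)]
```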
5. Collect Device Facts
```shell
# Collect facts from discovered devices
uv run sfacts facts --input disco_results_*.json

# Save facts to a specific directory
uv run sfacts facts --input disco_results.json --output-dir ./network_facts

# Direct subnet fact collection with custom output
uv run sfacts facts --subnet 192.168.1.0/24 --workers 50 --output-dir ./results
```
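Fact collection runs show commands over Netmiko and turns the raw text into structured records via TextFSM and ntc-templates. To illustrate that raw-text-to-record step without a live device, here is a regex stand-in for what a TextFSM template does with a `show version`; the sample output and field names are illustrative only:

```python
import re

# Illustrative fragment of `show version` output
SAMPLE_OUTPUT = """\
Cisco IOS Software, C2960 Software, Version 15.0(2)SE11
router1 uptime is 5 weeks, 2 days
Processor board ID FOC1234X0AB
"""

def parse_show_version(text: str) -> dict:
    """Pull a few fields out of raw CLI text, TextFSM-style."""
    patterns = {
        "version": r"Version ([^,\s]+)",
        "hostname": r"^(\S+) uptime is",
        "serial": r"Processor board ID (\S+)",
    }
    facts = {}
    for field, pattern in patterns.items():
        match = re.search(pattern, text, re.MULTILINE)
        facts[field] = match.group(1) if match else ""
    return facts
```

The real TextFSM templates are far more complete; the point is that each device's raw output becomes a flat dict of facts that the sync backends can consume.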
6. NetBox Integration (Optional)
Test NetBox Connection

```shell
# Test API connectivity
uv run sfacts netbox test

# Check NetBox instance status and data statistics
uv run sfacts netbox status
```

Synchronize to NetBox IPAM/DCIM

```shell
# Dry-run to preview changes
uv run sfacts netbox sync --site-name "Production" --dry-run

# Sync device facts to NetBox
uv run sfacts netbox sync --site-name "Production"

# Validate facts data before sync
uv run sfacts netbox validate
```

NetBox Data Management

```shell
# Preview what would be deleted (SAFE)
uv run sfacts netbox purge --dry-run

# Delete all NetBox data (DESTRUCTIVE - requires confirmation)
uv run sfacts netbox purge

# Delete specific site data only
uv run sfacts netbox purge --site-name "Lab Environment" --dry-run
```
6b. Nautobot Integration (alternative to NetBox)
The Nautobot backend mirrors the NetBox surface area 1:1 and uses the official `pynautobot` SDK end-to-end. Tested against Nautobot 3.x.

Set credentials via environment (or `.env`):

```shell
export NAUTOBOT_URL=http://localhost:8080
export NAUTOBOT_TOKEN=your-api-token
```

Run a local Nautobot for development

A turnkey Nautobot 3.1 lab is provided under `labs/nautobot/`:

```shell
docker compose -f labs/nautobot/docker-compose.yml up -d
./labs/nautobot/bootstrap.sh   # creates admin/admin + API token
uv run sfacts nautobot test
```
Synchronize to Nautobot
```shell
# Test connection / inspect statistics
uv run sfacts nautobot test
uv run sfacts nautobot status

# Dry-run preview
uv run sfacts nautobot sync --site-name "Production" --dry-run

# Real sync (standard mode)
uv run sfacts nautobot sync --site-name "Production"

# Bulk mode (10-50x faster, optional --adaptive batching)
uv run sfacts nautobot sync --site-name "Production" --bulk
```

Nautobot data management

```shell
# Preview what would be deleted
uv run sfacts nautobot purge --dry-run

# Scoped purge (one Location and its dependents)
uv run sfacts nautobot purge --site-name "Lab Environment"

# Full purge (DESTRUCTIVE; also removes the auto-created LocationType)
uv run sfacts nautobot purge
```
Schema notes (Nautobot 3.x): Sites are modeled as Locations (under a `LocationType` named `Site` that is auto-provisioned); device roles live in the generic `extras.roles` registry; prefixes and IPs live in a Namespace (default `Global`); IP-to-interface assignments use the `ipam.ip_address_to_interface` join model; and Nautobot stores MAC addresses directly on `Interface.mac_address` (no separate MAC repository).
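Given the schema above, each collected device fact fans out into several related Nautobot objects. The dict shapes below sketch those relationships; they illustrate the data model, not the tool's actual sync code or the exact REST payloads:

```python
def build_nautobot_payloads(hostname: str, ip: str, interface: str) -> dict:
    """Sketch the related objects one device fact maps onto in Nautobot 2.x/3.x."""
    # Sites are Locations under an auto-provisioned LocationType named "Site"
    location = {"name": "Production", "location_type": {"name": "Site"}}
    device = {"name": hostname, "location": location["name"], "status": "Active"}
    # Prefixes and IPs live in a Namespace (default "Global")
    ip_address = {"address": ip, "namespace": "Global", "status": "Active"}
    # The IP-to-interface link is a join object, not a field on the IP itself
    assignment = {
        "ip_address": ip_address["address"],
        "interface": {"device": hostname, "name": interface},
    }
    return {"device": device, "ip_address": ip_address, "assignment": assignment}
```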
7. Prefect Workflow Orchestration (Optional)
Install Prefect workflows

```shell
uv sync --extra workflows
```

Start Prefect server and deployments

```shell
# Terminal 1: Start Prefect server
prefect server start

# Terminal 2: Start deployments
uv run python serve.py
```

Access Prefect UI

- Open http://localhost:4200
- Run deployments from the UI
Seven deployments available:
Pipeline:

1. `discover-and-collect` - Main entry point: discover + collect facts as subflows (one click)
2. `discover-only` - Discovery only (re-runs, topology changes)
3. `collect-only` - Fact collection only (uses the latest `devices-json` artifact)
4. `sync-to-netbox` - Push facts to NetBox IPAM. Mandatory preflight + postflight with before/after diff
5. `sync-to-nautobot` - Push facts to Nautobot IPAM/DCIM. Mandatory preflight + postflight with before/after diff

Admin:

6. `purge-netbox` - ⚠️ Destructive NetBox cleanup. `dry_run=True` by default. Mandatory preflight + postflight
7. `purge-nautobot` - ⚠️ Destructive Nautobot cleanup. `dry_run=True` by default. Mandatory preflight + postflight
Example workflow:
```shell
# Run from the Prefect UI (http://localhost:4200):
# 1. discover-and-collect with target: "192.168.121.0/24"
#    -> produces devices-json + facts-json artifacts
# 2. sync-to-netbox (automatically uses latest facts)
```
Benefits:
- Single-click pipeline (discover + collect combined)
- Zero manual JSON copying between steps
- Before/after NetBox object-count diff on every sync and purge
- Preflight abort if NetBox is unreachable, so no blind writes
- Individual steps (`discover-only`, `collect-only`) for targeted re-runs
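Conceptually, the subflow chaining is just functions passing artifacts to one another. A Prefect-free sketch of the `discover-and-collect` composition, with stand-in bodies and illustrative names:

```python
def discover_flow(subnet: str) -> list:
    """Discovery subflow: returns the devices-json artifact (here, a list)."""
    base = subnet.rsplit(".", 1)[0]          # stand-in for a real scan
    return [f"{base}.{i}" for i in (1, 2)]

def collect_flow(devices: list) -> dict:
    """Collection subflow: returns the facts-json artifact."""
    return {device: {"reachable": True} for device in devices}

def discover_and_collect(subnet: str) -> dict:
    """Main flow: runs both subflows, wiring the artifact between them."""
    devices = discover_flow(subnet)
    return collect_flow(devices)
```

In the real deployments, Prefect persists `devices-json` and `facts-json` as artifacts, which is what lets `collect-only` and the sync flows pick up the latest results without manual file copying.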
See Prefect Modular Workflows for complete documentation.
Documentation
- Usage Guide - Detailed usage instructions and examples
- NetBox Integration - Complete NetBox IPAM/DCIM integration guide
- NetBox Dependency Map - NetBox sync system architecture and data flow
- Prefect Integration - Legacy Prefect flows documentation
- Prefect Modular Workflows - New modular deployment architecture
- CI/CD Pipeline - GitLab CI/CD pipeline setup and configuration
- Inventory Structure - Complete inventory configuration reference
- Troubleshooting - Common issues and solutions
Output Structure
```
output/
├── disco_results_YYYYMMDD_HHMMSS.json   # Network discovery results
├── facts_results_YYYYMMDD_HHMMSS.json   # Device facts with TextFSM parsing
└── report/
    └── network_report_YYYYMMDD_HHMMSS.md  # Comprehensive network report
```
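The timestamped names follow strftime's `%Y%m%d_%H%M%S` pattern. A small helper pair that builds and validates such names (assumed convention, inferred from the tree above):

```python
import re
from datetime import datetime

def timestamped(prefix: str, ext: str = "json") -> str:
    """Build an output filename like disco_results_20250124_153000.json."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    return f"{prefix}_{stamp}.{ext}"

def is_results_file(name: str, prefix: str) -> bool:
    """Check a filename against the prefix_YYYYMMDD_HHMMSS.json convention."""
    pattern = rf"{re.escape(prefix)}_\d{{8}}_\d{{6}}\.json"
    return re.fullmatch(pattern, name) is not None
```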
Environment Variables
The following environment variables are supported:
Device Authentication:
- `NETOPS_USERNAME` - SSH username for device authentication
- `NETOPS_PASSWORD` - SSH password for device authentication
- `NETOPS_SECRET` - Enable secret for privileged access (optional)
NetBox Integration:
- `NETBOX_URL` - NetBox instance URL (e.g., https://netbox.company.com)
- `NETBOX_TOKEN` - NetBox API token for authentication
- `NETBOX_VERIFY_SSL` - SSL certificate verification (default: true)
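`NETBOX_VERIFY_SSL` arrives as a string from the environment, so it has to be coerced to a boolean. A common coercion, shown here as an assumed sketch rather than the tool's exact logic:

```python
import os

def env_bool(name: str, default: bool = True) -> bool:
    """Interpret common truthy/falsy strings from an environment variable."""
    value = os.environ.get(name)
    if value is None:
        return default
    return value.strip().lower() in ("1", "true", "yes", "on")
```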
Development
Code Quality Tools
This project uses modern Python development tools:
```shell
# Install development dependencies
uv sync --dev

# Run linting and formatting
uv run ruff check sfacts/ tests/
uv run ruff check --fix sfacts/ tests/
uv run ruff format sfacts/ tests/

# Type checking
uv run mypy sfacts/

# Security scanning
uv run bandit -r sfacts/

# Run tests with coverage
uv run pytest
```
CI/CD Pipeline
The project includes a GitLab CI/CD pipeline (.gitlab-ci.yml) with four stages:
- Lint - Ruff linting, formatting checks, MyPy type checking
- Test - Unit tests with coverage reporting
- Security - Dependency vulnerability scanning, Bandit security analysis, code quality metrics
- Build - Python package creation (main branch and tags)
See CI/CD Pipeline for full setup details.
Key Technologies
- uv - Fast Python package management
- Prefect - Modern workflow orchestration and scheduling
- Loguru - Structured logging with colors and context
- Keyring - Secure OS-native credential storage
- python-dotenv - Environment variable management from .env files
- Netmiko - Multi-vendor SSH device connections
- TextFSM - Structured parsing with ntc-templates
- pynetbox - NetBox API client for IPAM/DCIM integration
- Rich - Beautiful progress bars and terminal output
Additional Features
Nornir Inventory Export
Export collected facts to Nornir inventory format for integration with Nornir-based automation:
```shell
# Export latest facts to Nornir inventory
uv run sfacts export

# Export a specific facts file
uv run sfacts export --input-file facts_results_20250124.json

# Include credentials in defaults.yaml
uv run sfacts export --include-credentials
```
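A Nornir inventory is a `hosts.yaml`/`groups.yaml`/`defaults.yaml` trio, so the export step essentially reshapes collected facts into the host mapping. A sketch of that transformation, with field names assumed from typical Nornir inventories rather than taken from the tool's output:

```python
def facts_to_nornir_hosts(facts: dict) -> dict:
    """Map collected facts onto a hosts.yaml-shaped dict (illustrative fields)."""
    hosts = {}
    for hostname, info in facts.items():
        hosts[hostname] = {
            "hostname": info["ip"],                     # management address
            "platform": info.get("platform", "ios"),    # Netmiko/NAPALM platform
            "data": {"serial": info.get("serial", "")},  # arbitrary host data
        }
    return hosts
```

Dumping the resulting dict with a YAML serializer yields a file Nornir can load directly as its hosts inventory.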
See inventory/README.md for details on using exported inventory files.
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Built and maintained by NETODATA – network automation, done right.
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file sfacts-2.5.0.tar.gz.
File metadata
- Download URL: sfacts-2.5.0.tar.gz
- Upload date:
- Size: 375.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: uv/0.11.8 {"installer":{"name":"uv","version":"0.11.8","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Debian GNU/Linux","version":"13","id":"trixie","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `436944420227cb3dab15595e9ce2f8219a0ba8b9c75ab232e92b33d5da06b226` |
| MD5 | `49e321200a4ce0439c2c26944e4c9e93` |
| BLAKE2b-256 | `0b2da4ade6086abd4d0ac8a9d23ef8f28dfced2df6c680c9dbf1e7dca8507be1` |
File details
Details for the file sfacts-2.5.0-py3-none-any.whl.
File metadata
- Download URL: sfacts-2.5.0-py3-none-any.whl
- Upload date:
- Size: 121.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: uv/0.11.8 {"installer":{"name":"uv","version":"0.11.8","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Debian GNU/Linux","version":"13","id":"trixie","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `726b2605aa59380321ab8a5a27cf7ad1b29a2a4edcf1b1efb5de40dcec27e562` |
| MD5 | `9602c002e07a442d6f6490f65298e653` |
| BLAKE2b-256 | `c7d289f1c5b11e1d04f5dfa76d9bc839d4ec967c68587b9233d3b01b34add927` |