
The open-source control stack for autonomous robots with LiDAR-camera sensor fusion.

Project description

Robot LiDAR Fusion

An open‑source control stack for autonomous robots with LiDAR‑camera sensor fusion.
One codebase. Any robot arm. Any LiDAR. Any camera. Any environment.

Python 3.9+ · License: Apache 2.0 · Code style: black


What Is This?

Robot LiDAR Fusion is a modular, hardware‑agnostic software foundation for building autonomous robots that perceive the world through LiDAR point clouds and camera images. It provides a complete control stack—from raw sensor ingestion to mission planning and locomotion—running in a deterministic loop at configurable frequencies (default 100 Hz).

We built this because integrating LiDAR, cameras, joint controllers, battery management, hazard detection, and navigation into a single coherent system is hard. Every robotics team ends up writing the same glue code. This project provides that glue as a well‑tested, well‑documented open standard so you can focus on what makes your robot unique.


Architecture

The system is orchestrated by a deterministic loop that synchronises all modules every cycle. The diagram below illustrates the core components and their relationships.

flowchart TB
    subgraph Orchestrator["RobotOrchestrator (100 Hz deterministic loop)"]
        direction LR
        A[Cycle Start] --> B[Read Sensors]
        B --> C[Update Managers]
        C --> D[Process Tasks]
        D --> E[Map Instructions]
        E --> F[Sync Actuators]
        F --> G[Verify Consistency]
        G --> A
    end

    subgraph Core["Core Services"]
        H[Hardware Synchronization]
        I[Memory Management]
        J[Hazard Manager]
        K[Fault Detection]
        L[Concurrency Control]
        M[Communication]
        N[Environment Adapter]
        O[Execution Stack]
    end

    subgraph Control["Control"]
        P[Joint Synchronization]
        Q[Locomotion Controller]
    end

    subgraph Perception["Perception"]
        R[LIDAR Utils]
        S[Sensor Frames]
        T[Time Sync]
        U["Sensor I/O (Direct/ROS2)"]
        V[Sensor Processing]
    end

    subgraph Planning["Planning"]
        W[Mission Planner]
        X[Navigation Manager]
        Y[Task-Hardware Mapping]
    end

    subgraph Power["Power Management"]
        Z[Battery Manager]
        AA[Thermal Manager]
    end

    subgraph AI["AI Layer"]
        AB[Predictive State Estimation]
    end

    Orchestrator --- Core
    Orchestrator --- Control
    Orchestrator --- Perception
    Orchestrator --- Planning
    Orchestrator --- Power
    Orchestrator --- AI

Deterministic Control Loop – The RobotOrchestrator runs at a configurable frequency (default 100 Hz) with strict cycle‑time enforcement. Each iteration reads sensors, updates all managers, processes the mission queue, maps high‑level tasks to joint‑level instructions, synchronises actuators, and verifies system consistency.
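
A minimal sketch of one such cycle is shown below. The method names simply mirror the six phases described above; they are illustrative, not the actual robot_hw API.

import time

CYCLE_HZ = 100                      # default loop rate
CYCLE_PERIOD = 1.0 / CYCLE_HZ

def run_loop(orchestrator, cycles):
    # Illustrative fixed-rate loop: run the six phases, then sleep out
    # whatever is left of the cycle budget.
    for _ in range(cycles):
        start = time.monotonic()

        orchestrator.read_sensors()        # LiDAR, camera, joint state
        orchestrator.update_managers()     # hazards, battery, thermal, faults
        orchestrator.process_tasks()       # pop the next mission goal
        orchestrator.map_instructions()    # task -> joint-level commands
        orchestrator.sync_actuators()      # write commands to hardware
        orchestrator.verify_consistency()  # cross-check state before the next cycle

        # Enforce the cycle budget; a real implementation would also log overruns.
        elapsed = time.monotonic() - start
        if elapsed < CYCLE_PERIOD:
            time.sleep(CYCLE_PERIOD - elapsed)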


Key Features

  • Sensor Fusion – Brings together Ouster OS1 LiDAR point clouds and RGB/depth camera frames with sub‑100 ms time synchronisation. The perception pipeline computes obstacle distances, fuses orientation and velocity data, and feeds a unified state representation to the rest of the stack.
  • Hardware Agnostic – The control stack does not depend on any specific robot platform. A MockHardware backend is included for development and testing. Swap it for your robot’s SDK by implementing the hardware interface, and the rest of the stack works unchanged (see the sketch after this list).
  • Safety First – A multi‑signal hazard manager aggregates proximity, voltage, gas, pedestrian, and environmental hazard signals. The fault detector monitors joint positions, velocities, and timestamps for anomalies. The system can trigger emergency stops when safety thresholds are breached.
  • Dual Sensor Ingestion – Supports both ROS2 topic subscription and direct vendor SDK ingestion (Ouster SDK, OpenCV, pyrealsense2). Use ROS2 in production for robust driver support, or direct SDK mode for lightweight testing.
  • Mission Planning and Navigation – Goal‑based mission planner, A*‑ready navigation manager, and task‑to‑hardware mapping that translates high‑level goals into joint‑level instructions through inverse kinematics.
  • Power Management – Tracks battery state‑of‑charge, estimates task energy costs, defers energy‑intensive tasks when reserves are low, and manages thermal profiles with hysteresis‑based cooling control.
  • AI Layer – Predictive state estimation that anticipates near‑future sensor readings to reduce latency and improve control smoothness.
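
To make the hardware‑agnostic point concrete, the sketch below shows the general shape of a custom backend that could stand in for MockHardware. The interface is an assumption for illustration; the real read/write abstraction lives in robot_hw/core/hardware_synchronization.py and may differ.

from typing import Dict

class MyRobotHardware:
    # Hypothetical adapter around a vendor SDK, exposing the same
    # read/write surface as the mock backend (method names are assumptions).

    def read_joint_states(self) -> Dict[str, float]:
        # Replace with calls into your robot's SDK.
        return {"shoulder": 0.0, "elbow": 0.0, "wrist": 0.0}

    def write_joint_commands(self, commands: Dict[str, float]) -> None:
        # Forward commands to the vendor controller here.
        for joint, position in commands.items():
            print(f"command {joint} -> {position:.3f} rad")

    def emergency_stop(self) -> None:
        # Latch the vendor e-stop line.
        print("E-STOP")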

Quick Start

# Clone the repository
git clone https://github.com/iceccarelli/robot-lidar-fusion.git
cd robot-lidar-fusion

# Install in development mode with all extras
pip install -e ".[dev]"

# Run the control loop with mock hardware (50 cycles)
python scripts/run_robot.py --cycles 50

# Run the test suite
pytest -v

# Try a live sensor fusion demo (requires ROS2 or direct SDK)
python examples/demo_os1_camera_live.py
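
scripts/run_robot.py is the supported entry point; a rough programmatic equivalent is sketched below. The import path follows the project layout shown later, but the constructor arguments and run_cycle method are assumptions and may differ from the actual API.

from robot_hw.robot_orchestrator import RobotOrchestrator

def main() -> None:
    orchestrator = RobotOrchestrator()   # assumed to default to the mock hardware backend
    for _ in range(50):                  # 50 cycles, as in the quick start above
        orchestrator.run_cycle()         # hypothetical single-cycle entry point

if __name__ == "__main__":
    main()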

Project Structure

robot-lidar-fusion/
├── robot_hw/                  # Main package
│   ├── core/                  # Foundational services
│   │   ├── communication.py           # Telemetry and inter‑process messaging
│   │   ├── concurrency_management.py  # Named‑lock concurrency control
│   │   ├── consistency_verification.py # State consistency checks
│   │   ├── environment_adapter.py     # Environment‑specific tuning
│   │   ├── execution_stack.py         # Scheduled task execution
│   │   ├── fault_detection.py         # Joint and sensor fault detection
│   │   ├── hardware_synchronization.py # Hardware read/write abstraction
│   │   ├── hazard_manager.py          # Multi‑signal hazard aggregation
│   │   └── memory_management.py       # Deterministic memory allocation
│   ├── control/               # Actuator control
│   │   ├── joint_synchronization.py   # Joint command dispatch with limits
│   │   └── locomotion_controller.py   # Gait generation and velocity control
│   ├── perception/            # Sensor ingestion and fusion
│   │   ├── lidar_utils.py            # Point cloud utilities
│   │   ├── sensor_frames.py          # LiDARFrame and CameraFrame data models
│   │   ├── sensor_io_direct.py       # Direct SDK ingestion (Ouster, OpenCV)
│   │   ├── sensor_io_ros2.py         # ROS2 topic ingestion
│   │   ├── sensor_processing.py      # Multi‑sensor fusion
│   │   └── time_sync.py              # LiDAR‑camera timestamp alignment
│   ├── planning/              # Mission and navigation
│   │   ├── mission_planner.py        # Goal queue and mission sequencing
│   │   ├── navigation_manager.py     # Path planning and obstacle avoidance
│   │   └── task_hardware_mapping.py  # Task‑to‑joint instruction mapping
│   ├── power/                 # Power management
│   │   ├── battery_management.py     # SOC tracking and energy estimation
│   │   └── thermal_management.py     # Per‑joint thermal monitoring
│   ├── ai/                    # AI and prediction
│   │   └── predictive_controller.py  # State estimation and forecasting
│   ├── robot_config.py        # Environment variable configuration
│   ├── robot_orchestrator.py  # Main control loop
│   ├── simulation.py          # Verbose stress test simulation
│   └── stress_simulation.py   # Multi‑environment digital twin
├── calibration/               # Sensor calibration files
│   ├── camera_intrinsics.yaml
│   ├── extrinsics.yaml
│   └── README.md
├── tests/                     # Test suite
├── examples/                  # Working demonstrations
├── scripts/                   # Entry points
├── docs/                      # Documentation
├── enterprise/                # Enterprise extensions (planned)
├── gateway/                   # Fleet management gateway (planned)
└── config/                    # Configuration templates

Supported Sensors

Sensor | Interface | Status
Ouster OS1‑64/‑128 | Direct SDK or ROS2 | Supported
Intel RealSense D435/D455 | Direct SDK or ROS2 | Supported
USB cameras (UVC) | OpenCV | Supported
Generic IMU | Via SensorProcessor | Supported
Encoders | Via HardwareSynchronizer | Supported
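
For the USB (UVC) row, a minimal direct‑ingestion sketch with OpenCV is shown below. The CameraFrame container here is a hypothetical stand‑in; the project's actual frame models live in robot_hw/perception/sensor_frames.py.

import time
from dataclasses import dataclass

import cv2
import numpy as np

@dataclass
class CameraFrame:           # hypothetical stand-in for the project's data model
    timestamp: float         # seconds on a monotonic clock
    image: np.ndarray        # HxWx3 BGR image

def grab_frame(device_index: int = 0) -> CameraFrame:
    # Open the UVC device, grab a single frame, and timestamp it.
    cap = cv2.VideoCapture(device_index)
    try:
        ok, image = cap.read()
        if not ok:
            raise RuntimeError("camera read failed")
        return CameraFrame(timestamp=time.monotonic(), image=image)
    finally:
        cap.release()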

Examples

The examples/ directory contains three demonstrations of the stack’s key capabilities:

  • basic_control_loop.py – Runs multiple cycles of the RobotOrchestrator with mock hardware, providing the simplest way to understand and prototype the deterministic control system.
  • sensor_fusion_demo.py – Uses synthetic LiDAR and camera frames to demonstrate timestamp synchronization and real-time minimum forward obstacle distance calculation.
  • demo_os1_camera_live.py – Connects to a real Ouster OS1 LiDAR and camera (ROS2 or direct SDK) for live sensor ingestion and precise time synchronization.

These examples are intended as starting points for your own sensor fusion and control applications; the sketch below shows a simplified version of the timestamp pairing and forward obstacle‑distance computation they demonstrate.
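
The following is a self‑contained sketch on synthetic data: the 100 ms skew limit matches the sub‑100 ms synchronisation target mentioned above, and the corridor width is an arbitrary example value rather than a project default.

import numpy as np

def pair_by_timestamp(lidar_ts, camera_ts, max_skew=0.1):
    # Pair each LiDAR timestamp with the nearest camera timestamp,
    # keeping only pairs within the allowed skew (100 ms here).
    pairs = []
    for i, t in enumerate(lidar_ts):
        j = int(np.argmin(np.abs(camera_ts - t)))
        if abs(camera_ts[j] - t) <= max_skew:
            pairs.append((i, j))
    return pairs

def min_forward_distance(points, corridor_half_width=0.5):
    # Minimum planar range of points ahead of the robot (x forward, y left),
    # restricted to a corridor of |y| < corridor_half_width metres.
    ahead = points[(points[:, 0] > 0) & (np.abs(points[:, 1]) < corridor_half_width)]
    if ahead.size == 0:
        return float("inf")
    return float(np.linalg.norm(ahead[:, :2], axis=1).min())

# Synthetic timestamps: each LiDAR frame pairs with its nearest camera frame.
lidar_ts = np.array([0.00, 0.10])
camera_ts = np.array([0.02, 0.13, 0.40])
print(pair_by_timestamp(lidar_ts, camera_ts))   # -> [(0, 0), (1, 1)]

# Synthetic cloud: one obstacle ~2 m ahead, one behind, one far ahead.
cloud = np.array([[2.0, 0.1, 0.0], [-1.0, 0.0, 0.0], [5.0, 0.3, 0.2]])
print(min_forward_distance(cloud))              # -> ~2.0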


Enterprise & Gateway

The enterprise/ and gateway/ directories provide a roadmap for scaling the project to industrial and fleet‑level deployments.

The enterprise/ directory contains planned extensions for:

  • Certified robot connectors (KUKA, ABB, Fanuc, Universal Robots)
  • Advanced algorithms (EKF fusion, RRT* planning, MPC locomotion, SLAM)
  • ISO compliance modules

The gateway/ directory contains the planned hosted Robot Control Gateway for:

  • Fleet management
  • Telemetry aggregation
  • Remote operation
  • Over‑the‑air updates

See the README files in each directory for details.


Contributing

We welcome contributions from the robotics community. Whether you are fixing a bug, adding a sensor driver, improving documentation, or proposing a new subsystem, your work helps everyone building autonomous robots.

Please read CONTRIBUTING.md before submitting a pull request.


License

This project is released under the Apache License 2.0.


Acknowledgements

This project was created and is maintained by iceccarelli. It draws on years of experience integrating LiDAR, cameras, and control systems for autonomous robots across industrial, research, and field environments.


Download files

Download the file for your platform.

Source Distribution

robot_lidar_fusion-0.2.1.tar.gz (97.7 kB)

Built Distribution

robot_lidar_fusion-0.2.1-py3-none-any.whl (97.4 kB)

File details

Details for the file robot_lidar_fusion-0.2.1.tar.gz.

File metadata

  • Download URL: robot_lidar_fusion-0.2.1.tar.gz
  • Upload date:
  • Size: 97.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for robot_lidar_fusion-0.2.1.tar.gz
Algorithm | Hash digest
SHA256 | 1978fb30854f7b9f8b02f30d9e0245b1d3dd2ae606722cf91952b19c4ee1ac37
MD5 | 4ab5983bce8864af56feebe0c254840f
BLAKE2b-256 | 72849f1891950ce36e0b5e61e07b7fdd3171d31fbd686e508b9adf772c2a2b73


Provenance

The following attestation bundles were made for robot_lidar_fusion-0.2.1.tar.gz:

Publisher: pypi-publish.yml on iceccarelli/robot-lidar-fusion

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file robot_lidar_fusion-0.2.1-py3-none-any.whl.

File metadata

File hashes

Hashes for robot_lidar_fusion-0.2.1-py3-none-any.whl
Algorithm | Hash digest
SHA256 | a9303dc93bd8c551862592368abcec6d7b5aa1242011971e2813c7f33b6aea32
MD5 | 71220c8e1e56b28697d7769da5ae25da
BLAKE2b-256 | 2e2e23a9e1f01cdff65112cae9e9eadd3f55c72d52f74a640004d7b97244081a


Provenance

The following attestation bundles were made for robot_lidar_fusion-0.2.1-py3-none-any.whl:

Publisher: pypi-publish.yml on iceccarelli/robot-lidar-fusion

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
