
Efficient and user-friendly point cloud data loader for the NCLT Dataset, supporting multiple coordinate systems and numpy compatibility.


NCLT Point Cloud Data Loader

This is an English translation of the original Chinese documentation (README_CN.md).

Dataset Homepage: NCLT Dataset

Project Overview

This project provides an efficient and easy-to-use point cloud data loader for the NCLT dataset. It supports multiple coordinate frame conversions, iteration, slicing, and is suitable for autonomous driving, SLAM, 3D reconstruction, and related scenarios.

Features

  • Multiple Coordinate Frames Supported
    • BASE_LINK: Vehicle local coordinate frame
    • MAP: Global map coordinate frame
    • MAP_FRD: Global FRD coordinate frame
    • WORLD: World coordinate frame
    • WORLD_NED: NED world coordinate frame
  • Iterator & Slicing: Supports iteration, slicing, and random access
  • Numpy Compatible: Point clouds and poses are returned as numpy arrays
  • Dataset Length: Supports the len() method

Installation

To install the released package from PyPI:

pip install nclt-dataloader

For development, install in editable mode from a source checkout (uv recommended):

uv pip install -e .

Quick Start

from nclt_dataloader import PointCloudLoader, FrameID

loader = PointCloudLoader("/path/to/nclt/2012-01-08", frame_id=FrameID.MAP)
pc, pose = loader[0]  # Read the first point cloud and its pose

# Iterate over the first 10 frames
for pc, pose in loader[:10]:
    # Process point cloud and pose
    pass

# Get dataset length
print(len(loader))
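
Because each item is a (point cloud, pose) pair of numpy arrays, frames can be combined with ordinary numpy operations. Below is a minimal standalone sketch of merging several frames into one array; the fake [N, 5] clouds stand in for `loader[:3]`, since the real loader needs the dataset on disk:

```python
import numpy as np

# Fake (cloud, pose) pairs standing in for loader[:3]; each cloud is [N, 5].
frames = [
    (np.random.rand(n, 5).astype(np.float32), np.eye(4)) for n in (100, 120, 90)
]

# Stack all clouds into a single [total_N, 5] array.
merged = np.vstack([pc for pc, _pose in frames])
print(merged.shape)  # (310, 5)
```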

C++/pybind11 Example

The Python loader can be invoked directly from C++ via pybind11, making it easy to embed in C++ projects.
Sample code is provided in example/cpp/main.cc. Make sure the root path points to a valid NCLT dataset directory.

Build & Run Steps

  1. Activate Python virtual environment (ensure nclt-dataloader and pybind11 are installed)
  2. Edit line 18 in main.cc to set the root variable to your dataset path, e.g. /ws/data/nclt/2012-05-26
  3. Build the project:
    cd example/cpp
    cmake -S . -B build
    cmake --build build
    
  4. Run the example:
    ./build/embed_loader
    

The output will show the dataset length and the shapes of the point cloud and pose.

Main Code Logic

  • Start the Python interpreter
    #include <pybind11/embed.h>
    namespace py = pybind11;
    py::scoped_interpreter guard{};
    
  • Import the nclt_dataloader module
    py::module_ kd = py::module_::import("nclt_dataloader");
    py::object PointCloudLoader = kd.attr("PointCloudLoader");
    py::object FrameID = kd.attr("FrameID");
    
  • Create a PointCloudLoader instance
    const char* root = "/ws/data/nclt/2012-05-26";
    py::object loader =
      PointCloudLoader(root, py::arg("frame_id") = FrameID.attr("MAP"));
    
  • Get dataset length
    std::size_t n = loader.attr("__len__")().cast<std::size_t>();
    std::cout << "dataset length = " << n << "\n";
    
  • Read point cloud and pose, print their shapes
    py::tuple item0 = loader[py::int_(0)];
    py::array_t<float> cld = item0[0].cast<py::array_t<float>>();
    py::array_t<double> pose = item0[1].cast<py::array_t<double>>();
    std::cout << "points shape = (" << cld.shape(0) << ", " << cld.shape(1) << ")\n";
    std::cout << "pose shape   = (" << pose.shape(0) << ", " << pose.shape(1) << ")\n";
    
  • Iterate over the first 10 frames (optional)
    for (py::size_t i = 0; i < 10 && i < n; ++i) {
      py::tuple it = loader[py::int_(i)];
      // You can process point cloud and pose here
    }
    

See example/cpp/main.cc for details.

Coordinate Frame Description

  • BASE_LINK: Vehicle local coordinate frame, point cloud transformed by calibration matrix
  • MAP: Global map coordinate frame, point cloud transformed by global pose
  • MAP_FRD: Global FRD coordinate frame
  • WORLD: World coordinate frame
  • WORLD_NED: NED world coordinate frame

You can switch coordinate frames via the frame_id parameter or property:

loader.frame_id = FrameID.WORLD
pc, pose = loader[0]

API Reference

Data Format

  • Point cloud data shape: [N, 5], representing [x, y, z, intensity, label]
  • Pose matrix shape: [4, 4]
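
For illustration, here is a standalone sketch of applying a [4, 4] pose to the xyz columns of an [N, 5] cloud. This helper is not part of the package; it just shows the standard homogeneous-coordinates math behind the frame conversions:

```python
import numpy as np

def transform_cloud(points: np.ndarray, pose: np.ndarray) -> np.ndarray:
    """Apply a [4, 4] homogeneous pose to the x, y, z columns of an [N, 5] cloud."""
    out = points.copy()
    # Append a column of ones to get homogeneous [N, 4] coordinates.
    xyz_h = np.hstack([points[:, :3], np.ones((points.shape[0], 1))])
    out[:, :3] = (pose @ xyz_h.T).T[:, :3]
    return out

# Example: translate one point by (1, 2, 3); intensity and label are untouched.
cloud = np.array([[0.0, 0.0, 0.0, 0.5, 1.0]])
pose = np.eye(4)
pose[:3, 3] = [1.0, 2.0, 3.0]
moved = transform_cloud(cloud, pose)  # xyz becomes (1, 2, 3)
```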

Dependencies

  • numpy

Development/testing dependencies (see pyproject.toml):

  • pypcd4
  • matplotlib
  • pytest
  • ipykernel
  • ruff

Testing

Unit tests are included. Run:

uv pip install ".[dev]"
uv run pytest

Changelog

  • v0.1.0: Initial version, basic functionality implemented
