Efficient and user-friendly point cloud data loader for the NCLT Dataset, supporting multiple coordinate systems and numpy compatibility.
NCLT Point Cloud Data Loader
This is an English translation of the original Chinese documentation (README_CN.md).
Dataset Homepage: NCLT Dataset
Project Overview
This project provides an efficient and easy-to-use point cloud data loader for the NCLT dataset. It supports multiple coordinate frame conversions, iteration, slicing, and is suitable for autonomous driving, SLAM, 3D reconstruction, and related scenarios.
Features
- Multiple Coordinate Frames Supported:
  - BASE_LINK: vehicle local coordinate frame
  - MAP: global map coordinate frame
  - MAP_FRD: global FRD (front-right-down) coordinate frame
  - WORLD: world coordinate frame
  - WORLD_NED: NED (north-east-down) world coordinate frame
- Iterator & Slicing: Supports iteration, slicing, and random access
- Numpy Compatible: Directly reads from numpy arrays
- Dataset Length: Supports the len() method
Installation
To install from source in editable mode, using uv:

```shell
uv pip install -e .
```

Or install from PyPI with pip:

```shell
pip install nclt-dataloader
```
Quick Start
```python
from nclt_dataloader import PointCloudLoader, FrameID

loader = PointCloudLoader("/path/to/nclt/2012-01-08", frame_id=FrameID.MAP)
pc, pose = loader[0]  # Read the first point cloud and its pose

# Iterate over the first 10 frames
for pc, pose in loader[:10]:
    # Process point cloud and pose
    pass

# Get dataset length
print(len(loader))
```
C++/pybind11 Example
This project supports direct invocation of the Python loader from C++ via pybind11, suitable for integration with C++ projects.
Sample code is provided in example/cpp/main.cc. Make sure the root path points to a valid NCLT dataset directory.
Build & Run Steps
1. Activate your Python virtual environment (make sure nclt-dataloader and pybind11 are installed).
2. Edit line 18 in main.cc to set the root variable to your dataset path, e.g. /ws/data/nclt/2012-05-26.
3. Build the project:

```shell
cd example/cpp
cmake -S . -B build
cmake --build build
```

4. Run the example:

```shell
./build/embed_loader
```
The output will show the dataset length and the shapes of the point cloud and pose.
Main Code Logic
- Start the Python interpreter:

```cpp
#include <pybind11/embed.h>
namespace py = pybind11;

py::scoped_interpreter guard{};
```
- Import the nclt_dataloader module:

```cpp
py::module_ kd = py::module_::import("nclt_dataloader");
py::object PointCloudLoader = kd.attr("PointCloudLoader");
py::object FrameID = kd.attr("FrameID");
```
- Create a PointCloudLoader instance:

```cpp
const char* root = "/ws/data/nclt/2012-05-26";
py::object loader = PointCloudLoader(root, py::arg("frame_id") = FrameID.attr("MAP"));
```
- Get the dataset length:

```cpp
std::size_t n = loader.attr("__len__")().cast<std::size_t>();
std::cout << "dataset length = " << n << "\n";
```
- Read a point cloud and pose, and print their shapes:

```cpp
py::tuple item0 = loader[py::int_(0)];
py::array_t<float> cld = item0[0].cast<py::array_t<float>>();
py::array_t<double> pose = item0[1].cast<py::array_t<double>>();
std::cout << "points shape = (" << cld.shape(0) << ", " << cld.shape(1) << ")\n";
std::cout << "pose shape = (" << pose.shape(0) << ", " << pose.shape(1) << ")\n";
```
- Iterate over the first 10 frames (optional):

```cpp
for (py::size_t i = 0; i < 10 && i < n; ++i) {
    py::tuple it = loader[py::int_(i)];
    // Process point cloud and pose here
}
```
See example/cpp/main.cc for details.
Coordinate Frame Description
- BASE_LINK: vehicle local coordinate frame; the point cloud is transformed by the calibration matrix
- MAP: global map coordinate frame; the point cloud is transformed by the global pose
- MAP_FRD: global FRD coordinate frame
- WORLD: world coordinate frame
- WORLD_NED: NED world coordinate frame
You can switch coordinate frames via the frame_id parameter or property:
```python
loader.frame_id = FrameID.WORLD
pc, pose = loader[0]
```
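As background on the NED convention used by WORLD_NED (this snippet is illustrative and not part of the library's API): NED coordinates are (north, east, down), so converting a point to the more common ENU (east, north, up) convention swaps the first two axes and negates the third.

```python
def ned_to_enu(p_ned):
    """Convert a (north, east, down) point to (east, north, up)."""
    n, e, d = p_ned
    return (e, n, -d)

# A point 3 m north, 4 m east, and 2 m below the origin in NED
print(ned_to_enu((3.0, 4.0, 2.0)))  # -> (4.0, 3.0, -2.0)
```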
API Reference
- PointCloudLoader: main loader class; supports indexing, slicing, iteration, etc.
- FrameID: coordinate frame enum type
- get_lidar_files: get the point cloud file list
- read_lidar: read a single point cloud frame
- load_calib_matrix: read the calibration matrix
- load_global_poses: read global poses
- transform_point_cloud: point cloud coordinate transformation
Data Format
- Point cloud data shape: [N, 5], representing [x, y, z, intensity, label]
- Pose matrix shape: [4, 4]
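Given these shapes, applying a pose to a point cloud can be sketched as below. This is a minimal numpy illustration, not the library's own transform_point_cloud; the homogeneous-transform convention (pose left-multiplies column vectors) is an assumption.

```python
import numpy as np

# Synthetic point cloud with the documented [N, 5] layout:
# columns are [x, y, z, intensity, label]
pc = np.array([
    [1.0, 0.0, 0.0, 0.5, 1.0],
    [0.0, 2.0, 0.0, 0.7, 2.0],
], dtype=np.float32)

# A 4x4 homogeneous pose: 90-degree yaw plus a translation
yaw = np.pi / 2
pose = np.array([
    [np.cos(yaw), -np.sin(yaw), 0.0, 10.0],
    [np.sin(yaw),  np.cos(yaw), 0.0, 20.0],
    [0.0,          0.0,         1.0,  0.0],
    [0.0,          0.0,         0.0,  1.0],
])

# Transform only the xyz columns; intensity and label are left untouched
xyz_h = np.hstack([pc[:, :3], np.ones((pc.shape[0], 1))])  # homogeneous coords
transformed = pc.copy()
transformed[:, :3] = (pose @ xyz_h.T).T[:, :3]

print(transformed[:, :3])  # first point lands at roughly (10, 21, 0)
```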
Dependencies
- numpy
Development/testing dependencies (see pyproject.toml):
- pypcd4
- matplotlib
- pytest
- ipykernel
- ruff
Testing
Unit tests are included. Run:
```shell
uv pip install .[dev]
uv run pytest
```
Changelog
- v0.1.0: Initial version, basic functionality implemented
File details
Details for the file nclt_dataloader-0.1.0.tar.gz.
File metadata
- Download URL: nclt_dataloader-0.1.0.tar.gz
- Upload date:
- Size: 7.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.8.12

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 815ec8026b90a6707cc966bcd17b03802fd5314113104370d3ef630add394b6f |
| MD5 | 3c41bef67be04a9150bc1143290e72cc |
| BLAKE2b-256 | 05d05cd3422eb7071deb5ef0d9885501649f9ab533e0dfac38f6093e7deb21a1 |
File details
Details for the file nclt_dataloader-0.1.0-py3-none-any.whl.
File metadata
- Download URL: nclt_dataloader-0.1.0-py3-none-any.whl
- Upload date:
- Size: 8.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.8.12

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 7bf11bc983d3801fd8e8cc95ef693dd4f2fc58831efd122472d7712271766889 |
| MD5 | bcbe5039f631822648fd4904abc5148b |
| BLAKE2b-256 | c15a13efb6fb96d6a2bc73848bfb39438831fca1cb39a54a37dcb48a3f72a3ae |