
Evaluate Lidar-Inertial Odometry on public datasets


evalio

evalio is a tool for Evaluating Lidar-Inertial Odometry.

Specifically, it provides a common interface connecting LIO datasets and LIO pipelines. This allows for easy addition of new datasets and pipelines, as well as a common location to evaluate them, making benchmarks significantly easier to run. It features,

  • No ROS dependency! (though it can still load rosbag datasets using the wonderful rosbags package)
  • Easy to add new datasets and pipelines, see the example
  • Unified representation of lidar scans: row (scan-line) major order, stamped at the start of the scan, with point stamps relative to the scan start.
  • Download and manage datasets via the CLI interface
  • Simple to use API for friction-free access to data
  • Run pipelines via the CLI interface and yaml config files
  • Compute statistics for resulting trajectory runs
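The relative-stamp convention above is simple but worth being explicit about: absolute per-point timestamps become offsets from the scan start. A minimal sketch of the idea (illustrative only, not evalio's actual data structures),

```python
def to_relative_stamps(scan_start: float, point_stamps: list[float]) -> list[float]:
    """Convert absolute point timestamps to offsets from the scan start,
    matching the "stamped at start of scan" convention described above."""
    return [t - scan_start for t in point_stamps]

stamps = to_relative_stamps(100.0, [100.0, 100.05, 100.1])
# first point is at offset 0.0, last roughly 0.1 seconds into the scan
```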

Installation

evalio is available on PyPI, so simply install via your favorite Python package manager,

uv add evalio      # uv
pip install evalio # pip

Usage

evalio can be used as both a Python library and a CLI, for both datasets and pipelines.

Datasets

Once evalio is installed, datasets can be listed and downloaded via the CLI interface. For example, to list all datasets and then download a sequence from the hilti-2022 dataset,

evalio ls datasets
evalio download hilti_2022/basement_2

evalio downloads data to the directory given by the EVALIO_DATA environment variable, or, if unset, to the local folder ./evalio_data. All the trajectories in a dataset can also be downloaded by using the wildcard hilti_2022/*, making sure to escape the asterisk as your shell requires.
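The directory resolution described above amounts to a simple environment lookup; a sketch of the behavior (the actual implementation may differ),

```python
import os
from pathlib import Path

def data_dir() -> Path:
    # EVALIO_DATA takes precedence; otherwise fall back to ./evalio_data
    return Path(os.environ.get("EVALIO_DATA", "./evalio_data"))

print(data_dir())  # ./evalio_data unless EVALIO_DATA is set
```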

[!NOTE] Many datasets use gdown to download data from Google Drive. Unfortunately, this can occasionally be finicky due to Google's download limits; loading cookies from your browser can often help.

Once downloaded, a trajectory can then be easily used in Python,

from evalio.datasets import Hilti2022

# for all data
for mm in Hilti2022.basement_2:
    print(mm)

# for lidars
for scan in Hilti2022.basement_2.lidar():
    print(scan)

# for imu
for imu in Hilti2022.basement_2.imu():
    print(imu)

For example, you can easily get a single scan to plot a bird's-eye view,

import matplotlib.pyplot as plt
import numpy as np

# get the 10th scan
scan = Hilti2022.basement_2.get_one_lidar(10)
# always in row-major order, with stamp at start of scan
x = np.array([p.x for p in scan.points])
y = np.array([p.y for p in scan.points])
z = np.array([p.z for p in scan.points])
plt.scatter(x, y, c=z, s=1)
plt.axis('equal')
plt.show()

evalio also comes with a built-in wrapper for converting to rerun types,

import rerun as rr
from evalio.rerun import convert

rr.init("evalio")
rr.connect_tcp()
for scan in Hilti2022.basement_2.lidar():
    rr.set_time_seconds("timeline", seconds=scan.stamp.to_sec())
    rr.log("lidar", convert(scan, color=[255, 0, 255]))

[!NOTE]
To run the rerun visualization, rerun must be installed. This can be done by installing rerun-sdk or evalio[vis] from PyPI.

We recommend checking out the base dataset class for more information on how to interact with datasets.

Pipelines

The other half of evalio is the pipelines that can be run on various datasets. All pipelines and their parameters can be shown via,

evalio ls pipelines

For example, to run KissICP on a dataset,

evalio run -o results -d hilti_2022/basement_2 -p kiss

This will run the pipeline on the dataset and save the results to the results folder. The results can then be used to compute statistics on the trajectory,

evalio stats results

[!NOTE]
KissICP performs poorly by default on hilti_2022/basement_2 due to the close range and large default voxel size. You can see this by adding -vvv to the run command to visualize the trajectory in rerun.
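Trajectory statistics of this kind boil down to comparing estimated poses against ground truth. As an illustration (not evalio's actual metrics code), the translational RMSE of an already-associated, already-aligned trajectory can be computed as,

```python
import numpy as np

def translation_rmse(est: np.ndarray, gt: np.ndarray) -> float:
    """RMSE of per-pose translation error between two (N, 3) position arrays.
    Assumes the trajectories are already time-associated and aligned."""
    errors = np.linalg.norm(est - gt, axis=1)  # per-pose Euclidean error
    return float(np.sqrt(np.mean(errors ** 2)))

gt = np.zeros((4, 3))
est = np.array([[0.1, 0, 0], [0, 0.1, 0], [0, 0, 0.1], [0.1, 0, 0]])
print(translation_rmse(est, gt))  # ~0.1, since every per-pose error is 0.1 m
```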

More complex experiments, including varying pipeline parameters, can be run by specifying a config file,

output_dir: ./results/

datasets:
  # Run on all hilti-2022 trajectories
  - hilti_2022/*
  # Run on first 1000 scans of multi campus
  - name: multi_campus/ntu_day_01
    length: 1000

pipelines:
  # Run vanilla kiss with default parameters
  - kiss
  # Tweak kiss parameters
  - name: kiss_tweaked
    pipeline: kiss
    deskew: true
    # Some of these datasets need smaller voxel sizes
    sweep:
      voxel_size: [0.1, 0.5, 1.0]
      

This can then be run via

evalio run -c config.yml
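A sweep block like the one above expands to one run per parameter combination. A sketch of how such an expansion could work (illustrative, not evalio's actual config loader),

```python
import itertools

def expand_sweep(base: dict, sweep: dict) -> list[dict]:
    """Expand a sweep spec into one config dict per parameter combination."""
    keys = list(sweep)
    combos = itertools.product(*(sweep[k] for k in keys))
    return [{**base, **dict(zip(keys, combo))} for combo in combos]

configs = expand_sweep(
    {"pipeline": "kiss", "deskew": True},
    {"voxel_size": [0.1, 0.5, 1.0]},
)
print(len(configs))  # 3 runs, one per voxel size
```

With multiple swept parameters, the product grows multiplicatively, which is why sweeps are best kept small.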

That's about the gist of it! Try playing around with the CLI interface to see what else is possible, such as a number of visualization options using rerun. Feel free to open an issue if you have any questions, suggestions, or problems.

It should also be mentioned that shell autocompletion can be enabled via argcomplete,

eval "$(register-python-argcomplete evalio)"

This is extra useful for specifying the datasets when downloading or running, as they can get particularly long.

Custom Datasets & Pipelines

We understand that internal or work-in-progress datasets and pipelines will often be needed, so evalio has full support for them. As mentioned above, we recommend checking out our example for more information on how to do this (it's pretty easy!).

The TL;DR version: a custom dataset can be made by inheriting from the Dataset class (Python only), and a custom pipeline by inheriting from the Pipeline class in either C++ or Python. These can then be made available to evalio via the EVALIO_CUSTOM environment variable, which should point to the Python module that contains them.
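The pattern is standard subclassing. A generic sketch of the shape it takes (the method names below are hypothetical, made up for illustration; consult evalio's actual Pipeline base class for the real interface),

```python
from abc import ABC, abstractmethod

# Generic subclass-and-register pattern; evalio's real Pipeline base class
# defines its own (different) set of required methods.
class Pipeline(ABC):
    @abstractmethod
    def add_imu(self, imu) -> None: ...

    @abstractmethod
    def add_lidar(self, scan): ...

class MyOdometry(Pipeline):
    def __init__(self):
        self.poses = []

    def add_imu(self, imu) -> None:
        pass  # integrate the IMU measurement (omitted in this sketch)

    def add_lidar(self, scan):
        # estimate and record a pose for this scan (omitted in this sketch)
        self.poses.append("identity")
        return self.poses[-1]

p = MyOdometry()
p.add_lidar(object())
print(len(p.poses))  # one pose recorded per scan
```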

We highly recommend making a PR to merge your custom datasets or pipelines into evalio once they are ready. This makes it more likely that the community will use and cite your work, and it increases the usefulness of evalio for everyone.

Building from Source

While we recommend simply installing the Python package using your preferred package manager (ours is uv), we've attempted to make building from source as easy as possible. We generally build through scikit-build-core, which provides a simple wrapper for building CMake projects as Python packages. uv is our frontend of choice for this process, but it is also possible via pip,

uv sync          # uv version
pip install -e . # pip version

Of course, building via the usual CMake way is also possible, with the only default dependency being Eigen3,

mkdir build
cd build
cmake ..
make

By default, no pipelines are included due to their large dependencies. CMake will look for them in the cpp/bindings/pipelines-src directory. If you'd like to add them, simply run the clone_pipelines.sh script, which will clone and patch them appropriately.

When these pipelines are included, the number of dependencies increases significantly, so we have provided a docker image that includes all dependencies for building, as well as a VSCode devcontainer configuration. When opening the project in VSCode, you'll automatically be prompted to reopen in this container.

Contributing

Contributions are always welcome! Feel free to open an issue, pull request, etc. We're happy to help you get started. The following are rough instructions for specifically adding additional datasets or pipelines.

Datasets

Datasets are easy to add: simply drop your file into the python/evalio/datasets folder and add it to the __init__.py file.

Pipelines

Adding a Python pipeline is nearly identical to adding a dataset: drop your file into the python/evalio/pipelines folder and add it to the __init__.py file.

C++ pipelines are more involved (but probably worth the effort). Your header file belongs in the cpp/bindings/pipelines folder. To get it to build, make sure it's added to clone_pipelines.sh, the proper CMakeLists.txt, and the bindings.h header. Finally, make sure all dependencies are also added to the docker build script, found in the docker folder.
