
Evaluate Lidar-Inertial Odometry on public datasets


evalio

evalio is a tool for Evaluating Lidar-Inertial Odometry.

Specifically, it provides a common interface for connecting LIO datasets and LIO pipelines. This makes it easy to add new datasets and pipelines, and provides a common place to evaluate them, making benchmarks significantly easier to run. It features:

  • No ROS dependency! (though it can still load rosbag datasets using the wonderful rosbags package)
  • Easy to add new datasets and pipelines, see the example
  • Unified representation of lidar scans: row (scan-line) major order, stamped at the start of the scan, with per-point stamps relative to the scan start
  • Download and manage datasets via the CLI interface
  • Simple to use API for friction-free access to data
  • Run pipelines via the CLI interface and yaml config files
  • Compute statistics for resulting trajectory runs
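
The scan-stamp convention above means per-point timestamps are offsets from the start of the scan. As a minimal sketch (the field names are illustrative, not evalio's exact API), recovering absolute point times is a simple addition:

```python
def absolute_point_times(scan_stamp_sec, relative_stamps_sec):
    """Scans are stamped at their start; point stamps are offsets from it."""
    return [scan_stamp_sec + dt for dt in relative_stamps_sec]

# A scan starting at t=100.0 s, with points spread across the sweep:
times = absolute_point_times(100.0, [0.0, 0.25, 0.5])
# -> [100.0, 100.25, 100.5]
```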

Installation

evalio is available on PyPI, so simply install via your favorite Python package manager,

uv add evalio      # uv
pip install evalio # pip

Usage

evalio can be used as both a Python library and a CLI, for both datasets and pipelines.

Datasets

Once evalio is installed, datasets can be listed and downloaded via the CLI interface. For example, to list all datasets and then download a sequence from the hilti_2022 dataset,

evalio ls datasets
evalio download hilti_2022/basement_2

evalio downloads data to the location given by the EVALIO_DATA environment variable, or, if unset, to the local folder ./evalio_data. All trajectories in a dataset can also be downloaded using the wildcard hilti_2022/*, escaping the asterisk as needed.
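
For scripted experiments, the data location can also be set from Python before evalio is used. A small sketch, assuming evalio reads EVALIO_DATA when data is accessed (the path is hypothetical):

```python
import os

# Point evalio at a shared data directory before use,
# without clobbering a value already set in the shell.
os.environ.setdefault("EVALIO_DATA", "/data/evalio")

print(os.environ["EVALIO_DATA"])
```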

[!TIP] evalio also comes with autocomplete, which makes typing the long dataset and pipeline names much easier. To install, do one of the following,

eval "$(evalio --show-completion)" # install for the current session
evalio --install-completion        # install for all future sessions

[!NOTE] Many datasets use gdown to download data from Google Drive. Unfortunately, this can occasionally be finicky due to Google's download limits; downloading cookies from your browser can often help.

Once downloaded, a trajectory can then be easily used in python,

from evalio.datasets import Hilti2022

# for all data
for mm in Hilti2022.basement_2:
    print(mm)

# for lidars
for scan in Hilti2022.basement_2.lidar():
    print(scan)

# for imu
for imu in Hilti2022.basement_2.imu():
    print(imu)

For example, you can easily grab a single scan and plot a bird's-eye view,

import matplotlib.pyplot as plt
import numpy as np

from evalio.datasets import Hilti2022

# get the 10th scan
scan = Hilti2022.basement_2.get_one_lidar(10)
# always in row-major order, with stamp at start of scan
x = np.array([p.x for p in scan.points])
y = np.array([p.y for p in scan.points])
z = np.array([p.z for p in scan.points])
plt.scatter(x, y, c=z, s=1)
plt.axis('equal')
plt.show()

evalio also comes with a built-in wrapper for converting to rerun types,

import rerun as rr
from evalio.rerun import convert

rr.init("evalio")
rr.connect_tcp()
for scan in Hilti2022.basement_2.lidar():
    rr.set_time_seconds("timeline", seconds=scan.stamp.to_sec())
    rr.log("lidar", convert(scan, color=[255, 0, 255]))

[!NOTE]
To run the rerun visualization, rerun must be installed. This can be done by installing rerun-sdk or evalio[vis] from PyPI.

We recommend checking out the base dataset class for more information on how to interact with datasets.

Pipelines

The other half of evalio is the pipelines that can be run on various datasets. All pipelines and their parameters can be shown via,

evalio ls pipelines

For example, to run KissICP on a dataset,

evalio run -o results -d hilti_2022/basement_2 -p kiss

This will run the pipeline on the dataset and save the results to the results folder. The results can then be used to compute statistics on the trajectory,

evalio stats results
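
evalio computes trajectory statistics for you, but the core idea behind a metric like absolute trajectory error (ATE) is simple. A minimal, self-contained sketch (not evalio's implementation, and ignoring the trajectory alignment step a real evaluation performs first):

```python
import math

def ate_rmse(estimated, ground_truth):
    """RMSE of translation error between time-aligned pose positions.

    Both inputs are lists of (x, y, z) tuples; a real evaluation would
    also align the trajectories (e.g. via Umeyama) before comparing.
    """
    assert len(estimated) == len(ground_truth)
    sq_errors = [
        (ex - gx) ** 2 + (ey - gy) ** 2 + (ez - gz) ** 2
        for (ex, ey, ez), (gx, gy, gz) in zip(estimated, ground_truth)
    ]
    return math.sqrt(sum(sq_errors) / len(sq_errors))

est = [(0.0, 0.0, 0.0), (1.1, 0.0, 0.0), (2.0, 0.1, 0.0)]
gt = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(f"ATE RMSE: {ate_rmse(est, gt):.4f} m")
```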

[!NOTE]
KissICP does poorly by default on hilti_2022/basement_2, due to the close range and large default voxel size. You can see this by adding -vvv to the run command to visualize the trajectory in rerun.

More complex experiments can be run, including varying pipeline parameters, by specifying a config file,

output_dir: ./results/

datasets:
  # Run on all of the hilti 2022 trajectories
  - hilti_2022/*
  # Run on first 1000 scans of multi campus
  - name: multi_campus/ntu_day_01
    length: 1000

pipelines:
  # Run vanilla kiss with default parameters
  - kiss
  # Tweak kiss parameters
  - name: kiss_tweaked
    pipeline: kiss
    deskew: true
    # Some of these datasets need smaller voxel sizes
    sweep:
      voxel_size: [0.1, 0.5, 1.0]
      

This can then be run via

evalio run -c config.yml

That's about the gist of it! Try playing around with the CLI interface to see what else is possible, such as a number of visualization options using rerun. Feel free to open an issue if you have any questions, suggestions, or problems.

Custom Datasets & Pipelines

We understand that internal or work-in-progress datasets and pipelines are often needed, so evalio has full support for them. As mentioned above, we recommend checking out our example for more information on how to do this (it's pretty easy!).

The TL;DR: a custom dataset can be made by inheriting from the Dataset class (Python only), and a custom pipeline by inheriting from the Pipeline class in either C++ or Python. These can then be made available to evalio via the EVALIO_CUSTOM environment variable, pointed at the Python module that contains them.
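
To give a feel for the shape of this, here is a heavily simplified sketch of a custom Python pipeline. The Pipeline base class and its hook names are stubbed and hypothetical here; consult the example repository for the real interface.

```python
# Hypothetical sketch of a custom pipeline. The real evalio Pipeline
# base class and its method names may differ; see the linked example.

class Pipeline:  # stand-in for evalio's base class
    def add_imu(self, imu): ...
    def add_lidar(self, scan): ...

class MyOdometry(Pipeline):
    """A do-nothing odometry that just counts measurements."""

    def __init__(self):
        self.num_imu = 0
        self.num_scans = 0

    def add_imu(self, imu):
        self.num_imu += 1  # a real pipeline would integrate the IMU here

    def add_lidar(self, scan):
        self.num_scans += 1  # a real pipeline would register the scan here

pipe = MyOdometry()
pipe.add_imu(object())
pipe.add_lidar(object())
```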

We highly recommend making a PR to merge your custom datasets or pipelines into evalio once they are ready. This will make it more likely the community will use and cite your work, as well as increase the usefulness of evalio for everyone.

Building from Source

While we recommend simply installing the Python package using your preferred package manager (ours is uv), we've attempted to make building from source as easy as possible. We generally build through scikit-build-core, which provides a simple wrapper for building CMake projects as Python packages. uv is our frontend of choice for this process, but it is also possible via pip,

uv sync          # uv version
pip install -e . # pip version

Of course, building via the usual CMake way is also possible, with the only default dependency being Eigen3,

mkdir build
cd build
cmake ..
make

By default, no pipelines are included, due to their large dependencies. CMake will look for them in the cpp/bindings/pipelines-src directory. If you'd like to add them, simply run the clone_pipelines.sh script, which will clone and patch them appropriately.

When these pipelines are included, the number of dependencies increases significantly, so we have provided a docker image that includes all dependencies for building, as well as a VSCode devcontainer configuration. When opening in VSCode, you'll automatically be prompted to reopen in this container.

Contributing

Contributions are always welcome! Feel free to open an issue, pull request, etc. We're happy to help you get started. The following are rough instructions specifically for adding additional datasets or pipelines.

Datasets

Datasets are easy to add: simply drop your file into the python/evalio/datasets folder and add it to the init file.

Pipelines

Adding a Python pipeline is nearly identical to adding a dataset: drop your file into the python/evalio/pipelines folder and add it to the init file.

C++ pipelines are more involved (but probably worth the effort). Your header file belongs in the cpp/bindings/pipelines folder. To get it to build, make sure it's added to clone_pipelines.sh, the proper CMakeLists.txt, and the bindings.h header. Finally, make sure all dependencies are also added to the docker build script, found in the docker folder.
