Evaluate Lidar-Inertial Odometry on public datasets

evalio

evalio is a tool for Evaluating Lidar-Inertial Odometry.

Specifically, it provides a common interface for connecting LIO datasets and LIO pipelines. This makes it easy to add new datasets and pipelines, and provides a common place to evaluate them, making benchmarks significantly easier to run. It features:

  • No ROS dependency! (though it can still load rosbag datasets using the wonderful rosbags package)
  • Easy to add new datasets and pipelines, see the example
  • Unified representation of lidar scans: row (scan-line) major order, stamped at the start of the scan, with per-point stamps relative to the scan start
  • Download and manage datasets via the CLI interface
  • Simple to use API for friction-free access to data
  • Run pipelines via the CLI interface and yaml config files
  • Compute statistics for resulting trajectory runs
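The scan-stamp convention above is worth internalizing. As a sketch using plain floats (not evalio's actual Stamp/LidarMeasurement types): a scan is stamped at its start, each point carries an offset from that start, and absolute point times are recovered by addition.

```python
# Sketch of the scan-stamp convention, using plain floats rather than
# evalio's actual types.
scan_stamp = 1700000000.0            # seconds: the start of the sweep
point_offsets = [0.0, 0.025, 0.050]  # per-point stamps, relative to scan start

# Absolute time of each point is the scan stamp plus its relative offset.
absolute_stamps = [scan_stamp + dt for dt in point_offsets]
```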

Installation

evalio is available on PyPI, so simply install it via your favorite Python package manager:

uv add evalio      # uv
pip install evalio # pip

Usage

evalio can be used both as a Python library and as a CLI, for datasets and pipelines alike.

Datasets

Once evalio is installed, datasets can be listed and downloaded via the CLI interface. For example, to list all datasets and then download a sequence from the hilti_2022 dataset,

evalio ls datasets
evalio download hilti_2022/basement_2

evalio downloads data to the directory given by the EVALIO_DATA environment variable, or, if unset, to the local folder ./evalio_data. All trajectories in a dataset can be downloaded with the wildcard hilti_2022/*, escaping the asterisk as needed.
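The lookup order described above can be sketched in a few lines (illustrative only; evalio's internal logic may differ):

```python
import os
from pathlib import Path

# Resolve the download directory: EVALIO_DATA if set, else ./evalio_data.
def data_dir() -> Path:
    return Path(os.environ.get("EVALIO_DATA", "./evalio_data"))
```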

[!NOTE] Many datasets use gdown to download data from Google Drive. Unfortunately, this can occasionally be finicky due to Google's download limits; loading cookies from your browser can often help.

Once downloaded, a trajectory can then be easily used in python,

from evalio.datasets import Hilti2022

# for all data
for mm in Hilti2022.basement_2:
    print(mm)

# for lidars
for scan in Hilti2022.basement_2.lidar():
    print(scan)

# for imu
for imu in Hilti2022.basement_2.imu():
    print(imu)

For example, you can easily grab a single scan to plot a bird's-eye view,

import matplotlib.pyplot as plt
import numpy as np

# get the 10th scan
scan = Hilti2022.basement_2.get_one_lidar(10)
# always in row-major order, with stamp at start of scan
x = np.array([p.x for p in scan.points])
y = np.array([p.y for p in scan.points])
z = np.array([p.z for p in scan.points])
plt.scatter(x, y, c=z, s=1)
plt.axis('equal')
plt.show()

evalio also comes with a built-in wrapper for converting to rerun types,

import rerun as rr
from evalio.rerun import convert

rr.init("evalio")
rr.connect_tcp()
for scan in Hilti2022.basement_2.lidar():
    rr.set_time_seconds("timeline", seconds=scan.stamp.to_sec())
    rr.log("lidar", convert(scan, color=[255, 0, 255]))

[!NOTE]
To run the rerun visualization, rerun must be installed. This can be done by installing rerun-sdk or evalio[vis] from PyPi.

We recommend checking out the base dataset class for more information on how to interact with datasets.

Pipelines

The other half of evalio is the pipelines that can be run on various datasets. All pipelines and their parameters can be shown via,

evalio ls pipelines

For example, to run KissICP on a dataset,

evalio run -o results -d hilti_2022/basement_2 -p kiss

This will run the pipeline on the dataset and save the results to the results folder. The results can then be used to compute statistics on the trajectory,

evalio stats results
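To give a flavor of what such statistics look like, here is a minimal sketch of one common metric, the RMSE of absolute translation error between an estimated and a ground-truth trajectory (illustrative only; evalio's exact metrics and alignment steps may differ):

```python
import math

# Toy 2D trajectories: ground-truth and estimated positions at matched stamps.
gt  = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
est = [(0.0, 0.1), (1.0, -0.1), (2.0, 0.1)]

# Squared translation error at each pose, then root-mean-square.
sq_err = [(gx - ex) ** 2 + (gy - ey) ** 2 for (gx, gy), (ex, ey) in zip(gt, est)]
ate_rmse = math.sqrt(sum(sq_err) / len(sq_err))  # 0.1 for this toy data
```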

[!NOTE]
KissICP performs poorly by default on hilti_2022/basement_2, due to the close range and large default voxel size. You can see this by adding -vvv to the run command to visualize the trajectory in rerun.

More complex experiments can be run, including varying pipeline parameters, by specifying a config file,

output_dir: ./results/

datasets:
  # Run on all hilti_2022 trajectories
  - hilti_2022/*
  # Run on the first 1000 scans of multi_campus
  - name: multi_campus/ntu_day_01
    length: 1000

pipelines:
  # Run vanilla kiss with default parameters
  - kiss
  # Tweak kiss parameters
  - name: kiss_tweaked
    pipeline: kiss
    deskew: true
    # Some of these datasets need smaller voxel sizes
    sweep:
      voxel_size: [0.1, 0.5, 1.0]
      

This can then be run via

evalio run -c config.yml
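For intuition, the sweep key above expands each listed value into its own run. A sketch of that expansion (evalio's actual logic may differ):

```python
import itertools

# Expand a sweep dict into one parameter set per combination of values.
sweep = {"voxel_size": [0.1, 0.5, 1.0]}
runs = [dict(zip(sweep, combo)) for combo in itertools.product(*sweep.values())]
# One run per voxel size:
# [{'voxel_size': 0.1}, {'voxel_size': 0.5}, {'voxel_size': 1.0}]
```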

That's about the gist of it! Try playing around with the CLI interface to see what else is possible, such as a number of visualization options using rerun. Feel free to open an issue if you have any questions, suggestions, or problems.

It should also be mentioned that shell autocomplete can be enabled via argcomplete,

eval "$(register-python-argcomplete evalio)"

This is especially useful when specifying datasets for downloading or running, as their names can get particularly long.

Custom Datasets & Pipelines

We understand that using internal or work-in-progress datasets and pipelines is often necessary, so evalio has full support for this. As mentioned above, we recommend checking out our example for more information on how to do this (it's pretty easy!).

The TL;DR version: a custom dataset can be made by inheriting from the Dataset class (Python only), and a custom pipeline by inheriting from the Pipeline class in either C++ or Python. These can then be made available to evalio via the EVALIO_CUSTOM environment variable, pointed at the Python module that contains them.
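As a rough shape (hypothetical names throughout; consult the example and the Pipeline base class for the real interface), a custom Python pipeline module looks something like:

```python
# Hypothetical skeleton of a custom pipeline module. In practice this class
# would inherit from evalio's Pipeline base class; the method names here are
# illustrative, not the real API.
class MyPipeline:
    @staticmethod
    def name() -> str:
        # Identifier used on the CLI, e.g. with the -p flag
        return "my_pipeline"

    def add_imu(self, imu) -> None:
        ...  # feed an IMU measurement into the odometry

    def add_lidar(self, scan) -> None:
        ...  # feed a lidar scan and update the pose estimate
```

Pointing EVALIO_CUSTOM at the module containing such a class is what makes it discoverable.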

We highly recommend making a PR to merge your custom datasets or pipelines into evalio once they are ready. This will make it more likely the community will use and cite your work, as well as increase the usefulness of evalio for everyone.

Building from Source

While we recommend simply installing the Python package using your preferred package manager (ours is uv), we've attempted to make building from source as easy as possible. We generally build through scikit-build-core, which provides a simple wrapper for building CMake projects as Python packages. uv is our frontend of choice for this process, but it is also possible via pip,

uv sync          # uv version
pip install -e . # pip version

Of course, building via the usual CMake way is also possible, with the only default dependency being Eigen3,

mkdir build
cd build
cmake ..
make

By default, no pipelines are included due to their large dependencies. CMake looks for them in the cpp/bindings/pipelines-src directory; to add them, simply run the clone_pipelines.sh script, which will clone and patch them appropriately.

When these pipelines are included, the number of dependencies increases significantly, so we have provided a docker image that includes all dependencies for building, as well as a VSCode devcontainer configuration. When opening the project in VSCode, you'll automatically be prompted to reopen it in this container.

Contributing

Contributions are always welcome! Feel free to open an issue, pull request, etc. We're happy to help you get started. The following are rough instructions specifically for adding additional datasets or pipelines.

Datasets

Datasets are easy to add: simply drop your file into the python/evalio/datasets folder and add it to the init file.

Pipelines

Adding a Python pipeline is nearly identical to adding a dataset: drop your file into the python/evalio/pipelines folder and add it to the init file.

C++ pipelines are more involved (but probably worth the effort). Your header file belongs in the cpp/bindings/pipelines folder. To get it to build, make sure it's added to clone_pipelines.sh, the proper CMakeLists.txt, and the bindings.h header. Finally, make sure all dependencies are also added to the docker build script, found in the docker folder.
