
Evaluate Lidar-Inertial Odometry on public datasets


evalio

evalio is a tool for Evaluating Lidar-Inertial Odometry.

Specifically, it provides a common interface for connecting LIO datasets and LIO pipelines. This makes it easy to add new datasets and pipelines, and gives them a common place to be evaluated, which makes benchmarks significantly easier to run. It features:

  • No ROS dependency! (though it can still load rosbag datasets using the wonderful rosbags package)
  • Easy addition of new datasets and pipelines; see the example
  • Unified representation of lidar scans: row (scan-line) major order, stamped at the start of the scan, with point stamps relative to the start of the scan
  • Download and manage datasets via the CLI interface
  • Simple-to-use API for friction-free access to data
  • Run pipelines via the CLI interface and yaml config files
  • Compute statistics for resulting trajectory runs
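As a sketch of what the unified scan convention means in practice, here is a toy version (with hypothetical stand-in types, not evalio's actual API): each point carries a stamp relative to the start of the scan, so recovering a point's absolute time is just the scan stamp plus the point's offset.

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins illustrating the convention: row-major points,
# scan stamped at its start, point stamps relative to that start.
@dataclass
class Point:
    x: float
    y: float
    z: float
    t: float  # seconds since the start of the scan

@dataclass
class LidarScan:
    stamp: float  # absolute time of the start of the scan
    points: list[Point] = field(default_factory=list)

    def absolute_stamps(self) -> list[float]:
        # absolute time of each point = scan start + relative offset
        return [self.stamp + p.t for p in self.points]

scan = LidarScan(stamp=100.0, points=[Point(1, 0, 0, 0.00), Point(0, 1, 0, 0.05)])
print(scan.absolute_stamps())  # absolute times for both points
```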

Installation

evalio is available on PyPI, so simply install via your favorite Python package manager,

uv add evalio      # uv
pip install evalio # pip

Usage

evalio can be used both as a Python library and as a CLI, for datasets and pipelines alike.

Datasets

Once evalio is installed, datasets can be listed and downloaded via the CLI interface. For example, to list all datasets and then download a sequence from the hilti_2022 dataset,

evalio ls datasets
evalio download hilti_2022/basement_2

evalio downloads data to the directory given by the EVALIO_DATA environment variable, or, if it is unset, to the local folder ./evalio_data. All the trajectories in a dataset can also be downloaded using the wildcard hilti_2022/*, making sure to escape the asterisk as needed.
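The lookup order described above can be sketched in a few lines (a toy reimplementation for illustration, not evalio's actual code):

```python
import os
from pathlib import Path

def data_root() -> Path:
    # Mirrors the lookup described above: EVALIO_DATA if set,
    # otherwise ./evalio_data in the current directory.
    return Path(os.environ.get("EVALIO_DATA", "./evalio_data"))

os.environ.pop("EVALIO_DATA", None)
print(data_root())  # evalio_data
os.environ["EVALIO_DATA"] = "/data/lidar"
print(data_root())  # /data/lidar
```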

[!TIP] evalio also comes with autocomplete, which makes typing the long dataset and pipeline names much easier. To install, do one of the following,

eval "$(evalio --show-completion)" # install for the current session
evalio --install-completion        # install for all future sessions

[!NOTE] Many datasets use gdown to download data from Google Drive. Unfortunately, this can occasionally be finicky due to Google's download limits; downloading cookies from your browser can often help.

Once downloaded, a trajectory can then be easily used in python,

from evalio.datasets import Hilti2022

# for all data
for mm in Hilti2022.basement_2:
    print(mm)

# for lidars
for scan in Hilti2022.basement_2.lidar():
    print(scan)

# for imu
for imu in Hilti2022.basement_2.imu():
    print(imu)

For example, you can easily get a single scan to plot a bird's-eye view,

import matplotlib.pyplot as plt
import numpy as np

from evalio.datasets import Hilti2022

# get the 10th scan
scan = Hilti2022.basement_2.get_one_lidar(10)
# always in row-major order, with stamp at start of scan
x = np.array([p.x for p in scan.points])
y = np.array([p.y for p in scan.points])
z = np.array([p.z for p in scan.points])
plt.scatter(x, y, c=z, s=1)
plt.axis('equal')
plt.show()

evalio also comes with a built-in wrapper for converting to rerun types,

import rerun as rr
from evalio.rerun import convert

rr.init("evalio")
rr.connect_tcp()
for scan in Hilti2022.basement_2.lidar():
    rr.set_time_seconds("timeline", seconds=scan.stamp.to_sec())
    rr.log("lidar", convert(scan, color=[255, 0, 255]))

[!NOTE]
To run the rerun visualization, rerun must be installed. This can be done by installing rerun-sdk or evalio[vis] from PyPI.

We recommend checking out the base dataset class for more information on how to interact with datasets.

Pipelines

The other half of evalio is the pipelines that can be run on various datasets. All pipelines and their parameters can be shown via,

evalio ls pipelines

For example, to run KissICP on a dataset,

evalio run -o results -d hilti_2022/basement_2 -p kiss

This will run the pipeline on the dataset and save the results to the results folder. The results can then be used to compute statistics on the trajectory,

evalio stats results
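The statistics themselves boil down to comparing the estimated trajectory against ground truth. As a rough, self-contained sketch of one common metric, absolute trajectory error (RMSE over positions, assuming the two trajectories are already aligned and time-associated):

```python
import numpy as np

def ate_rmse(est: np.ndarray, gt: np.ndarray) -> float:
    """RMSE of per-pose position error between two (N, 3) trajectories."""
    errors = np.linalg.norm(est - gt, axis=1)
    return float(np.sqrt(np.mean(errors**2)))

# Toy trajectories: ground truth along x, estimate off by 0.1 m per pose
gt = np.array([[0.0, 0, 0], [1, 0, 0], [2, 0, 0]])
est = gt + np.array([[0.0, 0.1, 0], [0, -0.1, 0], [0, 0.1, 0]])
print(ate_rmse(est, gt))  # ~0.1
```

Note this is only one of the metrics a tool like `evalio stats` might report; relative errors over fixed-length segments are also common.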

[!NOTE]
KissICP does poorly by default on hilti_2022/basement_2, due to the close range and large default voxel size. You can see this by adding -vvv to the run command to visualize the trajectory in rerun.

More complex experiments, including varying pipeline parameters, can be run by specifying a config file,

output_dir: ./results/

datasets:
  # Run on all of the hilti_2022 trajectories
  - hilti_2022/*
  # Run on first 1000 scans of multi campus
  - name: multi_campus/ntu_day_01
    length: 1000

pipelines:
  # Run vanilla kiss with default parameters
  - kiss
  # Tweak kiss parameters
  - name: kiss_tweaked
    pipeline: kiss
    deskew: true
    # Some of these datasets need smaller voxel sizes
    sweep:
      voxel_size: [0.1, 0.5, 1.0]
      

This can then be run via

evalio run -c config.yml
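The sweep key expands into one run per listed value (a cartesian product when several parameters are swept). Conceptually, something like the following; the run-naming scheme here is made up for illustration:

```python
from itertools import product

def expand_sweep(name: str, base: dict, sweep: dict) -> list[dict]:
    """Expand a sweep block into one concrete config per value combination."""
    keys = list(sweep)
    configs = []
    for values in product(*(sweep[k] for k in keys)):
        # merge swept values over the base parameters
        cfg = dict(base, **dict(zip(keys, values)))
        cfg["name"] = f"{name}_" + "_".join(str(v) for v in values)
        configs.append(cfg)
    return configs

runs = expand_sweep("kiss_tweaked", {"pipeline": "kiss", "deskew": True},
                    {"voxel_size": [0.1, 0.5, 1.0]})
for r in runs:
    print(r["name"], r["voxel_size"])
# kiss_tweaked_0.1 0.1
# kiss_tweaked_0.5 0.5
# kiss_tweaked_1.0 1.0
```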

That's about the gist of it! Try playing around with the CLI interface to see what else is possible, such as the number of visualization options using rerun. Feel free to open an issue if you have any questions, suggestions, or problems.

Custom Datasets & Pipelines

We understand that using internal or work-in-progress datasets and pipelines will often be needed, so evalio has full support for this. As mentioned above, we recommend checking out our example for more information on how to do this (it's pretty easy!).

The TL;DR: a custom dataset can be made by inheriting from the Dataset class in Python only, and a custom pipeline by inheriting from the Pipeline class in either C++ or Python. These can then be made available to evalio via the EVALIO_CUSTOM environment variable pointing to the Python module that contains them.
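A pipeline skeleton might look something like this. Note that the base class here is a stand-in and the method names are hypothetical; check the example for evalio's real Pipeline interface:

```python
from abc import ABC, abstractmethod

# Stand-in base class: the real one is evalio's Pipeline, and the exact
# method names below (add_lidar, add_imu) are hypothetical -- see the
# evalio example for the true interface.
class Pipeline(ABC):
    @abstractmethod
    def add_lidar(self, scan): ...
    @abstractmethod
    def add_imu(self, imu): ...

class MyOdometry(Pipeline):
    """A do-nothing pipeline that just counts measurements."""
    def __init__(self):
        self.n_scans = 0
        self.n_imu = 0

    def add_lidar(self, scan):
        self.n_scans += 1

    def add_imu(self, imu):
        self.n_imu += 1

p = MyOdometry()
p.add_imu(object())
p.add_lidar(object())
print(p.n_scans, p.n_imu)  # 1 1
```

A module containing a class like this would then be pointed to by EVALIO_CUSTOM to make it visible to the CLI.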

We highly recommend making a PR to merge your custom datasets or pipelines into evalio once they are ready. This makes it more likely that the community will use and cite your work, and it increases the usefulness of evalio for everyone.

Building from Source

While we recommend simply installing the Python package using your preferred Python package manager (ours is uv), we've attempted to make building from source as easy as possible. We generally build through scikit-build-core, which provides a simple wrapper for building CMake projects as Python packages. uv is our frontend of choice for this process, but it is also possible via pip,

uv sync          # uv version
pip install -e . # pip version

Of course, building via the usual CMake way is also possible, with the only default dependency being Eigen3,

mkdir build
cd build
cmake ..
make

By default, no pipelines are included, due to their large dependencies. CMake will look for them in the cpp/bindings/pipelines-src directory. If you'd like to add them, simply run the clone_pipelines.sh script, which will clone and patch them appropriately.

When these pipelines are included, the number of dependencies increases significantly, so we have provided a docker image that includes all dependencies for building, as well as a VSCode devcontainer configuration. When opening in VSCode, you'll automatically be prompted to open in this container.

Contributing

Contributions are always welcome! Feel free to open an issue, pull request, etc. We're happy to help you get started. The following are rough instructions for specifically adding additional datasets or pipelines.

Datasets

Datasets are easy to add: simply drop your file into the python/evalio/datasets folder and add it to the init file.

Pipelines

Adding a Python pipeline is nearly identical to adding a dataset: drop your file into the python/evalio/pipelines folder and add it to the init file.

C++ pipelines are more involved (but probably worth the effort). Your header file belongs in the cpp/bindings/pipelines folder. To get it to build, make sure it's added to clone_pipelines.sh, the proper CMakeLists.txt, and the bindings.h header. Finally, make sure all dependencies are also added to the docker build script, found in the docker folder.

Download files

Source Distributions

No source distribution files are available for this release.

Built Distributions

  • evalio-0.2.0-cp313-cp313-manylinux_2_28_x86_64.whl (3.4 MB): CPython 3.13, manylinux glibc 2.28+, x86-64
  • evalio-0.2.0-cp313-cp313-macosx_11_0_x86_64.whl (2.4 MB): CPython 3.13, macOS 11.0+, x86-64
  • evalio-0.2.0-cp313-cp313-macosx_11_0_arm64.whl (2.1 MB): CPython 3.13, macOS 11.0+, ARM64
  • evalio-0.2.0-cp312-cp312-manylinux_2_28_x86_64.whl (3.4 MB): CPython 3.12, manylinux glibc 2.28+, x86-64
  • evalio-0.2.0-cp312-cp312-macosx_11_0_x86_64.whl (2.4 MB): CPython 3.12, macOS 11.0+, x86-64
  • evalio-0.2.0-cp312-cp312-macosx_11_0_arm64.whl (2.1 MB): CPython 3.12, macOS 11.0+, ARM64
  • evalio-0.2.0-cp311-cp311-manylinux_2_28_x86_64.whl (3.4 MB): CPython 3.11, manylinux glibc 2.28+, x86-64
  • evalio-0.2.0-cp311-cp311-macosx_11_0_x86_64.whl (2.4 MB): CPython 3.11, macOS 11.0+, x86-64
  • evalio-0.2.0-cp311-cp311-macosx_11_0_arm64.whl (2.1 MB): CPython 3.11, macOS 11.0+, ARM64

File details

Each wheel carries an attestation bundle published by release.yml on contagon/evalio. Attestation values reflect the state when the release was signed and may no longer be current. SHA256 digests for each wheel (MD5 and BLAKE2b-256 digests are also available on PyPI):

  • evalio-0.2.0-cp313-cp313-manylinux_2_28_x86_64.whl: c9038875d61ce6c348aee0a83426219d2b87c4a306bbee534723ba0bdff6329c
  • evalio-0.2.0-cp313-cp313-macosx_11_0_x86_64.whl: 3f8fedfd9d09279b89af8ba86c4fe5a951b4bb573314b831726b52a29527fe8d
  • evalio-0.2.0-cp313-cp313-macosx_11_0_arm64.whl: 764a5d3cadb294e416970ff70c0f4c1bc9d6970c4fcb3502e3b1677c8281c180
  • evalio-0.2.0-cp312-cp312-manylinux_2_28_x86_64.whl: 889d15878022bffb1cd2f050dae11a2097b3cf00b6b8d1079ac58baa1d6626c5
  • evalio-0.2.0-cp312-cp312-macosx_11_0_x86_64.whl: e7aaf88f286a7124ff38750d68b75d5b9eed647c6ff370690909360872aaa3cc
  • evalio-0.2.0-cp312-cp312-macosx_11_0_arm64.whl: 3ed64314f94f1984fc14e8d5bbfe0b19e6f097b1fee9835512ef50620b475f1b
  • evalio-0.2.0-cp311-cp311-manylinux_2_28_x86_64.whl: 09949e0b705c0d434a0de1a78a94471ed4d0aab3341ff2cf45b7e3aa8f1cb465
  • evalio-0.2.0-cp311-cp311-macosx_11_0_x86_64.whl: 1366cb4316c3c210a003bc43a76aa6a1c11799a41e16c7e9aad1f0b357dd6f48
  • evalio-0.2.0-cp311-cp311-macosx_11_0_arm64.whl: 1a43431ab87d5de181faef21ed6932f245c6e3c325658388c2126fa85f3d537c
