
A performant Structure from Motion library for Python


pyTheia - A Python Structure-from-Motion and Geometric Vision Swiss Knife

pyTheia is based on TheiaSfM. It contains Python bindings for most of TheiaSfM's functionality, and more.

The library is still in active development, and the interfaces are not yet fully fixed.

With pyTheia you have access to a variety of camera models, structure-from-motion pipelines, and geometric vision algorithms.

Differences from the original TheiaSfM library

pyTheia does not aim to be an end-to-end SfM library. For example, building robust feature detection and matching pipelines is usually application- and data-specific (e.g. image resolution, runtime, pose priors, invariances, ...). The same goes for image pre- and post-processing.

pyTheia is rather a "Swiss knife" for quickly prototyping SfM-related reconstruction applications without sacrificing performance. For example, state-of-the-art feature detection & matching and place recognition algorithms are based on deep learning and are easily used from Python. Using such algorithms from a C++ library, however, is not always straightforward, and quick testing and prototyping in particular is cumbersome.

Dependency changes

Compared to the original TheiaSfM:

  • SuiteSparse: Optional for Ceres; GPL-dependent code was removed in src/math/matrix/sparse_cholesky_llt.cc (cholmod -> Eigen::SimplicialLDLT), which may be slower for very large problems and slightly less stable numerically.
  • RapidJSON: No separate dependency; RapidJSON is vendored via cereal headers.
  • RocksDB: Feature/match database remains available but is optional behind the WITH_ROCKSDB CMake option.
  • OpenImageIO: Still used internally for image I/O.

Changes to the original TheiaSfM library

  • Global SfM algorithms:
    • LiGT position solver
    • Lagrange Dual rotation estimator
    • Hybrid rotation estimator
    • Possibility to fix multiple views in the Robust_L1L2 solver
    • The nonlinear translation solver can fix multiple views or estimate all remaining views in the reconstruction
  • Camera models
    • Double Sphere
    • Extended Unified
    • Orthographic
  • Bundle adjustment
    • Using a homogeneous representation for scene points
    • Extracting covariance information
    • Possibility to add a depth prior to 3D points
    • Position prior for camera poses (e.g. for GPS or known positions)
  • General
    • Added timestamp, position_prior_, and position_prior_sqrt_information_ variables to the View class
    • Added inverse_depth_, reference_descriptor, and reference_bearing_ variables to the Track class
    • Added covariance_, depth_prior_, and depth_prior_variance_ variables to the Feature class
  • Absolute Pose solvers
    • SQPnP
    • UncalibratedPlanarOrthographic Pose
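For intuition, the Double Sphere camera model listed above can be sketched in a few lines of NumPy. This follows the published Double Sphere projection equations (Usenko et al.), not pyTheia's own implementation; note that with ξ = 0 and α = 0 it degenerates to a plain pinhole projection:

```python
import numpy as np

def project_double_sphere(X, fx, fy, cx, cy, xi, alpha):
    """Project a 3D point X (camera frame) with the Double Sphere model."""
    x, y, z = X
    d1 = np.sqrt(x * x + y * y + z * z)      # distance to the first sphere center
    zs = xi * d1 + z                         # shift along the optical axis
    d2 = np.sqrt(x * x + y * y + zs * zs)    # distance to the second sphere center
    denom = alpha * d2 + (1.0 - alpha) * zs
    return np.array([fx * x / denom + cx, fy * y / denom + cy])

# with xi = 0 and alpha = 0 this is the ordinary pinhole projection
uv = project_double_sphere([1.0, 2.0, 4.0], 100.0, 100.0, 50.0, 50.0, 0.0, 0.0)
```

The two extra parameters (ξ, α) give the model its ability to fit wide-angle and fisheye lenses while staying cheap to evaluate.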

Usage Examples

Full reconstruction example: Global, Hybrid or Incremental SfM using OpenCV feature detection and matching

Have a look at the short example: sfm_pipeline.py. Download the south_building dataset from here. Extract it somewhere and run:

python pytests/sfm_pipeline.py --image_path /path/to/south-building/images/

Creating a camera

The following example shows how to create a camera in pyTheia. You can construct it from a pt.sfm.CameraIntrinsicsPrior(), or set all parameters using the respective functions of the pt.sfm.Camera() class.

import pytheia as pt
prior = pt.sfm.CameraIntrinsicsPrior()
prior.focal_length.value = [1000.]
prior.aspect_ratio.value = [1.]
prior.principal_point.value = [500., 500.]
prior.radial_distortion.value = [0., 0., 0., 0.]
prior.tangential_distortion.value = [0., 0.]
prior.skew.value = [0.]
prior.camera_intrinsics_model_type = 'PINHOLE'
#'PINHOLE', 'DOUBLE_SPHERE', 'EXTENDED_UNIFIED', 'FISHEYE', 'FOV', 'DIVISION_UNDISTORTION'
camera = pt.sfm.Camera()
camera.SetFromCameraIntrinsicsPriors(prior)

# the camera object also carries extrinsics information
camera.SetPosition([0,0,-2])
camera.SetOrientationFromAngleAxis([0,0,0.1])

# project with intrinsics image to camera coordinates
camera_intrinsics = camera.CameraIntrinsics()
pt2 = [100.,100.]
pt3 = camera_intrinsics.ImageToCameraCoordinates(pt2)
pt2 = camera_intrinsics.CameraToImageCoordinates(pt3)

# project with camera extrinsics
pt3_h = [1,1,2,1] # homogeneous 3d point
depth, pt2 = camera.ProjectPoint(pt3_h)
# get a ray from camera to 3d point in the world frame
ray = camera.PixelToUnitDepthRay(pt2)
pt3_h_ = ray*depth + camera.GetPosition() # == pt3_h[:3]
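Conceptually, the extrinsic round trip at the end of this snippet can be reproduced with plain NumPy. The sketch below is a simplified pinhole-only version using the same intrinsics as above; the helper names are hypothetical, not pyTheia API:

```python
import numpy as np

def angle_axis_to_rotation(aa):
    """Rodrigues' formula: angle-axis vector -> world-to-camera rotation matrix."""
    aa = np.asarray(aa, dtype=float)
    theta = np.linalg.norm(aa)
    if theta < 1e-12:
        return np.eye(3)
    k = aa / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def project_point(X, R, position, f, pp):
    """World point -> (depth, pixel) for a pinhole camera at `position`."""
    Xc = R @ (np.asarray(X, dtype=float) - position)
    return Xc[2], f * Xc[:2] / Xc[2] + pp

def pixel_to_unit_depth_ray(uv, R, f, pp):
    """World-frame ray that reaches depth 1 in the camera frame."""
    xc = np.array([(uv[0] - pp[0]) / f, (uv[1] - pp[1]) / f, 1.0])
    return R.T @ xc

R = angle_axis_to_rotation([0.0, 0.0, 0.1])
position = np.array([0.0, 0.0, -2.0])
f, pp = 1000.0, np.array([500.0, 500.0])

depth, uv = project_point([1.0, 1.0, 2.0], R, position, f, pp)
ray = pixel_to_unit_depth_ray(uv, R, f, pp)
X_back = ray * depth + position   # recovers the original 3D point
```

Since ray * depth equals R.T @ Xc = X - position, adding the camera position back recovers the world point exactly, which is the identity the original snippet relies on.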

Solve for absolute or relative camera pose

pyTheia integrates many performant geometric vision algorithms. Have a look at the tests.

import pytheia as pt

# absolute pose
pose = pt.sfm.PoseFromThreePoints(pts2D, pts3D) # Kneip
pose = pt.sfm.FourPointsPoseFocalLengthRadialDistortion(pts2D, pts3D)
pose = pt.sfm.FourPointPoseAndFocalLength(pts2D, pts3D)
pose = pt.sfm.DlsPnp(pts2D, pts3D)
... and more

# relative pose
pose = pt.sfm.NormalizedEightPointFundamentalMatrix(pts2D, pts2D)
pose = pt.sfm.FourPointHomography(pts2D, pts2D)
pose = pt.sfm.FivePointRelativePose(pts2D, pts2D)
pose = pt.sfm.SevenPointFundamentalMatrix(pts2D, pts2D)
... and more
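As a reference for what these minimal relative-pose solvers compute, here is a pure-NumPy sketch of the classic normalized eight-point algorithm for the fundamental matrix (Hartley normalization, nullspace extraction, rank-2 projection). This is illustrative only, not pyTheia's implementation:

```python
import numpy as np

def _normalize(pts):
    """Hartley normalization: zero mean, average distance sqrt(2)."""
    pts = np.asarray(pts, dtype=float)
    c = pts.mean(axis=0)
    scale = np.sqrt(2.0) / np.sqrt(((pts - c) ** 2).sum(axis=1)).mean()
    T = np.array([[scale, 0.0, -scale * c[0]],
                  [0.0, scale, -scale * c[1]],
                  [0.0, 0.0, 1.0]])
    ph = np.column_stack([pts, np.ones(len(pts))])
    return (T @ ph.T).T, T

def eight_point_fundamental(p1, p2):
    """Estimate F with x2^T F x1 = 0 from >= 8 point correspondences."""
    x1, T1 = _normalize(p1)
    x2, T2 = _normalize(p2)
    # each correspondence gives one linear equation in the 9 entries of F
    A = np.column_stack([
        x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
        x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
        x1[:, 0], x1[:, 1], np.ones(len(x1))])
    F = np.linalg.svd(A)[2][-1].reshape(3, 3)   # null vector of A
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0                                  # enforce the rank-2 constraint
    F = T2.T @ (U @ np.diag(S) @ Vt) @ T1       # undo the normalization
    return F / np.linalg.norm(F)
```

The normalization step is what makes the linear solve numerically well-conditioned for pixel-scale coordinates; production solvers such as those bound in pt.sfm add degeneracy checks and robust estimation on top.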

# ransac estimation
params = pt.solvers.RansacParameters()
params.error_thresh = 0.1
params.max_iterations = 100
params.failure_probability = 0.01

# absolute pose ransac
correspondences2D3D = pt.matching.FeatureCorrespondence2D3D(
  pt.sfm.Feature(point1), pt.sfm.Feature(point2))

pnp_type = pt.sfm.PnPType.DLS  # or pt.sfm.PnPType.SQPnP, pt.sfm.PnPType.KNEIP
success, abs_ori, summary = pt.sfm.EstimateCalibratedAbsolutePose(
  params, pt.sfm.RansacType(0), pnp_type, correspondences2D3D)

success, abs_ori, summary = pt.sfm.EstimateAbsolutePoseWithKnownOrientation(
  params, pt.sfm.RansacType(0), correspondences2D3D)
... and more
# relative pose ransac
correspondences2D2D = pt.matching.FeatureCorrespondence(
            pt.sfm.Feature(point1), pt.sfm.Feature(point2))

success, rel_ori, summary = pt.sfm.EstimateRelativePose(
        params, pt.sfm.RansacType(0), correspondences2D2D)

success, rad_homog, summary = pt.sfm.EstimateRadialHomographyMatrix(
        params, pt.sfm.RansacType(0), correspondences2D2D)  

success, fund_mat, summary = pt.sfm.EstimateFundamentalMatrix(
        params, pt.sfm.RansacType(0), correspondences2D2D)
... and more
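The failure_probability and max_iterations parameters above are linked by the standard RANSAC iteration bound N = ceil(log(p_fail) / log(1 - w^s)), where w is the expected inlier ratio and s the minimal sample size. A quick sketch of that relationship (generic RANSAC theory, not pyTheia code):

```python
import math

def ransac_iterations(failure_probability, inlier_ratio, sample_size):
    """Iterations needed to draw at least one all-inlier minimal sample
    with probability 1 - failure_probability."""
    return math.ceil(math.log(failure_probability)
                     / math.log(1.0 - inlier_ratio ** sample_size))

# e.g. a 3-point pose solver at 50 % inliers and 1 % failure probability
n = ransac_iterations(0.01, 0.5, 3)
```

This explains why minimal solvers (small s) are so valuable: the required iteration count grows quickly as the sample size increases or the inlier ratio drops.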

Bundle Adjustment of views or points

import numpy as np
import pytheia as pt

recon = pt.sfm.Reconstruction()
# add some views and points
view_id = recon.AddView()
...
track_id = recon.AddTrack()
...
covariance = np.eye(2) * 0.5**2
point = [200,200]
recon.AddObservation(track_id, view_id, pt.sfm.Feature(point, covariance))

# robust BA
opts = pt.sfm.BundleAdjustmentOptions()
opts.robust_loss_width = 1.345
opts.loss_function_type = pt.sfm.LossFunctionType.HUBER

res = pt.sfm.BundleAdjustReconstruction(opts, recon)
res = pt.sfm.BundleAdjustPartialReconstruction(opts, {view_ids}, {track_ids}, recon)
res = pt.sfm.BundleAdjustPartialViewsConstant(opts, {var_view_ids}, {const_view_ids}, recon)

# optimize absolute pose on normalized 2D 3D correspondences
res = pt.sfm.OptimizeAbsolutePoseOnNormFeatures(
  [pt.sfm.FeatureCorrespondence2D3D], R_init, p_init, opts)

# bundle camera adjust pose only
res = pt.sfm.BundleAdjustView(recon, opts, view_id)
res = pt.sfm.BundleAdjustViewWithCov(recon, opts, view_id)
res = pt.sfm.BundleAdjustViewsWithCov(recon, opts, [view_id1, view_id2])

# optimize structure only
res = pt.sfm.BundleAdjustTrack(recon, opts, track_id)
res = pt.sfm.BundleAdjustTrackWithCov(recon, opts, track_id)
res = pt.sfm.BundleAdjustTracksWithCov(recon, opts, [track_id1, track_id2])

# two view optimization
res = pt.sfm.BundleAdjustTwoViewsAngular(recon, [pt.sfm.FeatureCorrespondence], pt.sfm.TwoViewInfo())
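The robust_loss_width of 1.345 set above is the classic Huber tuning constant (about 95 % statistical efficiency under Gaussian noise). On a scalar residual, the Huber loss is quadratic inside the width and linear outside, which is what down-weights outlier observations during bundle adjustment (an illustrative sketch of the textbook formula, not the Ceres implementation used internally):

```python
def huber(residual, width=1.345):
    """Quadratic for |r| <= width, linear beyond: outliers get bounded influence."""
    r = abs(residual)
    if r <= width:
        return 0.5 * r * r
    return width * (r - 0.5 * width)
```

Compared with a plain squared loss, a grossly wrong match contributes a cost growing linearly rather than quadratically, so a few bad observations cannot dominate the optimization.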

Export to Nerfstudio and SDFStudio

You can export a pt.sfm.Reconstruction to Nerfstudio or SDFStudio formats directly from Python:

import pytheia as pt
# Nerfstudio (writes transforms.json)
pt.io.WriteNerfStudio("/path/to/images", recon, 16, "/path/to/out/transforms.json")
# SDFStudio (all images must be undistorted)
pt.io.WriteSdfStudio("/path/to/images", recon, (2.0, 6.0), 1.0)

More complete examples are in pyexamples/nerfstudio_export_reconstruction.py and pyexamples/sdfstudio_export_reconstruction.py.
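For reference, the Nerfstudio transforms.json produced by WriteNerfStudio is a small JSON file holding shared intrinsics plus one camera-to-world matrix per frame. A simplified hand-rolled skeleton might look like this (field names follow Nerfstudio's data conventions; all values here are placeholders, not output of pyTheia):

```python
import json

# minimal transforms.json skeleton with placeholder intrinsics and one frame
transforms = {
    "fl_x": 1000.0, "fl_y": 1000.0,   # focal lengths in pixels
    "cx": 500.0, "cy": 500.0,         # principal point
    "w": 1000, "h": 1000,             # image size
    "camera_model": "OPENCV",
    "frames": [
        {
            "file_path": "images/frame_00001.jpg",
            # 4x4 camera-to-world transform, row-major
            "transform_matrix": [[1, 0, 0, 0],
                                 [0, 1, 0, 0],
                                 [0, 0, 1, 0],
                                 [0, 0, 0, 1]],
        }
    ],
}

serialized = json.dumps(transforms, indent=2)
```

Knowing this layout makes it easy to inspect or post-process an exported reconstruction before feeding it to Nerfstudio.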

Building

This section describes how to build pyTheia on Ubuntu locally or on WSL2, in both cases with sudo rights. The basic build dependencies are:

sudo apt install cmake build-essential

Installing the ceres-solver will also install the necessary dependencies for pyTheia:

  • gflags
  • glog
  • Eigen

# cd to your favourite library folder
mkdir LIBS
cd LIBS

# eigen
git clone https://gitlab.com/libeigen/eigen
cd eigen && git checkout 3.4.0
mkdir -p build && cd build && cmake .. && sudo make install

# libgflags libglog libatlas-base-dev
sudo apt install libgflags-dev libgoogle-glog-dev libatlas-base-dev

# ceres solver
cd LIBS
git clone https://ceres-solver.googlesource.com/ceres-solver
cd ceres-solver && git checkout 2.1.0 && mkdir build && cd build
cmake .. -DBUILD_TESTING=OFF -DBUILD_EXAMPLES=OFF -DBUILD_BENCHMARKS=OFF
make -j && sudo make install

Local build without sudo

To build locally, it is best to set the EXPORT_BUILD_DIR flag for the ceres-solver. You will still need sudo apt install libgflags-dev libgoogle-glog-dev libatlas-base-dev, so go ask your admin ;)

# cd to your favourite library folder. The local installation will be all relative to this path!
mkdir /home/LIBS
cd /home/LIBS

# eigen
git clone https://gitlab.com/libeigen/eigen
cd eigen && git checkout 3.4.0
mkdir -p build && cd build && cmake .. -DCMAKE_INSTALL_PREFIX=/home/LIBS/eigen/build && make -j install

cd /home/LIBS
git clone https://ceres-solver.googlesource.com/ceres-solver
cd ceres-solver && git checkout 2.1.0 && mkdir build && cd build
cmake .. -DBUILD_TESTING=OFF -DBUILD_EXAMPLES=OFF -DBUILD_BENCHMARKS=OFF -DEXPORT_BUILD_DIR=ON
make -j

# cd to the pyTheiaSfM folder
cd pyTheiaSfM && mkdir build && cd build 
cmake -DEigen3_DIR=/home/LIBS/eigen/build/share/eigen3/cmake/ .. 
make -j

How to build Python wheels

Local build with sudo installed ceres-solver and Eigen

Tested on Ubuntu. In your Python >= 3.6 environment of choice run:

sh build_and_install.sh

If you see errors like /lib/libstdc++.so.6: version `GLIBCXX_3.4.30' not found on Ubuntu 22.04 in an Anaconda environment, try:

conda install -c conda-forge libstdcxx-ng

Another solution is to check the installed GLIBCXX versions. If the version the library requires is present, create a symbolic link into the conda environment:

strings /usr/lib/x86_64-linux-gnu/libstdc++.so.6 | grep GLIBCXX
# if the GLIBCXX version is available then do:
ln -sf /usr/lib/x86_64-linux-gnu/libstdc++.so.6 ${CONDA_PREFIX}/lib/libstdc++.so.6

With Docker

The Docker build will build manylinux wheels for Linux (Python 3.6-3.12). There are two ways to do this. The first clutters the source directory but gives you the wheel files directly (./wheelhouse/). Another drawback of this approach is that the files are created with Docker root permissions and are difficult to delete:

# e.g. for python 3.9
docker run --rm -e PYTHON_VERSION="cp39-cp39" -v `pwd`:/home urbste/pytheia_base:1.2.0 /home/pypackage/build-wheel-linux.sh

The second is cleaner, but you will have to copy the wheels out of the Docker container afterwards:

docker build -t pytheia:1.0 .
docker run -it pytheia:1.0

All wheels will then be inside the container in the folder /home/wheelhouse. Open a second terminal and run:

docker ps # this will give you a list of running containers to find the correct CONTAINER_ID
docker cp CONTAINER_ID:/home/wheelhouse /path/to/result/folder/pytheia_wheels

Typing and editor stubs

To get full function/argument lists and IntelliSense in editors for the native extension:

  • Generate stubs locally (requires pybind11-stubgen):

    pip install pybind11-stubgen
    dev/generate_stubs.sh
    

    This writes .pyi files to typings/pytheia. VS Code/Pylance will pick them up via pyrightconfig.json.

  • When building wheels via setup.py, stubs are generated automatically by default. To skip:

    GENERATE_STUBS=0 python setup.py bdist_wheel --plat-name=...
    
  • The package ships a PEP 561 marker (py.typed) so downstream type checkers can consume the bundled stubs.
