
OccPy


OccPy is a Python tool to map occluded areas from LiDAR data in 3D using a voxel traversal algorithm implemented in C++.

Installation

Via pip (TODO: replace with the PyPI version):

 pip install -i https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple occpy_ls==0.1

Pre-built wheels are available for Python 3.10, 3.11, 3.12 and 3.13 on:

  • Linux (x86_64)
  • Windows
  • macOS

A source distribution is also available; building from it requires a C++ environment with the Boost libraries installed.

Note: Proprietary RIEGL libraries are not packaged. To use RXP and RDBX files, you must build from source.

Build from source

Clone the repository

git clone https://github.com/dkueken/OccPy.git
cd OccPy

Set up the environment

conda env create -f environment.yml
conda activate occPy

Build extensions and install occpy using:

pip install -v .

NOTE: If you want to use RIEGL .rdbx and .rxp files as input, make sure to set the RIVLIB_ROOT and RDBLIB_ROOT environment variables to the root paths of the corresponding libraries before installing.

Usage

The behavior of OccPy can be configured using JSON settings files (see settings_MLS_tutorial.JSON for an example). While not strictly necessary, we recommend setting up such settings files when running OccPy, so the chosen settings are documented for later reference.
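As an illustration, such a settings file can simply mirror the constructor arguments used further below. The keys shown here are purely hypothetical and not an official OccPy schema:

```python
import json

# Hypothetical settings mirroring the OccPy parameters used in this tutorial;
# the key names are illustrative only, not an official schema.
settings = {
    "laz_in": "path/to/laz_file.laz",
    "out_dir": "path/to/output_dir",
    "vox_dim": 0.1,
    "lower_threshold": 1,
    "points_per_iter": 10000000,
}

with open("settings_MLS_tutorial.JSON", "w") as f:
    json.dump(settings, f, indent=4)

# Reload the settings later for reference (or to unpack as keyword arguments)
with open("settings_MLS_tutorial.JSON") as f:
    loaded = json.load(f)
```

Storing the settings alongside the output keeps each run reproducible.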

First, import OccPy in your Python script (or Jupyter notebook):

from occpy.OccPy import OccPy  # this loads the OccPy class with the core functionality
from occpy.util import normalize_occlusion_output   # occpy.util provides additional utility functions, e.g. to normalize the occlusion output

Next, we instantiate an OccPy object for voxel traversal as follows:

test = OccPy(laz_in="path/to/laz_file.laz",
             out_dir="path/to/output_dir",
             vox_dim=0.1,                   # voxel dimensions in m (for the moment only cubic voxels are allowed)
             lower_threshold=1,             # lower threshold in meters, to reduce effects caused by terrain (only functional if a terrain model is provided)
             points_per_iter=10000000,      # number of points to load at once. Note: only active if the point cloud is sorted along gps_time or is single return; otherwise the entire dataset needs to be loaded at once
             plot_dim=[min_x, min_y, min_z, max_x, max_y, max_z]  # corner coordinates of the voxel grid. Note: only integer values are currently supported (do not define sub-meter corner coordinates for the moment!)
             )

In the next step, we define the sensor position, either by specifying the scanner position or by providing trajectory information for mobile acquisitions. In this example we show the sensor position definition for a handheld MLS acquisition with a GeoSLAM ZebHorizon, processed using FARO Connect.

test.define_sensor_pos(path2file="path/to/trajectory_file.txt", 
                       is_mobile=True,              # whether acquisition is mobile. Always true for MLS or ULS
                       single_return=True,          # whether the data is single or multi return
                       delimiter=" ",               # delimiter used in the trajectory file
                       hdr_time='//world_time',     # column header for the time information in the trajectory file
                       hdr_x='x',                   # column header for the x coordinate in the trajectory file
                       hdr_y='y',                   # column header for the y coordinate in the trajectory file
                       hdr_z='z'                    # column header for the z coordinate in the trajectory file
                       )

Note: the sensor position definition differs slightly when using TLS scans. See the TLS Jupyter notebooks for more information.

Once OccPy is fully parameterized, we can run it by simply calling:

test.do_raytracing()

This will store four output files as .npy files in the defined output directory: Nhit.npy, Nmiss.npy, Nocc.npy and Classification.npy. These can be loaded into your Python script like:

import numpy as np
Nhit = np.load("output_dir/Nhit.npy")
Nmiss = np.load("output_dir/Nmiss.npy")
Nocc = np.load("output_dir/Nocc.npy")
Classification = np.load("output_dir/Classification.npy")
  • Nhit.npy: 3D numpy array with the number of laser hits per voxel
  • Nmiss.npy: 3D numpy array with the number of misses (i.e. pulses that have no laser return in the specific voxel, but whose last return has not yet been reached)
  • Nocc.npy: 3D numpy array with the number of occluded pulses (i.e. pulses that have already reached their last return before traversing the specific voxel)
  • Classification.npy: 3D numpy array stating the classification of each voxel into Hit, Empty, Occluded and Unobserved, with the following class definition:
    • 1: observed voxel with at least one registered return
    • 2: empty voxel, observed by at least one pulse that has not yet reached its last return
    • 3: occluded voxel, where all pulses traversing it were occluded
    • 4: unobserved voxel, not traversed by any pulse

The classification into these classes is performed on the Python side using the following code:

Classification[np.logical_and.reduce((Nhit > 0, Nmiss >= 0, Nocc >= 0))] = 1  # voxels that were observed
Classification[np.logical_and.reduce((Nhit == 0, Nmiss > 0, Nocc >= 0))] = 2  # voxels that are empty
Classification[np.logical_and.reduce((Nhit == 0, Nmiss == 0, Nocc > 0))] = 3  # voxels that are hidden (occluded)
Classification[np.logical_and.reduce((Nhit == 0, Nmiss == 0, Nocc == 0))] = 4  # voxels that were not observed
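As a quick sanity check, the same rules can be applied to toy 1D arrays (the counts below are invented for illustration; the real grids are 3D):

```python
import numpy as np

# One voxel per expected class: hit, empty, occluded, unobserved
Nhit = np.array([3, 0, 0, 0])
Nmiss = np.array([1, 5, 0, 0])
Nocc = np.array([0, 2, 7, 0])

Classification = np.zeros_like(Nhit)
Classification[np.logical_and.reduce((Nhit > 0, Nmiss >= 0, Nocc >= 0))] = 1  # observed
Classification[np.logical_and.reduce((Nhit == 0, Nmiss > 0, Nocc >= 0))] = 2  # empty
Classification[np.logical_and.reduce((Nhit == 0, Nmiss == 0, Nocc > 0))] = 3  # occluded
Classification[np.logical_and.reduce((Nhit == 0, Nmiss == 0, Nocc == 0))] = 4  # unobserved
print(Classification)  # [1 2 3 4]
```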

Note: this Classification grid follows a binary definition of occlusion, i.e. a voxel is labelled as occluded only if no pulse was registered as a miss or a return in that voxel. If you prefer to define your own threshold for occlusion, or would like to assess fractional occlusion, you can calculate the occlusion fraction per voxel like this:

occl_frac = Nocc.astype(float) / (Nhit.astype(float) + Nmiss.astype(float) + Nocc.astype(float))
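Note that the denominator is zero for unobserved voxels. One way to guard against the resulting division by zero (our suggestion, not part of OccPy) is to mark those voxels as NaN:

```python
import numpy as np

# Toy counts; the last voxel was never reached by any pulse
Nhit = np.array([3., 0., 0., 0.])
Nmiss = np.array([1., 6., 0., 0.])
Nocc = np.array([0., 2., 7., 0.])

total = Nhit + Nmiss + Nocc
# Where no pulse traversed the voxel at all, the fraction is undefined: use NaN
occl_frac = np.divide(Nocc, total, out=np.full_like(total, np.nan), where=total > 0)
print(occl_frac)  # [0.   0.25 1.   nan]
```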

If you would like the output grids height-normalized, this can be performed using the normalize_occlusion_output function:

from occpy.util import normalize_occlusion_output

normalize_occlusion_output(input_folder='path/to/occpy_output_dir',
                           PlotDim=plot_dim,           # Plot dimensions, i.e. corner coordinates like [min_x, min_y, min_z, max_x, max_y, max_z]
                           vox_dim=vox_dim,            # voxel dimensions in meters
                           dtm_file='path/to/DTM.tif', # path to DTM tif 
                           dsm_file='path/to/DSM.tif', # optional path to DSM
                           lower_threshold=lower_threshold,    # if voxels close to DTM should be ignored
                           output_voxels=False
                           )

For more information and examples, please see the example Jupyter notebooks.

Requirements for a successful occlusion mapping

For the occlusion mapping to work, several requirements on the input data have to be met. These are listed below for the different LiDAR platforms.

TLS

Scan Positions

  • Scan position file (as txt); the position should refer to the laser source position. See examples in the notebooks.

LAZ File

  • 1 LAZ or LAS file per scan position, preferably unfiltered.

If a multi-return TLS is used, you can improve performance by sorting the LAZ file by GPS time and return number, e.g. using LAStools' lassort function:

lassort -i in_laz -gps_time -return_number -odix _sort -olaz -cpu64 -v

MLS

Trajectory file

A trajectory file is required for the algorithm to work with MLS data. The following data should be present in the trajectory file:

  • Time (usually GPS time in seconds) - be sure that the GPS time format corresponds to the one stored in the gps_time field of the laz file
  • Position of the sensor in X, Y, Z coordinates

The pose of the sensor (e.g. quaternions) is currently not considered. We expect the coordinates in the trajectory to correspond to the position of the laser source.

The GPS time tags do not need to match those in the gps_time field of the LAZ file exactly, as the exact position will be interpolated based on gps_time. However, a higher frequency of positional readings in the trajectory file will result in more accurate interpolation of the scanner position and hence a more accurate occlusion map.
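The interpolation principle can be sketched with np.interp (this only illustrates the idea with made-up numbers; it is not OccPy's internal code):

```python
import numpy as np

# Hypothetical trajectory readings: GPS time and sensor x coordinate
traj_t = np.array([100.0, 101.0, 102.0])
traj_x = np.array([10.0, 12.0, 16.0])

# GPS time stamps of two LiDAR returns falling between trajectory readings
point_t = np.array([100.5, 101.25])

# Linear interpolation of the sensor position at each return's time stamp
sensor_x = np.interp(point_t, traj_t, traj_x)
print(sensor_x)  # [11. 13.]
```

With denser trajectory readings, the piecewise-linear approximation tracks the true sensor path more closely, which is why a higher logging frequency yields a more accurate occlusion map.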

LAZ file

As stated before, the key requirement for the LAZ file is that its gps_time field corresponds to the GPS time readings in the trajectory file.

UAVLS

Trajectory file

As for MLS data, trajectories are a hard requirement for UAVLS data. Please refer to the MLS section for the requirements on the trajectory file. Also check out the MLS_notebook.ipynb and ULS_notebook.ipynb Jupyter notebooks for how to use this tool for occlusion mapping.

LAZ file

As UAVLS data often comes as multi-return data, it is again recommended to sort the LAZ file based on gps_time and return_number:

lassort -i laz_in -gps_time -return_number -odix _sort -olaz -cpu64 -v

Unsorted LAZ files will also work; however, there will be substantial computational overhead, as the entire dataset needs to be read and stored at once.
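The reason sorting helps is that, once returns are ordered by gps_time and return_number, all returns of one pulse are adjacent and the file can be streamed in chunks. A toy illustration of grouping sorted returns into pulses (purely illustrative, not OccPy code):

```python
import numpy as np

# return_number of six sorted returns forming three pulses (1-2, 1, 1-2-3)
return_number = np.array([1, 2, 1, 1, 2, 3])

# In a sorted file, a new pulse starts wherever return_number resets to 1,
# so a running count of those resets assigns each return to its pulse
pulse_id = np.cumsum(return_number == 1) - 1
print(pulse_id)  # [0 0 1 2 2 2]
```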

Visualization

Check the pyvista and pyvista_interactive notebooks to visualize occlusion map outputs in 3D.

Pyvista demo

Support

For questions and support, please contact Daniel Kükenbrink via daniel.kuekenbrink@wsl.ch

Roadmap

Several open issues and improvements are currently being worked on or planned:

  • Add support for reading in a DTM file into the voxel traversal, so the algorithm could stop, once the pulse reached the terrain.
  • Substantial performance improvement by using multi core processing
  • Add functionality for PAI/PAD calculation of each voxel (i.e. calculation of path length within voxel for each pulse)
  • There is currently an issue with UAVLS data, where some (very few) LiDAR returns are not registered by the algorithm. The implications of this should be analysed and the problem mitigated. It could cause an underestimation of occlusion, as e.g. the last return is never reached and the pulse traverses further without declaring any voxels as occluded for that pulse. One way to overcome this issue is the function RayTr.doRaytracing_singleReturnPulses(x, y, z, sensor_x, sensor_y, sensor_z, gps_time, return_number, number_of_returns), as used in the script Test_MLS.py, where the input data is not first converted to a pulse dataset but each return is treated as a single pulse. We only recommend this approach if you are confident about your trajectory information.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.

Authors and acknowledgment

The algorithm is strongly based on the voxel traversal algorithm initially published by Amanatides & Woo (1987). This algorithm was used by Kükenbrink et al. (2017) to map occlusion in ALS data and is openly available as Matlab code here: https://www.eufar.net/documents/6028 (user account needed). A big motivation for this development came from the interesting paper by Bienert et al. (2010). This implementation is a substantial evolution of the Matlab implementation and should be able to run for any LiDAR platform, provided the requirements stated in the Requirements section are met. Performance of this Cython implementation should also be greatly improved compared to the Matlab implementation.

Development of the initial Matlab implementation was performed during the PhD studies of Daniel Kükenbrink at the University of Zurich within the EUFAR JRA - HYLIGHT project (EUFAR2 contract no. 312609). The initial development of the Cython version started during the same PhD, and it was used in the study by Schneider et al. (2019) to map occlusion from TLS and UAVLS acquisitions in a temperate and a tropical forest. Substantial improvements and further developments have been made at the Swiss Federal Institute WSL since then. Development is still ongoing, also within the framework of the 3DForEcoTech COST action (working group 1).

A big thank you goes out to everyone who has contributed to this code base since the beginning of my PhD (Felix Morsdorf, Fabian Schneider, Meinrad Abegg, Ruedi Bösch, Christian Ginzler) as well as to those pushing the code base towards the publication of OccPy as a Python package: William Albert, Wout Cherlet, Bernhard Höfle, and Jonas Wenk.

Literature

@article{Amanatides1987,
    author = {Amanatides, John and Woo, Andrew},
    year = {1987},
    month = {08},
    pages = {},
    title = {A Fast Voxel Traversal Algorithm for Ray Tracing},
    volume = {87},
    journal = {Proceedings of EuroGraphics}
}
@article{Bienert2010,
    author = {Bienert, Anne and Queck, Ronald and A, A. and Maas, Hans-Gerd},
    year = {2010},
    month = {01},
    pages = {92-97},
    title = {Voxel space analysis of terrestrial laser scans in forests for wind field modelling},
    volume = {XXXVIII, Part 5},
    journal = {International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences}
}
@article{KUKENBRINK2017424,
    title = {Quantification of hidden canopy volume of airborne laser scanning data using a voxel traversal algorithm},
    journal = {Remote Sensing of Environment},
    volume = {194},
    pages = {424-436},
    year = {2017},
    issn = {0034-4257},
    doi = {10.1016/j.rse.2016.10.023},
    url = {https://www.sciencedirect.com/science/article/pii/S0034425716303959},
    author = {Daniel Kükenbrink and Fabian D. Schneider and Reik Leiterer and Michael E. Schaepman and Felix Morsdorf}
    }
@article{SCHNEIDER2019249,
    title = {Quantifying 3D structure and occlusion in dense tropical and temperate forests using close-range LiDAR},
    journal = {Agricultural and Forest Meteorology},
    volume = {268},
    pages = {249-257},
    year = {2019},
    issn = {0168-1923},
    doi = {10.1016/j.agrformet.2019.01.033},
    url = {https://www.sciencedirect.com/science/article/pii/S0168192319300267},
    author = {Fabian D. Schneider and Daniel Kükenbrink and Michael E. Schaepman and David S. Schimel and Felix Morsdorf}
    }

How to cite

For now, please cite the following studies:

@article{KUKENBRINK2017424,
    title = {Quantification of hidden canopy volume of airborne laser scanning data using a voxel traversal algorithm},
    journal = {Remote Sensing of Environment},
    volume = {194},
    pages = {424-436},
    year = {2017},
    issn = {0034-4257},
    doi = {10.1016/j.rse.2016.10.023},
    url = {https://www.sciencedirect.com/science/article/pii/S0034425716303959},
    author = {Daniel Kükenbrink and Fabian D. Schneider and Reik Leiterer and Michael E. Schaepman and Felix Morsdorf}
    }
@article{SCHNEIDER2019249,
    title = {Quantifying 3D structure and occlusion in dense tropical and temperate forests using close-range LiDAR},
    journal = {Agricultural and Forest Meteorology},
    volume = {268},
    pages = {249-257},
    year = {2019},
    issn = {0168-1923},
    doi = {10.1016/j.agrformet.2019.01.033},
    url = {https://www.sciencedirect.com/science/article/pii/S0168192319300267},
    author = {Fabian D. Schneider and Daniel Kükenbrink and Michael E. Schaepman and David S. Schimel and Felix Morsdorf}
    }

License

See LICENSE.

Project status

This tool is still under development and substantial testing with different datasets should be performed.
