
A simple tool for using PDAL with parallel execution

Project description

Some processing on point clouds can be very time consuming. One way to address this is to run the computation in parallel across several processes on one machine. The pdal-parallelizer tool lets you put the full power of your machine at the service of your processing, very simply.

pdal-parallelizer is a tool that processes your point clouds through pipelines executed on several cores of your machine. It uses the flexible open-source Python library Dask for the multiprocessing side, and the Point Data Abstraction Library (PDAL) for writing the pipelines.
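The core idea, dispatching independent pipeline executions to a pool of workers, can be sketched with the standard library alone. This is an illustration only: pdal-parallelizer actually uses Dask (typically with multiple processes), and `run_pipeline` is a placeholder, not the tool's API.

```python
# Illustration of the parallel-map pattern behind pdal-parallelizer.
# The real tool dispatches PDAL pipelines to Dask workers; here a
# stdlib thread pool stands in for the worker pool.
from concurrent.futures import ThreadPoolExecutor

def run_pipeline(tile_name):
    # Placeholder for executing one PDAL pipeline on one tile.
    return f"{tile_name}: done"

tiles = ["tile_0", "tile_1", "tile_2", "tile_3"]

# Each tile is independent, so the pool can process several at once.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(run_pipeline, tiles))
```

Because every tile is processed by its own pipeline, the work scales naturally with the number of workers you give the pool.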

It also protects you from problems during execution. Because point cloud treatments can take a long time, you do not want to restart from the beginning if something goes wrong. To guard against this, pdal-parallelizer serializes each pipeline, so an interrupted run can be resumed.
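A simplified resume-on-failure scheme can be sketched as below. This is a hedged illustration using stdlib `pickle`; pdal-parallelizer's actual serialization format and file layout may differ, and `checkpoint`/`pending` are hypothetical helpers, not the tool's API.

```python
# Illustration only: serialize finished pipelines so a crashed run
# can skip them on restart, instead of redoing everything.
import pathlib
import pickle
import tempfile

temp_dir = pathlib.Path(tempfile.mkdtemp())

def checkpoint(pipeline_id, state):
    # Persist the pipeline's state to the temp folder.
    (temp_dir / f"{pipeline_id}.pickle").write_bytes(pickle.dumps(state))

def pending(all_ids):
    # Pipelines without a checkpoint still need to be (re)processed.
    done = {p.stem for p in temp_dir.glob("*.pickle")}
    return [i for i in all_ids if i not in done]

checkpoint("tile_0", {"status": "done"})
remaining = pending(["tile_0", "tile_1"])  # only tile_1 is left to do
```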

Read the documentation for more details: https://pdal-parallelizer.readthedocs.io/

Installation

Using Pip

pip install pdal-parallelizer

Using Conda

conda install -c clementalba pdal-parallelizer

GitHub

The repository of pdal-parallelizer is available at https://github.com/meldig/pdal-parallelizer

Usage

Config file

Your configuration file should look like this:

{
    "input": "The folder that contains your input files (or a file path)",
    "output": "The folder that will receive your output files",
    "temp": "The folder that will contain your temporary files",
    "pipeline": "Your pipeline path"
}
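For instance, a filled-in config might look like the following (all paths here are illustrative):

```json
{
    "input": "./input_clouds",
    "output": "./output_clouds",
    "temp": "./temp",
    "pipeline": "./pipeline.json"
}
```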

Processing pipelines with API

from pdal_parallelizer import process_pipelines as process

# input_type="single" processes one input file, cut into tiles;
# use input_type="dir" to process every file in the input folder.
process(config="./config.json", input_type="single", timeout=500, n_workers=5, diagnostic=True)

Processing pipelines with CLI

# Process a directory of input files
pdal-parallelizer process-pipelines -c <config file> -it dir -nw <n_workers> -tpw <threads_per_worker> -dr <number of files> -d
# Process a single input file, cut into tiles
pdal-parallelizer process-pipelines -c <config file> -it single -nw <n_workers> -tpw <threads_per_worker> -ts <tiles size> -d -dr <number of tiles> -b <buffer size>

Requirements (only for pip installs)

Python 3.9+ (e.g. conda install -c anaconda python)

PDAL 2.4+ (e.g. conda install -c conda-forge pdal)

Project details


Download files

Download the file for your platform.

Source Distribution

pdal-parallelizer-2.1.0.tar.gz (10.7 kB view details)

Uploaded Source

Built Distribution


pdal_parallelizer-2.1.0-py3-none-any.whl (12.3 kB view details)

Uploaded Python 3

File details

Details for the file pdal-parallelizer-2.1.0.tar.gz.

File metadata

  • Download URL: pdal-parallelizer-2.1.0.tar.gz
  • Upload date:
  • Size: 10.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.3

File hashes

Hashes for pdal-parallelizer-2.1.0.tar.gz
Algorithm Hash digest
SHA256 f20a5bf7e4ff2e064348801844a21b0ea6c25145713084f355f2e083a0cd1b5b
MD5 e6341d1ae155ab0198322001e8f9e89e
BLAKE2b-256 08d59d6376a44394296c0c5979015cbf8e002634e70096c48c13d3e2f5ffb53f


File details

Details for the file pdal_parallelizer-2.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for pdal_parallelizer-2.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 57b03683c17c14b91e2c4edfa5be3aa6a38ce4280857ef4bc96ef31602d3654b
MD5 c31be4c937c3db071089fde2253fe0f7
BLAKE2b-256 fc0be6f55c76500b8596f3f2a26121a3b94d93ed1c95dcc2efc90393ecb2a0fb

