pyForMetrix

pyForMetrix is a Python package to extract metrics commonly used in forestry from laser scanning/LiDAR data. Main functionalities include a plot-based and a pixel-based calculation, and handling of large datasets.

Installation

pyForMetrix is packaged and distributed via PyPI, and can be installed using pip:

python -m pip install pyForMetrix

Getting started

Note: You can run this Getting started section interactively on Binder.

First, we need a point cloud dataset. You can use your own or download a sample dataset, e.g. from the City of Vancouver: https://webtransfer.vancouver.ca/opendata/2018LiDAR/4830E_54560N.zip

After downloading, unzip the file to obtain a .las file, which we will use in the following.
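
If you prefer to script this step, the following sketch downloads and unzips the sample tile with the Python standard library (the local file names simply mirror the URL above):

import urllib.request
import zipfile

# download the sample tile from the City of Vancouver open data server
url = "https://webtransfer.vancouver.ca/opendata/2018LiDAR/4830E_54560N.zip"
urllib.request.urlretrieve(url, "4830E_54560N.zip")

# extract the contained .las file into the current directory
with zipfile.ZipFile("4830E_54560N.zip") as zf:
    zf.extractall(".")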

Next, we read the point cloud into a numpy array. Depending on the metrics we will derive later, different attributes have to be loaded as well. In this example, the 3D point cloud along with classification and echo number information is required. We use laspy to read the file.

import numpy as np
import laspy

inFile = laspy.read(r"4830E_54560N.las")
coords = np.vstack([inFile.x,
                    inFile.y,
                    inFile.z]).transpose()
points = {
    'points': coords,
    'echo_number': inFile.return_number,
    'classification': inFile.classification
}

After importing the package pyForMetrix, we can create a RasterMetrics or a PlotMetrics object, depending on the application. Let's first work with RasterMetrics, which will calculate the set of metrics for each cell of a raster overlaid on the point cloud data.

from pyForMetrix.metrix import RasterMetrics
rm = RasterMetrics(points, raster_size=25)

The code above may take some time to run, as the point cloud is rasterized to the final cells when the RasterMetrics object is created. The runtime increases with more points and with a smaller raster size.
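
To gauge that cost on your own data, you can wrap the constructor in a simple timer (a minimal sketch repeating the call above with the same 25 m raster size):

import time

t0 = time.perf_counter()
rm = RasterMetrics(points, raster_size=25)
print(f"Rasterization took {time.perf_counter() - t0:.1f} s")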

We then select which metrics we want to calculate. pyForMetrix comes with a number of predefined metrics, conveniently grouped in two collections: publications, which contains metric sets taken from the literature, and types, which groups metrics by their type. Later, we will see how to create your own metric calculators. For now, we will use the ones presented by Woods et al. (2009):

from pyForMetrix.metricCalculators.publications import MCalc_Woods_et_al_2009
mc = MCalc_Woods_et_al_2009()
metrics = rm.calc_custom_metrics(metrics=mc)

With the last line, we created an xarray.DataArray object containing the metrics for each pixel:

print(metrics)
<xarray.DataArray (y: 115, x: 83, val: 26)>
array([[[ 1.19169000e+03,  1.19212000e+03,  1.19236000e+03, ...,
         -1.26632802e+00,  7.51640760e-01,  0.00000000e+00],
        [ 1.19254700e+03,  1.19255400e+03,  1.19256100e+03, ...,
         -2.00000000e+00,  1.00000000e+00,  0.00000000e+00],
...
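
The names of the computed metrics are stored along the val dimension; listing them is useful for looking up labels to select later (a small sketch using standard xarray access):

# list the metric names along the 'val' dimension
print(metrics.coords['val'].values)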

Using rioxarray, we can save the values (here: the p90 metric, i.e., the 90th height percentile) to a raster file:

import rioxarray
metrics.sel(val='p90').rio.to_raster("p90.tif", "COG")
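
If the DataArray does not carry coordinate reference system information yet, it can be attached before exporting so that the GeoTIFF is properly georeferenced. In this sketch, EPSG:26910 (NAD83 / UTM zone 10N) is an assumption for the Vancouver sample tile; substitute the CRS of your own data:

# attach a CRS (assumed here) and export again
metrics = metrics.rio.write_crs("EPSG:26910")
metrics.sel(val='p90').rio.to_raster("p90.tif", "COG")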

More examples

Multiple metric sets at once

Instead of passing a single metricCalculator to calc_custom_metrics, you can call it with a list of metricCalculators:

from pyForMetrix.metricCalculators.types import MCalc_HeightMetrics, MCalc_DensityMetrics
heightMetrics = MCalc_HeightMetrics()
densityMetrics = MCalc_DensityMetrics()
metrics = rm.calc_custom_metrics(metrics=[heightMetrics, densityMetrics])

Override percentiles, custom options

Some metricCalculators can be customized, e.g. MCalc_HeightMetrics accepts an optional keyword percentiles, which replaces the default set of percentiles:

heightMetrics = MCalc_HeightMetrics(percentiles=np.array([15, 25, 50, 75, 85, 95, 99]))

Similarly, the cell size for the rumple index (e.g. in MCalc_White_et_al_2015) or for the DSM in MCalc_Hollaus_et_al_2009 can be set. These variables are passed as parameters to the __call__ function; calc_custom_metrics accepts them via metric_options as an additional dictionary (or list of dictionaries) with the settings:

from pyForMetrix.metricCalculators.publications import MCalc_White_et_al_2015, MCalc_Hollaus_et_al_2009 
whiteMetrics = MCalc_White_et_al_2015()
metrics = rm.calc_custom_metrics(metrics=whiteMetrics, metric_options={'rumple_pixel_size': 0.2})
hollausMetrics = MCalc_Hollaus_et_al_2009()
metrics = rm.calc_custom_metrics(metrics=[whiteMetrics, hollausMetrics], 
                                 metric_options=[
                                     {'rumple_pixel_size': 5},
                                     {'CHM_pixel_size': 7.5}
                                 ])

Parallelize metric computation

On computers with multiple cores, processing can be sped up significantly by multiprocessing. To this end, we provide a function calc_custom_metrics_parallel, which takes similar arguments to calc_custom_metrics but runs on multiple cores. Note that the parallelization is carried out over the raster cells, i.e., the processes work on different subsets of the raster cells. As there is a certain overhead in starting the processes, a speedup is only expected if there are (a) enough valid raster cells and (b) metrics that are sufficiently complex to compute. The parameter multiprocessing_point_threshold controls the behaviour: if the number of points in the input point cloud exceeds the threshold, multiple processes are spawned; otherwise the arguments are passed on to calc_custom_metrics.

The other parameters are n_chunks (default: 16), which is the number of blocks the raster cells are divided into for processing, and n_processes (default: 4), which is the number of concurrent processes. A higher n_chunks uses less memory, but takes longer due to the added overhead.
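
As a minimal sketch, assuming calc_custom_metrics_parallel is called on the RasterMetrics object in the same way as calc_custom_metrics (the threshold value below is purely illustrative):

# the __main__ guard is required because child processes are spawned on some platforms
if __name__ == "__main__":
    metrics = rm.calc_custom_metrics_parallel(
        metrics=mc,
        multiprocessing_point_threshold=1_000_000,  # below this, falls back to calc_custom_metrics
        n_chunks=16,    # number of blocks the raster cells are split into
        n_processes=4   # number of concurrent worker processes
    )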

On systems with sufficient memory (RAM > (number of processes) x (max. size of a tile)), it is generally better to parallelize over input tiles rather than pixels.
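
No special support from the package is needed for that; a plain multiprocessing.Pool over the tile files is sufficient. Below is a minimal sketch, assuming a folder of tiled .las files (the folder name and the process_tile helper are illustrative and not part of pyForMetrix):

from multiprocessing import Pool
from pathlib import Path

import laspy
import numpy as np

from pyForMetrix.metrix import RasterMetrics
from pyForMetrix.metricCalculators.publications import MCalc_Woods_et_al_2009

def process_tile(las_path):
    # read one tile and compute the Woods et al. (2009) metrics on a 25 m raster
    inFile = laspy.read(str(las_path))
    coords = np.vstack([inFile.x, inFile.y, inFile.z]).transpose()
    points = {
        'points': coords,
        'echo_number': inFile.return_number,
        'classification': inFile.classification
    }
    rm = RasterMetrics(points, raster_size=25)
    return las_path, rm.calc_custom_metrics(metrics=MCalc_Woods_et_al_2009())

if __name__ == "__main__":
    tiles = sorted(Path("tiles").glob("*.las"))  # assumed folder of tiled .las files
    with Pool(processes=4) as pool:
        for las_path, tile_metrics in pool.imap_unordered(process_tile, tiles):
            print(las_path, tile_metrics.shape)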

Plotwise metric extraction

An example notebook for plotwise metric extraction is available, and it can also be run directly on Binder.

Full / API documentation

The full documentation can be found at readthedocs.

Dependencies

This package relies on a number of other packages, which are installed automatically when using pip. Thank you to all developers making this project possible!

Acknowledgement

This package has been developed in the course of the UncertainTree project, funded by the Austrian Science Fund (FWF) [Grant number J 4672-N].

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pyformetrix-0.0.7.tar.gz (427.7 kB)

Uploaded Source

Built Distribution

pyformetrix-0.0.7-py3-none-any.whl (31.4 kB)

Uploaded Python 3

File details

Details for the file pyformetrix-0.0.7.tar.gz.

File metadata

  • Download URL: pyformetrix-0.0.7.tar.gz
  • Upload date:
  • Size: 427.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for pyformetrix-0.0.7.tar.gz
  • SHA256: e8244755c280e456dfca26bf17d92807ff5960cf8770df59be3f209aaa8bc275
  • MD5: cee5f1dc3bba3aac67b3b6616bff3522
  • BLAKE2b-256: 853f6ba51ce363993d97dacaf98ab02bcddafa20ba6bb7ec153b2e6953a42f07

See more details on using hashes here.

File details

Details for the file pyformetrix-0.0.7-py3-none-any.whl.

File metadata

  • Download URL: pyformetrix-0.0.7-py3-none-any.whl
  • Upload date:
  • Size: 31.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for pyformetrix-0.0.7-py3-none-any.whl
  • SHA256: 8c1ecda3e8e60ce67b54a4da73f0783db06f218a4b0c3bef2fdad9446c35ad0d
  • MD5: 570dbcb5d7dc8514085e698ed9f93600
  • BLAKE2b-256: 9a6636e71c27dca09f60dd491c6621e13a66d60ea80d420ee879df7d2011e5b3

See more details on using hashes here.
