Torch-based Vegetation Radiative Transfer Model library (PROSPECT, SAIL, SMAC)
TorchRTM: A PyTorch-based Radiative Transfer Modeling Toolkit
TorchRTM is a GPU-accelerated, modular, and research-ready radiative transfer modeling (RTM) library built on top of PyTorch.
Features
It integrates:
- Leaf RTMs: PROSPECT-5B, PROSPECT-D, PROSPECT-PRO
- Canopy RTMs: 4SAIL, PROSAIL
- Atmospheric Modeling: SMAC (TOC→TOA conversion)
- LUT Tools: Torchlut (LUT generator), Torchlut_pred (GPU KNN retrieval)
- High Performance: Batch computation, CUDA support
TorchRTM is ideal for remote sensing, vegetation trait retrieval, radiative transfer simulation, environmental monitoring, machine learning, and RTM-based inversion workflows.
🌟 Key Features
- Full PROSPECT + PROSAIL + SMAC integration
- GPU-accelerated simulations for millions of samples
- High-performance LUT generator (Torchlut)
- Fast KNN-style LUT-based retrieval engine (Torchlut_pred)
- Supports PROSPECT-5B / D / PRO
- Supports TOC and TOA reflectance
- Fully compatible with PyTorch pipelines and deep learning models
📦 Installation
pip install torchrtm
Requirements
- Python ≥ 3.9
- PyTorch ≥ 1.12
(Install PyTorch based on your hardware: https://pytorch.org/get-started/locally/)
📘 User Guide
This README includes complete usage documentation for:
- PROSPECT / PROSAIL model
- SMAC atmospheric correction
- Torchlut – fast LUT generator
- Torchlut_pred – GPU KNN retrieval
- Full inversion pipeline (LUT → retrieval)
1. PROSAIL / PROSPECT Model Usage
TorchRTM provides a unified PROSAIL implementation:
from torchrtm.models import prospectd
rho, tau = prospectd(traits, N, alpha=alpha, print_both=True)
# also supports: prospect5b / prospectpro (same call signature)
Parameters
| Name | Type | Description |
|---|---|---|
| traits | Tensor (batch, n_traits) | Leaf biochemical parameters (Cab, Car, Cbrown, Cw, Cm, etc.) |
| N | Tensor (batch) | Leaf structure parameter |
| alpha | Tensor | Leaf transmittance parameter (commonly 40°) |
| print_both | bool | If True, return both rho and tau; if False, return only rho |
Notes
traits must be provided in a fixed parameter order depending on the PROSPECT model version:
- prospect5b → Cab, Car, Cbrown, Cw, Cm
- prospectd → Cab, Car, Cbrown, Cw, Cm, Canth
- prospectpro → Cab, Car, Cbrown, Cw, Canth, Prot, Cbc
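Getting the column order wrong silently produces wrong spectra, so it can help to assemble the traits array from named values. A minimal sketch (the `traits_row` helper and `TRAIT_ORDER` table below are illustrative, not part of torchrtm):

```python
# Hypothetical helper (not part of torchrtm): build a traits row
# in the fixed order each PROSPECT version expects.
TRAIT_ORDER = {
    "prospect5b": ["Cab", "Car", "Cbrown", "Cw", "Cm"],
    "prospectd": ["Cab", "Car", "Cbrown", "Cw", "Cm", "Canth"],
    "prospectpro": ["Cab", "Car", "Cbrown", "Cw", "Canth", "Prot", "Cbc"],
}

def traits_row(model, **values):
    """Return trait values as a list in the order `model` expects."""
    order = TRAIT_ORDER[model]
    missing = [name for name in order if name not in values]
    if missing:
        raise ValueError(f"missing traits for {model}: {missing}")
    return [values[name] for name in order]

row = traits_row("prospect5b", Cab=40.0, Car=8.0, Cbrown=0.0, Cw=0.01, Cm=0.009)
print(row)  # [40.0, 8.0, 0.0, 0.01, 0.009]
```

Stacking such rows gives the `(batch, n_traits)` tensor the models expect.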
Returns
Two tensors, each of shape:
(batch, spectral_length)
Containing:
| Output | Description |
|---|---|
| rho | Leaf reflectance spectrum computed by the PROSPECT model (leaf-scale bi-directional reflectance) |
| tau | Leaf transmittance spectrum computed by the PROSPECT model (leaf-scale bi-directional transmittance) |
prosail(
traits, N, LIDFa, LIDFb, lai, q,
tts, tto, psi, tran_alpha, psoil,
batch_size=0, prospect_type='prospect5b', lidtype=1
)
Parameters
| Name | Type | Description |
|---|---|---|
| traits | Tensor (batch, n_traits) | Leaf biochemical parameters (Cab, Car, Cbrown, Cw, Cm, etc.) |
| N | Tensor (batch) | Leaf structure parameter |
| LIDFa / LIDFb | Tensor | Leaf inclination distribution parameters |
| lai | Tensor | Leaf Area Index |
| q | Tensor | Hotspot parameter |
| tts | Tensor | Solar zenith angle (degrees) |
| tto | Tensor | Observer zenith angle (degrees) |
| psi | Tensor | Relative azimuth angle (degrees) |
| tran_alpha | Tensor | Leaf transmittance parameter (commonly 40°) |
| psoil | Tensor | Soil moisture parameter |
| batch_size | int | Processing batch size (for GPU memory control) |
| prospect_type | str | "prospect5b", "prospectd", or "prospectpro" |
| lidtype | int | LIDF type (1–4) |
Returns
A tensor of size:
(batch, spectral_length, 7)
Containing:
| Output | Description |
|---|---|
| RDDT | Bi-hemispherical reflectance, canopy + soil (diffuse incidence) |
| RSDT | Directional-hemispherical reflectance, canopy + soil (solar incidence) |
| RDOT | Hemispherical-directional reflectance, canopy + soil (observer direction) |
| RSOT | Bi-directional reflectance, canopy + soil (solar incidence, observer direction) |
| TSD | Directional-hemispherical canopy transmittance (solar incidence) |
| TDD | Bi-hemispherical canopy transmittance (diffuse incidence) |
| RDD | Bi-hemispherical reflectance of the canopy layer (diffuse incidence) |
Example: Simulating Canopy Reflectance
from torchrtm.models import prosail
import torch
B = 5000
device = "cuda"
traits = torch.rand(B, 5).to(device)
N = torch.rand(B).to(device)
LIDFa = torch.zeros(B).to(device)
LIDFb = torch.zeros(B).to(device)
lai = torch.ones(B).to(device) * 3
q = torch.ones(B).to(device) * 0.5
tts = torch.ones(B).to(device) * 30
tto = torch.ones(B).to(device) * 20
psi = torch.ones(B).to(device) * 10
alpha = torch.ones(B).to(device) * 40
psoil = torch.ones(B).to(device) * 0.5
toc = prosail(
traits, N, LIDFa, LIDFb, lai, q,
tts, tto, psi, alpha, psoil,
batch_size=5000,
prospect_type="prospect5b",
lidtype=2
)
print(toc.shape)
2. SMAC Atmospheric Correction
TorchRTM supports SMAC for converting top-of-canopy (TOC) reflectance to top-of-atmosphere (TOA) reflectance.
from torchrtm.atmosphere.smac import smac
from torchrtm.data_loader import load_smac_sensor
Example
coefs, sm_wl = load_smac_sensor("S2A")
Ta_s, Ta_o, T_g, ra_dd, ra_so, ta_ss, ta_sd, ta_oo, ta_do = smac(
tts=torch.tensor([30.0]),
tto=torch.tensor([20.0]),
psi=torch.tensor([10.0]),
coefs=coefs
)
TOC → TOA Conversion
from torchrtm.atmosphere.smac import toc_to_toa
R_TOC, R_TOA = toc_to_toa(
toc, sm_wl - 400,
ta_ss, ta_sd, ta_oo, ta_do,
ra_so, ra_dd, T_g
)
3. Torchlut: High-Performance LUT Generator
Generates millions of simulated samples using PROSPECT, PROSAIL, or ATOM.
from torchrtm.utils.torch_utils import Torchlut
Torchlut() — API Documentation
Torchlut(
model='prospect5b',
table_size=500000,
std=0,
batch=10000,
wavelength=None,
sensor_name='LANDSAT4-TM',
sail_prospect='prospectd',
use_atom=False,
para_addr=None
)
Parameters
| Parameter | Description |
|---|---|
| model | "prospect5b", "prospectd", "prospectpro", or "prosail" |
| table_size | Number of samples to generate |
| std | Gaussian noise standard deviation |
| batch | Simulation batch size |
| wavelength | Select specific wavelength indices |
| sensor_name | Sensor spectral-response set (used when use_atom=True) |
| sail_prospect | Leaf model used inside PROSAIL |
| use_atom | Enable ATOM (PROSAIL + SMAC) |
| para_addr | Parameter range configuration |
Notes
When use_atom=True, the following sensor spectral-response files are supported:
LANDSAT4-TM, LANDSAT5-TM, LANDSAT7-ETM, LANDSAT8-OLI, Sentinel2A-MSI, Sentinel2B-MSI, Sentinel3A-OLCI, Sentinel3B-OLCI, TerraAqua-MODIS
Returns
ref_list # reflectance (TOC or TOA)
para_list # parameter vectors
Torchlut Example
ref, params = Torchlut(
model="prospectd",
table_size=100000,
batch=5000,
std=0.01
)
4. Torchlut_pred: GPU KNN Retrieval Engine
Fast, block-wise KNN retrieval optimized for LUT inversion.
from torchrtm.utils.torch_utils import Torchlut_pred
Torchlut_pred() — API Documentation
Torchlut_pred(
xb, xq, y,
k=5,
distance_order=2,
xb_block=1000,
batch_size=200,
device="cuda"
)
Parameters
| Parameter | Description |
|---|---|
| xb | Database features (N, D) |
| xq | Query features (M, D) |
| y | Database target values (N,) or (N, D_out) |
| k | Number of nearest neighbors |
| distance_order | Order of the Minkowski distance: 1 = Manhattan, 2 = Euclidean, etc. We recommend setting it to 9 |
| xb_block | Block size for splitting xb to avoid GPU OOM |
| batch_size | Query batch size |
| device | "cuda" or "cpu" |
Returns
preds: shape (M,) or (M, D_out)
How It Works (Internal Mechanism)
- Move all tensors to GPU
- Process queries in batches
- For each batch, iterate through xb in blocks
- Compute the distance matrix efficiently (when distance_order = 2) via the expansion $$\|x-y\|^2 = \|x\|^2 + \|y\|^2 - 2\,x \cdot y$$
- Select global top-k neighbors
- Average retrieved y-values
Supports inference against multi-million-entry LUTs on consumer GPUs.
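The steps above can be sketched in NumPy (torchrtm runs the same idea on GPU tensors; query batching is omitted here for brevity). Distances use the expansion above, so each block costs only one matrix multiply:

```python
import numpy as np

def knn_retrieve(xb, xq, y, k=5, xb_block=1000):
    """Block-wise KNN: average the targets of the k nearest database rows."""
    M = xq.shape[0]
    xq_sq = (xq ** 2).sum(1, keepdims=True)          # (M, 1)
    best_d = np.full((M, k), np.inf)                 # running top-k distances
    best_i = np.zeros((M, k), dtype=int)             # running top-k indices
    for start in range(0, xb.shape[0], xb_block):    # iterate xb in blocks
        blk = xb[start:start + xb_block]
        # squared distances via ||x||^2 + ||y||^2 - 2 x.y  -> shape (M, B)
        d = xq_sq + (blk ** 2).sum(1) - 2.0 * xq @ blk.T
        idx = np.zeros((M, 1), dtype=int) + np.arange(start, start + blk.shape[0])
        # merge this block's candidates with the running top-k
        d_all = np.concatenate([best_d, d], axis=1)
        i_all = np.concatenate([best_i, idx], axis=1)
        top = np.argsort(d_all, axis=1)[:, :k]
        best_d = np.take_along_axis(d_all, top, axis=1)
        best_i = np.take_along_axis(i_all, top, axis=1)
    return y[best_i].mean(axis=1)                    # average retrieved targets

xb = np.array([[0.0], [1.0], [2.0], [10.0]])
y = np.array([0.0, 1.0, 2.0, 10.0])
pred = knn_retrieve(xb, np.array([[1.1]]), y, k=2, xb_block=2)
print(pred)  # → [1.5]
```

Because only one block of distances is materialized at a time, peak memory is O(M × xb_block) rather than O(M × N).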
Example
preds = Torchlut_pred(
xb=torch.tensor(ref),
xq=torch.tensor(query_ref),
y=torch.tensor(params),
k=5,
device="cuda"
)
5. Complete Retrieval Pipeline
# Step 1: Build LUT
ref_lut, para_lut = Torchlut(model="prospectd", table_size=300000)
# Step 2: Convert measured reflectance to tensor
xq = torch.tensor(measured_ref)
# Step 3: KNN retrieval
pred = Torchlut_pred(
xb=torch.tensor(ref_lut),
xq=xq,
y=torch.tensor(para_lut),
k=5
)
🤝 Contributing
PRs and issues are welcome!
Please include tests and clear descriptions.
📜 License
MIT License.
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file torchrtm-1.3.3.tar.gz.
File metadata
- Download URL: torchrtm-1.3.3.tar.gz
- Upload date:
- Size: 475.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.8.18
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8211258376da55d6048a1e9542cc1577cb4c4b4abdec0c73454138309e6f04d0 |
| MD5 | a3540e9c1aaf865d7ab61eac18a6aa1e |
| BLAKE2b-256 | cc01f10ea5c31161da646015775184c2909816ea83c86fd861e142c0a5e776b8 |
|
File details
Details for the file torchrtm-1.3.3-py3-none-any.whl.
File metadata
- Download URL: torchrtm-1.3.3-py3-none-any.whl
- Upload date:
- Size: 476.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.8.18
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 039c22357ae7097defb31d4aec019802cc638c97ffccb0214965a3e9285e9bed |
| MD5 | 3267e93a1bfeb8a0a4694653f1540ddd |
| BLAKE2b-256 | 7e6cf1eae32d53c44569ff953897e8d0f874a637cba3c5be6eaf43af557ae3ec |
|