PyTorch reimplementation of the SCOPE radiative transfer model.
SCOPE
PyTorch-first implementation of the SCOPE canopy radiative transfer model for reflectance, fluorescence, thermal radiance, and coupled energy-balance workflows.
What It Is
scope is designed for users who need:
- asset-backed SCOPE physics in Python
- batched ROI/time execution on `xarray` datasets
- differentiable model components in PyTorch
- reproducible MATLAB parity checks in CI and local development
The current implementation supports:
- leaf optics through FLUSPECT
- canopy reflectance through 4SAIL-based transport
- layered fluorescence and thermal radiative transfer
- leaf biochemistry and coupled energy balance
- directional and vertical-profile outputs on the homogeneous canopy path
- ROI/time workflows with `xarray` input and output assembly
Attribution
This package is a Python implementation of the original MATLAB SCOPE model:
- Soil Canopy Observation, Photochemistry and Energy fluxes (SCOPE)
- original repository: Christiaanvandertol/SCOPE
- upstream manual: scope-model.readthedocs.io
Please attribute the original SCOPE model and papers when using this package in research workflows:
- Van der Tol, C., Verhoef, W., Timmermans, J., Verhoef, A., and Su, Z. (2009), Biogeosciences 6, 3109-3129
- Yang, P., Prikaziuk, E., Verhoef, W., and Van der Tol, C. (2021), Geoscientific Model Development 14, 4697-4712
Install
Published package name:
python -m pip install SCOPE-RTM
Import name:
import scope
Top-level CLI:
scope --help
scope fetch-upstream --help
scope prepare --help
scope run --help
scope vars Cab
scope vars --workflow fluorescence
scope vars --related Rntot
1. Clone the repository
git clone https://github.com/MarcYin/SCOPE scope
cd scope
2. Fetch the pinned upstream SCOPE assets
python scripts/fetch_upstream_scope.py
If the package is already installed in your environment, the same helper is available as:
scope-fetch-upstream
3. Create an environment and install
python -m venv .venv
source .venv/bin/activate
python -m pip install --upgrade pip
python -m pip install -e ".[dev]"
4. Verify the install
PYTHONPATH=src python examples/basic_scene_reflectance.py
PYTHONPATH=src python -m pytest -q tests/test_scope_benchmark_parity.py tests/test_scope_timeseries_benchmark_parity.py
5-Minute Quickstart
Minimal scene reflectance run
PYTHONPATH=src python examples/basic_scene_reflectance.py
Expected output:
{
"product": "reflectance",
"dims": {"y": 1, "x": 1, "time": 1, "wavelength": 2001},
"rsot_650nm": 0.047138178221010914,
"rsot_865nm": 0.4100649627325952,
"rsot_1600nm": 0.26994893328935227
}
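The printed summary is plain JSON, so downstream checks can work directly on it. As a minimal sketch, here is how the band values above could be turned into an NDVI-style index (the summary text is copied from the expected output; the NDVI computation is illustrative, not part of the package):

```python
import json

# JSON summary as printed by examples/basic_scene_reflectance.py
# (values copied from the expected output above).
summary_text = """
{
  "product": "reflectance",
  "dims": {"y": 1, "x": 1, "time": 1, "wavelength": 2001},
  "rsot_650nm": 0.047138178221010914,
  "rsot_865nm": 0.4100649627325952,
  "rsot_1600nm": 0.26994893328935227
}
"""

summary = json.loads(summary_text)

# NDVI from the red (650 nm) and NIR (865 nm) directional reflectances:
red, nir = summary["rsot_650nm"], summary["rsot_865nm"]
ndvi = (nir - red) / (nir + red)
print(f"NDVI from rsot bands: {ndvi:.3f}")  # high NDVI: dense green canopy
```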
High-level workflow run
PYTHONPATH=src python examples/scope_workflow_demo.py
Expected output:
{
"product": "scope_workflow",
"components": [
"reflectance",
"reflectance_directional",
"reflectance_profile",
"fluorescence",
"fluorescence_directional",
"fluorescence_profile"
],
"rsot_650nm_t0": 0.04522854188089004,
"LoF_peak_t0": 1.985767010834904e-05,
"LoF_peak_wavelength_t0": 744.0
}
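The `LoF_peak_t0` fields above report the maximum of the directional fluorescence spectrum and the wavelength at which it occurs. The same summary reduces to a simple argmax over the spectral axis; here is a sketch with a made-up toy spectrum (the values are illustrative, not SCOPE output, though real output does peak near the 740 nm far-red band):

```python
# Toy fluorescence spectrum: wavelength (nm) -> radiance.
spectrum = {
    680.0: 1.2e-5,
    700.0: 1.6e-5,
    744.0: 2.0e-5,  # far-red peak
    780.0: 1.1e-5,
}

# Peak wavelength is the argmax of the spectrum; peak value is the max.
peak_wavelength = max(spectrum, key=spectrum.get)
peak_value = spectrum[peak_wavelength]
print(peak_wavelength, peak_value)  # 744.0 2e-05
```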
Prepared-dataset CLI run
scope prepare \
--weather weather.nc \
--observation observation.nc \
--bio-npz post_bio.npz \
--year 2020 \
--output scope_inputs.nc
scope run \
--input scope_inputs.nc \
--output scope_outputs.nc \
--scope-root ./upstream/SCOPE \
--workflow reflectance
Main Entry Points
For most users, the preferred entry points are:
- `ScopeGridRunner.run_scope_dataset(...)`: high-level reflectance/fluorescence/thermal workflow dispatch from prepared `xarray` inputs.
- `ScopeInferenceModel`: lightweight tensor-only inference surface for production workloads that only need selected outputs.
- `prepare_scope_input_dataset(...)`: build a runner-ready dataset from weather, observation, and Sentinel-2 bio inputs.
- `validate_scope_dataset(...)`: validate required variables, soil alternatives, and key dimensions before a workflow is executed.
- `write_netcdf_dataset(...)`: persist prepared or simulated outputs to NetCDF with safe backend selection and compression handling.
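As a rough illustration of the validation step, a workflow-specific required-variable check can be sketched with plain dictionaries. This is not the actual `validate_scope_dataset(...)` implementation (which checks `xarray` variables, soil alternatives, and dimensions), and the variable names in `REQUIRED` are assumptions for the sketch:

```python
# Hypothetical workflow -> required-variable map; names are illustrative.
REQUIRED = {
    "reflectance": {"lai", "cab", "sza", "vza", "raa"},
    "fluorescence": {"lai", "cab", "sza", "vza", "raa", "rin"},
}


def missing_variables(workflow: str, dataset_vars: set) -> set:
    """Return the required variables absent from a prepared dataset."""
    return REQUIRED[workflow] - dataset_vars


# A dataset prepared for reflectance is incomplete for fluorescence:
print(missing_variables("fluorescence", {"lai", "cab", "sza", "vza", "raa"}))
# {'rin'}
```

Failing fast on missing inputs like this is what lets `scope run` reject a dataset before launching an expensive workflow.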
For direct lower-level use:
- `FluspectModel`
- `CanopyReflectanceModel`
- `CanopyFluorescenceModel`
- `CanopyThermalRadianceModel`
- `CanopyEnergyBalanceModel`
Documentation Map
- Installation Guide
- Quickstart
- Model Mechanics
- Input / Output Reference
- Variable Glossary
- Workflow Variable Guides
- Examples
- Production Notes
- Releasing
- Benchmark Policy
Build the docs locally with:
python -m pip install -e ".[docs]"
mkdocs build --strict
Production Notes
- Asset-backed constructors such as `from_scope_assets(...)` require an upstream SCOPE checkout. The recommended path is `scope-fetch-upstream`.
- The installed CLI now covers the common shell workflow: `scope fetch-upstream`, `scope prepare`, and `scope run`.
- Prepared inputs and assembled outputs now carry glossary-derived `xarray` metadata such as `long_name`, `units`, `description`, `scope_category`, and `scope_relationship`.
- NetCDF exports are now CF-enriched with dataset-level `Conventions`, `title`, `source`, `references`, `history`, and axis metadata on common coordinates.
- `scope run` validates workflow-specific inputs before execution, and the same validator is available directly as `validate_scope_dataset(...)`.
- The default CI suite runs parity tests in live-or-pregenerated mode. On machines without MATLAB, the tests compare against checked-in MATLAB fixtures.
- The self-hosted GPU and live-MATLAB lanes remain optional operational lanes; see docs/benchmark-policy.md.
- Documentation can be built locally with `mkdocs build --strict` and is deployed by the dedicated GitHub Pages workflow.
- Distribution artifacts can be built locally with `python -m build` and validated with `python -m twine check dist/*`.
- Release notes are drafted automatically on `main`, and tagged releases publish PyPI artifacts plus GitHub artifact attestations.
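To make the glossary-derived metadata concrete, here is a sketch of attaching per-variable attributes before export. The glossary entry below is made up for illustration (the attribute keys match the list above, but the values and the `attach_metadata` helper are not from the real package):

```python
# Illustrative glossary entry; keys mirror the metadata fields listed above,
# values are invented, not taken from the real scope glossary.
GLOSSARY = {
    "Rntot": {
        "long_name": "Total net radiation",
        "units": "W m-2",
        "description": "Net radiation over the canopy and soil.",
        "scope_category": "energy_balance",
        "scope_relationship": "related to Rnctot and Rnstot",
    }
}


def attach_metadata(name: str, attrs: dict) -> dict:
    """Merge glossary-derived attributes into a variable's attrs dict."""
    return {**attrs, **GLOSSARY.get(name, {})}


attrs = attach_metadata("Rntot", {"grid_mapping": "crs"})
print(attrs["units"])  # W m-2
```

In the package this kind of enrichment is what makes the NetCDF outputs self-describing for downstream CF-aware tools.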
Testing
Run the default suite with:
PYTHONPATH=src python -m pytest -q
The strongest automated checks currently include:
- kernel parity and execution-mode regression tests
- ROI/time runner consistency tests
- committed scene and time-series benchmark summary regression tests
- live-or-pregenerated MATLAB parity tests for the single-scene and time-series benchmark gates
Performance Benchmarking
Use the committed kernel benchmark harness to compare eager and compiled execution on your own hardware:
PYTHONPATH=src python scripts/benchmark_kernels.py \
--device cpu \
--dtype float64 \
--batch 32 \
--fixture scope-assets \
--mode compare
Current reference behavior on CPU with torch 2.10.0:
- `fluspect` and `reflectance` show strong steady-state speedups under `torch.compile`, but still require repeated same-shape calls to amortize compile cost.
- `thermal` speeds up in steady state, but the compile break-even is much higher.
- layered `fluorescence` currently fails under `torch.compile` in this environment.
- `leaf_biochemistry` currently becomes slower under `torch.compile` because of scalar-control-flow graph breaks and recompilation churn.
Because of that mix, the package does not enable compiled execution by default.
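The break-even point mentioned above can be estimated from three measurements: the one-time compile cost and the per-call eager and compiled latencies. A small helper with illustrative numbers (not measured on this project):

```python
import math


def breakeven_calls(compile_s: float, eager_s: float, compiled_s: float) -> int:
    """Number of same-shape calls after which compiling pays for itself."""
    saving = eager_s - compiled_s
    if saving <= 0:
        raise ValueError("compiled path is not faster; no break-even exists")
    return math.ceil(compile_s / saving)


# e.g. 20 s compile cost, 50 ms eager, 10 ms compiled per call:
print(breakeven_calls(20.0, 0.050, 0.010))  # 500 calls
```

This is why a kernel that is only marginally faster once compiled, or that recompiles on shape changes, may never recover its compile cost in practice.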
Release Workflows
- `.github/workflows/release.yml`: verifies tag/version alignment, reruns the release-local CPU/docs gates, builds `sdist` and wheel artifacts for `SCOPE-RTM`, validates them with `twine check`, smoke-installs both artifact types through the documented `scope` CLI surface, and then publishes to PyPI on version tags. Manual dispatch still supports TestPyPI or PyPI.
- `.github/workflows/docs.yml`: builds the MkDocs site and deploys it to GitHub Pages.