A PyTorch-native library for hybrid crop modeling.

Welcome to torchcrop

Introduction

torchcrop is a fully differentiable reimplementation of the LINTUL-5 crop growth model (Wolf, 2012). Every step of the simulation — from sowing to harvest — produces valid torch.autograd gradients, so mechanistic crop processes can be combined seamlessly with learnable components (neural residuals, learned stress responses, parameter networks) and calibrated end-to-end with standard torch.optim optimizers.

Features

  • Differentiable Lintul5 — daily forward-Euler simulation of phenology, radiation interception, photosynthesis, partitioning, leaf and root dynamics, water balance and NPK uptake, all as torch.nn.Modules.
  • Batch-first — every state, parameter and driver carries a leading batch dimension [B, ...] so that many sites, years, or parameter sets can be simulated in parallel on GPU.
  • Hybrid modeling hooks — drop-in NeuralResidual, LearnedStressFactor and ParameterNet modules that plug into the mechanistic pipeline.
  • Smooth options — stage-based branching (DVS < 1, maturity, etc.) can be switched between hard torch.where and sigmoid blends for second-order smoothness.
  • Gradient-checked primitives — differentiable AFGEN-style interpolation and soft FST helpers (LIMIT, INSW, NOTNUL) pass torch.autograd.gradcheck.
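The smooth-branching and gradient-checking ideas in the last two bullets can be sketched in plain PyTorch. This is an illustrative analog, not torchcrop's internal implementation; the blend width `tau` is an assumed name:

```python
import torch

def hard_branch(dvs, a, b):
    # Hard stage switch: value a before DVS = 1, value b after.
    return torch.where(dvs < 1.0, a, b)

def soft_branch(dvs, a, b, tau=0.05):
    # Sigmoid blend around DVS = 1: smooth in dvs to all orders.
    w = torch.sigmoid((1.0 - dvs) / tau)
    return w * a + (1.0 - w) * b

dvs = torch.linspace(0.0, 2.0, 9, dtype=torch.float64, requires_grad=True)
a = torch.full_like(dvs, 3.0)
b = torch.full_like(dvs, 1.0)

# Far from the threshold the two variants agree; near it the soft one interpolates.
print(torch.allclose(hard_branch(dvs, a, b)[[0, -1]],
                     soft_branch(dvs, a, b)[[0, -1]]))  # True

# The soft branch passes a numerical gradient check in double precision.
torch.autograd.gradcheck(lambda x: soft_branch(x, a.detach(), b.detach()), (dvs,))
```

The hard `torch.where` form is cheaper but has a zero gradient with respect to `dvs` at the switch, which is why the sigmoid blend matters for second-order optimizers.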

Installation

pip install torchcrop

Quickstart

import torch
import torchcrop
from torchcrop.utils.io import make_constant_weather

weather = make_constant_weather(batch_size=2, n_days=150)
model = torchcrop.Lintul5Model()
output = model(weather, start_doy=60)

print(output.yield_)        # [B] final storage-organ biomass (g m-2)
print(output.lai.shape)     # [B, T+1] LAI trajectory
print(output.dvs.shape)     # [B, T+1] development stage trajectory
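The batch-first layout of those outputs is easy to picture with a toy daily loop. This is a minimal stand-in for the real simulation engine; the `rgr` parameter and the logistic growth law are invented purely for illustration:

```python
import torch

B, T = 2, 150
rgr = torch.tensor([0.08, 0.12])           # [B] per-site relative growth rate (illustrative)
lai = torch.full((B,), 0.05)               # [B] initial leaf area index
trajectory = [lai]

for _ in range(T):                         # daily forward-Euler step, all sites at once
    lai = lai + rgr * lai * (1.0 - lai / 6.0)
    trajectory.append(lai)

lai_traj = torch.stack(trajectory, dim=1)  # [B, T+1], same layout as output.lai
print(lai_traj.shape)                      # torch.Size([2, 151])
```

Because every state carries the leading `[B]` dimension, adding sites or parameter sets costs one larger tensor rather than a Python loop.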

Gradient-based parameter calibration

import torch
import torch.nn as nn
from torchcrop import Lintul5Model, CropParameters

crop = CropParameters().to(dtype=torch.float64)
crop.rue = nn.Parameter(torch.tensor(3.0, dtype=torch.float64))

model = Lintul5Model(crop_params=crop).double()
optimizer = torch.optim.Adam([crop.rue], lr=1e-2)

for _ in range(50):
    optimizer.zero_grad()
    out = model(weather.to(torch.float64), start_doy=60)   # weather from the Quickstart
    loss = ((out.yield_ - observed_yield) ** 2).mean()     # observed_yield: [B] measured yields
    loss.backward()
    optimizer.step()

Hybrid modeling

Inject a neural residual on top of the mechanistic photosynthesis output:

from torchcrop.nn import NeuralResidual

residual = NeuralResidual(input_dim=8, output_dim=1, hidden_dim=32, scale=0.1)
hybrid = torchcrop.Lintul5Model(
    residual_modules={"photosynthesis": residual},
)

All parameters — mechanistic and neural — are surfaced by hybrid.parameters() and can be optimized jointly.
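Conceptually, a residual module adds a small learned correction on top of a mechanistic rate. A pure-PyTorch sketch of that pattern (a stand-in, not torchcrop's actual NeuralResidual) looks like:

```python
import torch
import torch.nn as nn

class TinyResidual(nn.Module):
    """Illustrative residual module: rate' = rate * (1 + scale * NN(features))."""
    def __init__(self, input_dim, hidden_dim=32, scale=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, hidden_dim), nn.Tanh(), nn.Linear(hidden_dim, 1),
        )
        self.scale = scale                  # keeps the correction small at init

    def forward(self, mechanistic_rate, features):
        # Multiplicative residual: the neural net only nudges the mechanistic rate.
        return mechanistic_rate * (1.0 + self.scale * self.net(features).squeeze(-1))

residual = TinyResidual(input_dim=8)
rate = torch.rand(4)                        # [B] mechanistic photosynthesis rate
features = torch.rand(4, 8)                 # [B, 8] state/driver features
corrected = residual(rate, features)

# The mechanistic rate and the neural weights share one autograd graph,
# so a single optimizer step can update both.
print(corrected.shape)                      # torch.Size([4])
```

The multiplicative form with a small `scale` means an untrained residual leaves the mechanistic model nearly unchanged, which keeps early training stable.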

Package layout

torchcrop/
├── model.py                   # Lintul5Model(nn.Module)
├── engine.py                  # SimulationEngine time-stepping loop
├── config.py                  # RunConfig
├── parameters/                # CropParameters / SoilParameters / SiteParameters
├── drivers/weather.py         # WeatherDriver [B, T, C]
├── states/model_state.py      # ModelState tensor container
├── processes/                 # Biophysical processes (astro, phenology,
│                              # irradiation, evapotranspiration, water_balance,
│                              # photosynthesis, partitioning, leaf_dynamics,
│                              # root_dynamics, nutrient_demand, stress)
├── functions/                 # Differentiable primitives (AFGEN, FST, smoothing)
├── nn/                        # NeuralResidual, LearnedStressFactor, ParameterNet
└── utils/                     # I/O, visualisation, validation helpers

Development

pytest                    # run the test suite
flake8 torchcrop tests    # lint
black torchcrop tests     # format
pre-commit run --all-files

References

Wolf, J. (2012). User guide for LINTUL5: Simple generic model for simulation of crop growth under potential, water limited and nitrogen, phosphorus and potassium limited conditions. Wageningen UR, Wageningen, The Netherlands.
