
torchcrop


A PyTorch-native library for hybrid crop modeling.

torchcrop is a fully differentiable reimplementation of the Lintul5 crop growth model (Wolf, 2012). Every step of the simulation — from sowing to harvest — produces valid torch.autograd gradients, so mechanistic crop processes can be combined seamlessly with learnable components (neural residuals, learned stress responses, parameter networks) and calibrated end-to-end with standard torch.optim optimizers.
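
The differentiability claim can be illustrated in miniature. The toy forward-Euler loop below is not torchcrop's actual equations, only a sketch of the principle: as long as each daily state update is built out of tensor operations, gradients of the final state with respect to a parameter flow back through the entire season.

```python
import torch

# Toy sketch (not torchcrop's equations): one state variable updated daily
# by forward Euler, differentiable w.r.t. a growth-rate parameter.
rate = torch.tensor(0.05, requires_grad=True)
biomass = torch.zeros(1)
for day in range(100):
    # Out-of-place update keeps the autograd graph intact across days.
    biomass = biomass + rate * (1.0 - biomass)

final = biomass.sum()
final.backward()
print(rate.grad)  # a finite gradient through all 100 simulated days
```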

Features

  • Differentiable Lintul5 — daily forward-Euler simulation of phenology, radiation interception, photosynthesis, partitioning, leaf and root dynamics, water balance and NPK uptake, all as torch.nn.Modules.
  • Batch-first — every state, parameter and driver carries a leading batch dimension [B, ...] so that many sites, years, or parameter sets can be simulated in parallel on GPU.
  • Hybrid modeling hooks — drop-in NeuralResidual, LearnedStressFactor and ParameterNet modules that plug into the mechanistic pipeline.
  • Smooth options — stage-based branching (DVS < 1, maturity, etc.) can be switched between hard torch.where and sigmoid blends for second-order smoothness.
  • Gradient-checked primitives — differentiable AFGEN-style interpolation and soft FST helpers (LIMIT, INSW, NOTNUL) pass torch.autograd.gradcheck.
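
The hard/smooth switching mentioned above can be sketched in plain PyTorch (illustrative only; torchcrop's own helpers may use different names and sharpness defaults). A `torch.where` branch is piecewise and kinked at the threshold, while a sigmoid blend approximates the same branch with an infinitely differentiable function:

```python
import torch

def hard_switch(dvs, before, after, threshold=1.0):
    # Hard stage branch: exact, but kinked at the threshold.
    return torch.where(dvs < threshold, before, after)

def smooth_switch(dvs, before, after, threshold=1.0, sharpness=20.0):
    # Sigmoid blend: smooth approximation of the same branch.
    w = torch.sigmoid(sharpness * (dvs - threshold))
    return (1.0 - w) * before + w * after

dvs = torch.linspace(0.0, 2.0, 5)
before = torch.full_like(dvs, 0.3)
after = torch.full_like(dvs, 0.8)
h = hard_switch(dvs, before, after)
s = smooth_switch(dvs, before, after)
print(h)
print(s)  # agrees with h far from the threshold, blends near it
```

Far from the threshold the two agree to high precision; near `dvs = threshold` the blend trades a small bias for second-order smoothness.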

Installation

pip install torchcrop

Quickstart

import torch
import torchcrop
from torchcrop.utils.io import make_constant_weather

weather = make_constant_weather(batch_size=2, n_days=150)
model = torchcrop.Lintul5Model()
output = model(weather, start_doy=60)

print(output.yield_)        # [B] final storage-organ biomass (g m-2)
print(output.lai.shape)     # [B, T+1] LAI trajectory
print(output.dvs.shape)     # [B, T+1] development stage trajectory

Gradient-based parameter calibration

import torch
import torch.nn as nn
from torchcrop import Lintul5Model, CropParameters

# `weather` is the driver from the quickstart; `observed_yield` is a [B]
# tensor of measured yields.
crop = CropParameters().to(dtype=torch.float64)
crop.rue = nn.Parameter(torch.tensor(3.0, dtype=torch.float64))

model = Lintul5Model(crop_params=crop).double()
optimizer = torch.optim.Adam([crop.rue], lr=1e-2)

for _ in range(50):
    optimizer.zero_grad()
    out = model(weather.to(torch.float64), start_doy=60)
    loss = ((out.yield_ - observed_yield) ** 2).mean()
    loss.backward()
    optimizer.step()

Hybrid modeling

Inject a neural residual on top of the mechanistic photosynthesis output:

from torchcrop.nn import NeuralResidual

residual = NeuralResidual(input_dim=8, output_dim=1, hidden_dim=32, scale=0.1)
hybrid = torchcrop.Lintul5Model(
    residual_modules={"photosynthesis": residual},
)

All parameters — mechanistic and neural — are surfaced by hybrid.parameters() and can be optimized jointly.
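
The residual pattern itself is simple to sketch in plain PyTorch. The class below is a hypothetical stand-in, not torchcrop's actual `NeuralResidual` implementation (its real signature may differ): a small MLP produces a scaled correction that is added to the mechanistic output.

```python
import torch
import torch.nn as nn

class ResidualCorrection(nn.Module):
    """Hypothetical sketch: adds a scaled MLP correction to a mechanistic output."""

    def __init__(self, input_dim, output_dim, hidden_dim=32, scale=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, output_dim),
        )
        self.scale = scale

    def forward(self, mechanistic_out, features):
        # Small learned correction on top of the process-based estimate.
        return mechanistic_out + self.scale * self.net(features)

corr = ResidualCorrection(input_dim=8, output_dim=1)
out = corr(torch.ones(2, 1), torch.randn(2, 8))
print(out.shape)  # torch.Size([2, 1])
```

Keeping `scale` small biases the hybrid toward the mechanistic prediction, so the network only learns what the process model misses.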

Package layout

torchcrop/
├── model.py                   # Lintul5Model(nn.Module)
├── engine.py                  # SimulationEngine time-stepping loop
├── config.py                  # RunConfig
├── parameters/                # CropParameters / SoilParameters / SiteParameters
├── drivers/weather.py         # WeatherDriver [B, T, C]
├── states/model_state.py      # ModelState tensor container
├── processes/                 # Biophysical processes (astro, phenology,
│                              # irradiation, evapotranspiration, water_balance,
│                              # photosynthesis, partitioning, leaf_dynamics,
│                              # root_dynamics, nutrient_demand, stress)
├── functions/                 # Differentiable primitives (AFGEN, FST, smoothing)
├── nn/                        # NeuralResidual, LearnedStressFactor, ParameterNet
└── utils/                     # I/O, visualisation, validation helpers

Development

pytest                    # run the test suite
flake8 torchcrop tests    # lint
black torchcrop tests     # format
pre-commit run --all-files

References

Wolf, J. (2012). User guide for LINTUL5: Simple generic model for simulation of crop growth under potential, water limited and nitrogen, phosphorus and potassium limited conditions. Wageningen University.
