KKT-HardNet: Physics-constrained neural networks with hard nonlinear equality and inequality constraint satisfaction guarantees

Project description

kkt-hardnet

kkt-hardnet is the Python package that implements KKT-HardNet.

Install

Editable install from this repository:

pip install -e kkthn

Editable install with CUDA 12 support:

pip install -e "kkthn[cuda12]"

Or install the published package from PyPI:

pip install kkt-hardnet
pip install "kkt-hardnet[cuda12]"

Import

from kkthn import KKTHardNet

Core Methods

  • model() for supervised surrogate learning from parameters.csv and variables.csv
  • optimize() for unsupervised optimization from parameters.csv
  • estimate() for inverse parameter estimation from parameters.csv and variables.csv
  • load(metadata_path) to reload a trained run
  • predict(x) to infer variables for new parameter values

Packaging

The PyPI project name is kkt-hardnet, while the import path remains kkthn. The default install is CPU-oriented; CUDA support is selected explicitly with the cuda12 extra instead of automatic device detection.

Modeling Workflow

Use KKTHardNet to define symbolic constrained problems with:

  • named parameters, decision variables, and optional inverse parameters
  • equality and inequality constraints written as Python expressions
  • optional objectives for optimization tasks
  • hard KKT projection during training and inference
  • saved run artifacts including weights, predictions, history, summary, and metadata

Available methods:

  • dataset(parameters=..., variables=...) to attach training data
  • model() for supervised surrogate learning
  • optimize() for unsupervised optimization
  • estimate() for inverse parameter estimation
  • load(metadata_path) to reload a trained run
  • predict(x) to evaluate a trained or loaded model on new inputs

The example below shows one complete workflow. The CSV column names must match the declared parameter and variable names. Each run writes <model_name>_<timestamp>/ with parameters.csv, optional variables.csv, history.csv, predictions.csv, model_weights.npz, summary.json, and metadata.json.
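For illustration, a parameters.csv / variables.csv pair for the example below might be created as follows. The values are hypothetical placeholders; the only requirement stated above is that the column headers match the declared parameter and variable names:

```python
import csv

# Hypothetical training data. Column headers must match the declared
# parameter names ("x1", "x2") and variable names ("y1", "y2", "y3").
with open("parameters.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["x1", "x2"])
    writer.writerows([[1.0, 2.0], [0.5, -1.0], [2.0, 0.0]])

with open("variables.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["y1", "y2", "y3"])
    writer.writerows([[0.1, 1.0, 0.1], [0.2, 0.3, 0.4], [0.0, 2.0, 0.2]])

# Sanity check: the header row round-trips as declared.
with open("parameters.csv") as f:
    header = next(csv.reader(f))
print(header)  # → ['x1', 'x2']
```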

from kkthn import KKTHardNet

TRAIN = {
    "epochs": 1200,
    "batch_size": 32,
    "learning_rate": 1e-3,
    "train_frac": 0.8,
    "hidden_size": 64,
    "hidden_layers": 2,
    "seed": 42,
    "dtype": "float64",
    "print_every": 1,
    "newton_step_length": 0.5,
    "newton_tol": 1e-6,
    "newton_reg_factor": 1e-3,
    "max_newton_iter": 30,
    "max_backtrack_iter": 10,
}

# Build a symbolic problem.
model = KKTHardNet(name="demo_model", train=TRAIN)
x = model.add_parameter(["x1", "x2"])
theta = model.add_inverse_parameter(["a0", "a1"], init_value=[10.0, -10.0])
y = model.add_variable(["y1", "y2", "y3"])

# Objectives are optional for surrogate modeling and inverse estimation,
# but required for optimize().
model.objective = 0.5 * (y.y1**2 + y.y2**2 + y.y3**2)
model.constraints.add(
    theta.a0 * y.y1 + y.y2 - x.x1 == 0,
    y.y2 - theta.a1 * y.y3 - x.x2 == 0,
    y.y1**2 + y.y3**2 <= 2.0,
    y.y1 >= 0,
)

# Attach data.
# For model() or estimate(): provide both parameters and variables.
# For optimize(): provide only parameters.
model.dataset(parameters="parameters.csv", variables="variables.csv")

# Choose one training mode.
surrogate_result = model.model()
# optimize_result = model.optimize()
# estimate_result = model.estimate()

# Reload a saved run later and predict on a new parameter vector.
reloaded = KKTHardNet()
reloaded.load("demo_model_20260414_120000/metadata.json")
prediction = reloaded.predict([1.0, 2.0])
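Because the projection enforces the constraints exactly, any candidate output can be checked numerically against the symbolic problem above. The helper below is a plain-Python sketch (not part of the kkthn API) that evaluates the example's two equalities and two inequalities for given x, theta, and y:

```python
def check_constraints(x, theta, y, tol=1e-8):
    """Verify the example's constraints for parameters x = (x1, x2),
    inverse parameters theta = (a0, a1), and variables y = (y1, y2, y3)."""
    x1, x2 = x
    a0, a1 = theta
    y1, y2, y3 = y
    return (
        abs(a0 * y1 + y2 - x1) <= tol       # a0*y1 + y2 - x1 == 0
        and abs(y2 - a1 * y3 - x2) <= tol   # y2 - a1*y3 - x2 == 0
        and y1**2 + y3**2 <= 2.0 + tol      # y1^2 + y3^2 <= 2
        and y1 >= -tol                      # y1 >= 0
    )

# A feasible point for x = (2.0, 2.0) at the initial theta = (10.0, -10.0):
print(check_constraints((2.0, 2.0), (10.0, -10.0), (0.1, 1.0, 0.1)))  # → True
```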

⚠️ Please cite our work if you use this code in your research. Citation formats are provided below.

arXiv Preprint: https://arxiv.org/pdf/2507.08124
Journal: https://doi.org/10.1016/j.compchemeng.2025.109418

@article{iftakher2025physics,
  title={Physics-informed neural networks with hard nonlinear equality and inequality constraints},
  author={Iftakher, Ashfaq and Golder, Rahul and Nath Roy, Bimol and Hasan, MM Faruque},
  journal={Computers \& Chemical Engineering},
  pages={109418},
  year={2025},
  publisher={Elsevier}
}

Download files

Download the file for your platform.

Source Distribution

kkt_hardnet-1.0.1.tar.gz (31.6 kB)

Built Distribution


kkt_hardnet-1.0.1-py3-none-any.whl (35.8 kB)

File details

Details for the file kkt_hardnet-1.0.1.tar.gz.

File metadata

  • Download URL: kkt_hardnet-1.0.1.tar.gz
  • Size: 31.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.9.25

File hashes

Hashes for kkt_hardnet-1.0.1.tar.gz:

  • SHA256: f70978528112373fe162fe4dfbfd8b44a5ec3920b5f9b9117fd60638651a4a9c
  • MD5: 2cd07bfc715a6fe2d26d51efa8bb7377
  • BLAKE2b-256: 0aeebd3e11b5441df365b7dd5c6eee226584e77891a72e2a58595d8d8969d386

File details

Details for the file kkt_hardnet-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: kkt_hardnet-1.0.1-py3-none-any.whl
  • Size: 35.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.9.25

File hashes

Hashes for kkt_hardnet-1.0.1-py3-none-any.whl:

  • SHA256: 5d500ace7648ac69783d521a9e640ddda0912dd61240eef3c0163668bb8232fe
  • MD5: ee60a8c26e673534d62e07a9a0c7359a
  • BLAKE2b-256: 176d573c7d2f0e567c233626d235bda9a7a91755a4f820b74654489e7b0a0dcd
