# interpy_bg
interpy_bg is a feedforward neural network library designed for 5D → 1D interpolation.
It provides modular classes for defining, training, and testing neural networks, with built-in normalization, RMSE tracking, and plotting utilities.
## Features
- Feedforward neural networks with customizable hidden layers
- L2 regularization
- Training with RMSE tracking and validation split
- Normalization of input data
- Save/load trained weights, normalization values, and model metadata
- Simple plotting of training/validation loss and predictions
- Dataset validation/standardisation with train/val/test splits
- Synthetic 5D data generator utilities (via the `interpy_synth` dependency)
## Installation
Local/dev (installs interpy_bg + interpy_synth + fivedreg_tf from PyPI):

```shell
cd backend
pip install -r requirements.lock
```
PyPI:

```shell
pip install interpy_bg   # NumPy backend (pulls interpy-synth)
pip install fivedreg_tf  # TF backend (pulls interpy-synth + tensorflow)
```
Docker:

```shell
cd ..
./scripts/docker_build.sh
./scripts/docker_up.sh    # backend on :8000
# ./scripts/docker_down.sh to stop
```
Environment:

- Configure CORS via `ALLOWED_ORIGINS` (comma-separated); e.g. copy `backend/.env.example`.
- CPU-only: no GPU required; TensorFlow uses the CPU build.
- For reproducibility, prefer `requirements.lock`.
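For illustration, a minimal `backend/.env` might look like the following. Only the `ALLOWED_ORIGINS` variable name comes from the docs above; the origin URLs are hypothetical placeholders:

```shell
# backend/.env (illustrative values only)
ALLOWED_ORIGINS=http://localhost:3000,https://example.com
```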
## Quick Start

### Training a model
```python
import os
import pickle

import numpy as np

from interpy_bg.trainer import Trainer

# Dummy dataset
X = np.random.rand(50, 5)
y = np.random.rand(50, 1)

# Output directory for weights, normalization values, and plots
output_dir = "outputs_numpy"
os.makedirs(output_dir, exist_ok=True)

# Save training data to a pickle file as a dictionary with keys "X" and "y"
train_pkl = os.path.join(output_dir, "train_data.pkl")
with open(train_pkl, "wb") as f:
    pickle.dump({"X": X, "y": y}, f)

# Initialize trainer
trainer = Trainer(
    directory=output_dir,
    hidden_sizes=[16, 8],
    Lambda=0.01,             # optional; defaults to 0.01
    epochs=300,              # reduce for quicker runs
    learning_rate=0.01,      # optional; defaults to 0.01
    train_val_split=0.8,     # optional; defaults to 0.8
    beta1=0.9,               # optional; defaults to 0.9
    beta2=0.999,             # optional; defaults to 0.999
    epsilon=1e-8,            # optional; defaults to 1e-8
    activation="relu",       # optional: sigmoid/tanh/relu/leakyrelu
    weight_init="auto",      # optional: auto/he/xavier
    batch_size=32,           # optional: mini-batching
    grad_clip=5.0,           # optional: gradient clipping
    early_stop_patience=20,  # optional: early stopping
    lr_decay=0.98,           # optional: LR decay per epoch
    seed=42,                 # optional: reproducibility
)

# Train the model using the pickle file path
train_loss, val_loss = trainer.train(train_pkl)
```
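Once training returns, the loss histories can be inspected directly. A minimal sketch, assuming `train_loss`/`val_loss` are per-epoch RMSE sequences as described above (the values here are illustrative placeholders, not real training output):

```python
# Illustrative per-epoch RMSE histories, standing in for Trainer.train output
train_loss = [0.40, 0.31, 0.26, 0.24, 0.23]
val_loss = [0.42, 0.34, 0.30, 0.29, 0.31]

# Find the epoch with the lowest validation RMSE
best_epoch = min(range(len(val_loss)), key=lambda i: val_loss[i])
print(f"best val RMSE {val_loss[best_epoch]:.4f} at epoch {best_epoch + 1}")
# → best val RMSE 0.2900 at epoch 4
```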
### Testing a model
```python
from interpy_bg.tester import Tester

# Use the same output directory where the model was saved
output_dir = "outputs_numpy"

tester = Tester(
    hidden_sizes=[16, 8],
    Lambda=0.01,
    directory=output_dir,
    activation="relu",
    weight_init="auto",
)

predictions = tester.predict(X)  # can also pass a .pkl file with test data
```
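To evaluate predictions against held-out targets, the RMSE that the library tracks during training can be recomputed in a couple of lines. A self-contained sketch with placeholder arrays (standing in for `y` and `Tester.predict` output):

```python
import numpy as np

# Placeholder targets and predictions with the library's (n, 1) shape
y_true = np.array([[0.2], [0.5], [0.9]])
y_pred = np.array([[0.25], [0.45], [0.85]])

# Root-mean-square error between targets and predictions
rmse = float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
print(f"RMSE: {rmse:.4f}")
# → RMSE: 0.0500
```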
### Plotting results
```python
from interpy_bg.plotter import plot_loss, plot_predictions

output_dir = "outputs_numpy"
plot_loss(train_loss, val_loss, "rmse_vs_epochs.png", output_dir)
plot_predictions(y, predictions, "ytrue_vs_ypred.png", output_dir)
```
### Synthetic data
```python
from interpy_synth import synthetic_5d, synthetic_5d_pickle

# Generate arrays
X, y = synthetic_5d(1000, seed=42)

# Persist with metadata
path = synthetic_5d_pickle("outputs_numpy/synth.pkl", n=1000, seed=42)
```
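The pickle format used throughout the library is a plain dict with `"X"` and `"y"` keys, as in the training example. A round-trip sketch using stand-in random arrays (not the actual `interpy_synth` generator):

```python
import os
import pickle
import tempfile

import numpy as np

# Stand-in 5D -> 1D dataset with the same shapes the library expects
rng = np.random.default_rng(42)
X, y = rng.random((1000, 5)), rng.random((1000, 1))

# Write the dict-with-X/y pickle, then read it back
path = os.path.join(tempfile.gettempdir(), "synth_demo.pkl")
with open(path, "wb") as f:
    pickle.dump({"X": X, "y": y}, f)
with open(path, "rb") as f:
    data = pickle.load(f)

print(data["X"].shape, data["y"].shape)
# → (1000, 5) (1000, 1)
```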
## Tests
- All backend tests live in `backend/tests/` (API, NumPy, TensorFlow, synthetic, performance).
- Run with `python -m pytest backend/tests`.
## Notes
- NumPy training writes `model_weights.npz`, `normalisation_values.npz`, plots, and `model_metadata.json` (architecture, Lambda, activation/init, batch/clip/seed, best metrics incl. R²) into `backend/outputs_numpy/` (when running via the API).
- TensorFlow training (set `model_type=tf` on `/train`) writes `model_tf.keras`, `normalisation_values_tf.npz`, plots, and `tf_model_metadata.json` into `backend/outputs_tf/` (served alongside NumPy artifacts).
- Prediction (`/predict` or `Tester.predict`) uses the trained architecture/config stored in metadata; client-supplied hidden sizes or Lambda are ignored. `/predict` also accepts `model_type` to choose NumPy vs TF.
- API endpoints include `/health`, `/upload` (accepts a .pkl dict with X/y and returns dataset stats), `/train`, `/predict`, `/plots/(unknown)`, `/artifacts/(unknown)` (serves NumPy or TF artifacts from their respective output folders), and `/evaluate` (prefers NumPy artifacts, falls back to TF if present). `/reset` clears uploads plus both output folders (`backend/outputs_numpy/` and `backend/outputs_tf/`).
- Plotting uses the headless `Agg` backend in both packages for compatibility with servers/CI.
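The headless-plotting note above can be reproduced in a few lines of plain matplotlib; the figure contents here are placeholders and nothing below comes from the library itself:

```python
import os
import tempfile

import matplotlib
matplotlib.use("Agg")  # select the headless backend before importing pyplot
import matplotlib.pyplot as plt

# Render and save a figure with no display attached, as on a server or in CI
fig, ax = plt.subplots()
ax.plot([1, 2, 3], [0.40, 0.30, 0.25], label="train RMSE")
ax.set_xlabel("epoch")
ax.set_ylabel("RMSE")
ax.legend()

out_path = os.path.join(tempfile.gettempdir(), "rmse_demo.png")
fig.savefig(out_path)
plt.close(fig)

print(os.path.exists(out_path))
```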
## Documentation

Full API documentation is hosted on ReadTheDocs, covering every class, method, and plotting utility.
## License
MIT License
## Download files
### Source distribution: interpy_bg-0.1.2.tar.gz

- Upload date:
- Size: 3.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.16

| Algorithm | Hash digest |
|---|---|
| SHA256 | `72cb57348275df24cc4eaa743168a17ea292e1c3a0ca457d420fa93ff9fd2b39` |
| MD5 | `80ef92f6345c484e212a07d3f1ea8417` |
| BLAKE2b-256 | `72f204f62ad6da8fea77fa87d1c3c419f3992ebf6f7113bb9843b3cc3c57cfd8` |
### Built distribution: interpy_bg-0.1.2-py3-none-any.whl

- Upload date:
- Size: 3.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.16

| Algorithm | Hash digest |
|---|---|
| SHA256 | `8c38664d2d4ae66570c731f843a0cfc274b2145d6ad0e6217b73f7a56c54d98d` |
| MD5 | `c2a273691cfa606b2e6302afdaf2e9ea` |
| BLAKE2b-256 | `6a4fea15eae90c5c15ebe3ab3290253eb380a2cc6c105bff246d6e3d29e5c5d3` |