
PSANN - Parameterized Sine-Activated Neural Networks

PSANN packages sine-activated Torch models behind a sklearn-style estimator surface. The stack combines:

  • learnable sine activations with SIREN-friendly initialisation,
  • optional learned sparse (LSM) expanders and scalers,
  • persistent state controllers for streaming inference, and
  • Horizon-Informed Sampling Strategy Optimisation (HISSO) for episodic training.

The current line targets primary outputs only, so there are no predictive extras, secondary heads, or legacy growth schedules to maintain.

Quick links:

  • API reference: docs/API.md
  • Scenario walkthroughs: docs/examples/README.md
  • Migration notes: docs/migration.md
  • Contributor guide: docs/CONTRIBUTING.md
  • Technical design notes: TECHNICAL_DETAILS.md

Installation

python -m venv .venv
.\.venv\Scripts\Activate.ps1   # Windows PowerShell
# source .venv/bin/activate     # macOS/Linux
pip install --upgrade pip
pip install -e .                # editable install from source

Optional extras in pyproject.toml:

  • psann[sklearn]: adds scikit-learn conveniences for estimator mixins and metrics.
  • psann[viz]: plotting helpers used in benchmarks and notebooks.
  • psann[dev]: pytest, ruff, black.

Need pre-pinned builds (e.g. on Windows or in air-gapped environments)? Use the compatibility constraints file:

pip install -e . -c requirements-compat.txt

pyproject.toml is the authoritative dependency list. requirements-compat.txt mirrors the newest widely available wheels for NumPy, SciPy, and scikit-learn when you need lockstep installs.
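A constraints file of this kind is just a list of exact pins passed via pip's -c flag. The version numbers below are illustrative only, not the actual contents of requirements-compat.txt:

```
# illustrative constraints file (versions are placeholders)
numpy==1.26.4
scipy==1.11.4
scikit-learn==1.3.2
```

pip resolves dependencies normally but refuses to select versions outside these pins, which keeps installs reproducible across machines.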

Running Tests

Install the development extras in editable mode so the test suite imports the packaged code without manual sys.path tweaks:

pip install -e .[dev]
python -m pytest

The suite exercises the supported supervised, streaming, and HISSO flows. GPU-specific checks are skipped automatically when CUDA is unavailable.

Common linting commands:

python -m ruff check src tests examples
python -m ruff format --check .

Quick Start

Supervised regression

import numpy as np
from psann import PSANNRegressor

X = np.linspace(-4, 4, 1000, dtype=np.float32).reshape(-1, 1)
y = 0.8 * np.exp(-0.25 * np.abs(X)) * np.sin(3.5 * X)

model = PSANNRegressor(
    hidden_layers=2,
    hidden_units=64,
    epochs=200,
    lr=1e-3,
    early_stopping=True,
    patience=20,
    random_state=42,
)
model.fit(X, y, verbose=1)
print("R^2:", model.score(X, y))

Behind the scenes, the estimator normalises arguments via normalise_fit_args and prepares data and scalers through psann.estimators._fit_utils.prepare_inputs_and_scaler, so the dense, residual, and convolutional variants share the same fit surface.

Episodic HISSO with HISSOOptions

import numpy as np
from psann import PSANNRegressor, get_reward_strategy, HISSOOptions

rng = np.random.default_rng(7)
X = rng.normal(size=(512, 4)).astype(np.float32)
targets = np.sin(X.sum(axis=1, keepdims=True)).astype(np.float32)

model = PSANNRegressor(hidden_layers=2, hidden_units=48, epochs=40, batch_size=64)
model.fit(X, targets, verbose=1)  # supervised warm start

finance = get_reward_strategy("finance")
options = HISSOOptions.from_kwargs(
    window=64,
    reward_fn=finance.reward_fn,
    context_extractor=finance.context_extractor,
    primary_transform="softmax",
    transition_penalty=0.05,
    input_noise=0.0,
    supervised={"y": targets},
)

model.fit(
    X,
    y=None,
    hisso=True,
    hisso_window=options.episode_length,
    hisso_reward_fn=options.reward_fn,
    hisso_context_extractor=options.context_extractor,
    hisso_primary_transform=options.primary_transform,
    hisso_transition_penalty=options.transition_penalty,
    hisso_supervised=options.supervised,
    verbose=1,
)

HISSOOptions keeps reward, context, noise, and transformation choices in one place. The estimator records the resolved options after fitting so helpers such as psann.hisso.hisso_infer_series and psann.hisso.hisso_evaluate_reward can reuse them.
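A registered strategy pairs a reward function with a context extractor. The toy pair below illustrates that shape with plain NumPy; the reward logic and signatures are illustrative stand-ins, not the packaged finance strategy:

```python
import numpy as np

def context_extractor(window):
    """Toy context: one-step differences of the episode window,
    padded with the first row so it aligns with the outputs."""
    return np.diff(window, axis=0, prepend=window[:1])

def reward_fn(primary, context):
    """Toy reward: mean context-weighted primary output. Mimics
    the (reward_fn, context_extractor) pairing HISSOOptions expects."""
    return float(np.mean(primary * context))

window = np.arange(8.0).reshape(-1, 1)   # fake (T, F) episode window
primary = np.full((8, 1), 0.5)           # fake model outputs
context = context_extractor(window)
print(reward_fn(primary, context))       # 0.4375
```

Any callables with compatible shapes can be dropped into HISSOOptions.from_kwargs the same way the packaged strategies are.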

Custom data preparation

from psann import PSANNRegressor
from psann.estimators._fit_utils import normalise_fit_args, prepare_inputs_and_scaler

est = PSANNRegressor(hidden_layers=1, hidden_units=16, scaler="standard")
fit_args = normalise_fit_args(est, X_train, y_train, hisso=False, verbose=0, lr_max=None, lr_min=None)
prepared, primary_dim, _ = prepare_inputs_and_scaler(est, fit_args)
# prepared.train_inputs / prepared.train_targets feed straight into custom loops

This keeps bespoke research loops aligned with the estimator's preprocessing contract without relying on deprecated extras heads.

Core components

  • Sine activations (psann.activations.SineParam) expose learnable amplitude, frequency, and decay with optional bounds and SIREN-friendly initialisation.
  • LSM expanders (psann.lsm) provide sparse learned feature maps; build_preprocessor wires dict specs or modules into estimators with optional pretraining and separate learning rates.
  • State controllers (psann.state.StateController) keep per-feature persistent gains for streaming/online workflows. Configurable via StateConfig.
  • Shared fit helpers (psann.estimators._fit_utils) normalise arguments, materialise scalers, route through residual and convolutional builders, and orchestrate HISSO plans.
  • Wave backbones (psann.models) surface WaveResNet, WaveEncoder, WaveRNNCell, and scan_regimes for standalone experiments and spectral diagnostics outside the sklearn wrappers.
  • HISSO (psann.hisso) offers declarative reward configuration (HISSOOptions), supervised warm starts, episode construction, and inference helpers that reuse the cached configuration.
  • Utilities (psann.utils) include Jacobian/NTK probes, participation ratio, mutual-information proxies, and linear probes for diagnostics.
  • Token helpers (SimpleWordTokenizer, SineTokenEmbedder) remain for experiments that need sine embeddings, but no language-model trainer ships in this release.
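The parameterized-sine idea behind SineParam can be sketched in NumPy. The functional form and parameter names below are assumptions for illustration (they mirror the quickstart's target function), not the library's implementation:

```python
import numpy as np

def sine_param(x, amplitude=1.0, frequency=1.0, decay=0.0):
    """Illustrative parameterized sine activation:
    amplitude * exp(-decay * |x|) * sin(frequency * x).
    In PSANN, amplitude, frequency, and decay are learnable."""
    return amplitude * np.exp(-decay * np.abs(x)) * np.sin(frequency * x)

x = np.linspace(-np.pi, np.pi, 5)
print(sine_param(x, amplitude=0.8, frequency=3.5, decay=0.25))
```

With decay=0 and unit amplitude this reduces to a plain SIREN-style sin(omega * x); the extra parameters let each unit shape its own oscillation envelope.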

HISSO at a glance

  1. Call HISSOOptions.from_kwargs(...) (or supply equivalent kwargs to fit) to resolve episode length, reward function, primary transform, transition penalty, context extractor, and optional noise.
  2. Provide hisso_supervised to run a warm-start supervised phase before episodic optimisation.
  3. PSANNRegressor.fit(..., hisso=True, ...) builds the episodic trainer using the shared fit pipeline.
  4. After training, hisso_infer_series(estimator, series) and hisso_evaluate_reward(estimator, series, targets=None) reuse the cached configuration to score new data.
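The episodic mechanics in steps 1-4 can be sketched with plain NumPy: slice a series into fixed-length windows, map raw outputs through a softmax primary transform, and charge a penalty on step-to-step allocation changes. This is a conceptual sketch, not the trainer's actual code:

```python
import numpy as np

def make_episodes(series, episode_length):
    """Slice a (T, F) series into non-overlapping (n, episode_length, F) windows."""
    n = len(series) // episode_length
    return series[: n * episode_length].reshape(n, episode_length, -1)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def penalized_reward(raw_outputs, base_reward, transition_penalty=0.05):
    """Apply the softmax primary transform, then subtract a cost
    proportional to the total step-to-step allocation change."""
    alloc = softmax(raw_outputs)
    turnover = np.abs(np.diff(alloc, axis=0)).sum()
    return base_reward - transition_penalty * turnover

series = np.arange(20.0).reshape(-1, 1)          # T=20, F=1
episodes = make_episodes(series, episode_length=8)
print(episodes.shape)                            # (2, 8, 1)
```

Constant outputs incur no penalty, while frequent switching between outputs steadily erodes the reward, which is the behaviour transition_penalty is meant to induce.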

The project ships CPU benchmark baselines (docs/benchmarks/) and CI scripts (scripts/benchmark_hisso_variants.py, scripts/compare_hisso_benchmarks.py) to catch HISSO regressions.

Docs and examples

  • Examples live in examples/; see docs/examples/README.md for the curated list (supervised, streaming, HISSO, benchmarks, diagnostics).
  • Detailed internals are captured in TECHNICAL_DETAILS.md.
  • Reward registry usage and custom strategy registration are described in docs/API.md under the HISSO section.

Current status and roadmap

  • Predictive extras and growth schedules have been removed; legacy extras_* arguments are still accepted but ignored, with warnings, for backward compatibility.
  • Terminology has converged on transition_penalty within HISSO; the trans_cost alias still functions but will be removed in a later release.
  • CPU benchmarks run in CI; GPU baselines remain on the roadmap once shared hardware is available.
  • Upcoming work highlighted in CLEANUP_TODO.md includes broader reward coverage, lint/type sweeps, and release tooling improvements.
