Perturbed Saddle-escape Descent (PSD) optimizer for PyTorch

Project description

Perturbed Saddle-escape Descent (PSD)

Project Summary

This repository implements the Perturbed Saddle-escape Descent (PSD) algorithm for escaping saddle points in non-convex optimisation problems, as described in Alpay and Alakkad (2025). It contains reference NumPy implementations, framework-specific optimisers for PyTorch and TensorFlow, and utilities for reproducing the synthetic experiments reported in the accompanying manuscript.

Features

  • Reference implementations of PSD, PSD-Probe and baseline gradient descent variants in pure NumPy.
  • Suite of analytic test functions with gradients and Hessians.
  • Synthetic data generator producing the tables and figures used in the paper (experiments.py).
  • Framework-specific optimisers: PSDTorch, PSDTensorFlow and a PSDOptimizer/PerturbedAdam package for PyTorch.
  • Example training scripts for MNIST and CIFAR-10.

Technology Stack

The core project depends on the following libraries:

Library               Purpose
numpy                 numerical routines for reference implementations
torch, torchvision    deep-learning framework and datasets
optuna                hyper-parameter search utilities
matplotlib            visualisation in notebooks

Python 3.8 or later is required.

Installation

Install the published optimiser package:

pip install psd-optimizer

Or install the repository in editable mode for development:

git clone https://github.com/farukalpay/PSD.git
cd PSD
pip install -e ".[dev]"

Quick Start

import numpy as np
from psd import algorithms, functions

x0 = np.array([1.0, -1.0])
x_star, _ = algorithms.gradient_descent(x0, functions.SEPARABLE_QUARTIC.grad)

Further examples are available in the examples/ directory and the documentation.

Usage

Using the Reference Algorithms

The core PSD routines and test functions can be imported from the psd package:

import numpy as np
from psd import algorithms, functions

x0 = np.array([1.0, -1.0])
x_star, _ = algorithms.gradient_descent(x0, functions.SEPARABLE_QUARTIC.grad)

This structure allows you to experiment with the reference NumPy implementations directly in your projects.
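To make the escape mechanism concrete without depending on the package's internals, here is a minimal, self-contained NumPy sketch of the perturbation idea behind PSD: when the gradient becomes small (a candidate saddle), inject a small random perturbation so the iterate can slide off along a negative-curvature direction. The function, step sizes, and perturbation radius below are illustrative choices, not the package's calibrated constants.

```python
import numpy as np

# Saddle test function f(x, y) = x^2 - y^2: the origin is a strict saddle,
# and plain gradient descent started on the x-axis stalls there forever.
def grad(p):
    return np.array([2.0 * p[0], -2.0 * p[1]])

def perturbed_descent(p, lr=0.1, radius=1e-3, tol=1e-6, max_iter=1000, seed=0):
    rng = np.random.default_rng(seed)
    p = p.astype(float)
    for _ in range(max_iter):
        g = grad(p)
        if np.linalg.norm(g) < tol:
            # Near-zero gradient: add an isotropic perturbation so the
            # iterate picks up a component along the negative-curvature
            # (y) direction and can escape the saddle.
            p = p + rng.normal(scale=radius, size=p.shape)
        else:
            p = p - lr * g
        if np.linalg.norm(p) > 10.0:
            break  # clearly escaped the saddle region; stop the demo
    return p

p = perturbed_descent(np.array([1.0, 0.0]))
```

Started exactly on the x-axis, unperturbed gradient descent converges to the saddle at the origin; with the perturbation step, the iterate acquires a small y-component that the dynamics then amplify, and the final point has left the saddle's neighbourhood along y.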

Generating Synthetic Data

python experiments.py

The command writes CSV summaries to results/ and training curves to data/.

Performance

Profiling identified rosenbrock_hess, the routine that computes the Rosenbrock Hessian, as a hot path. Vectorising the computation removed the explicit Python loops and yielded the following improvements at dimension 1000:

Version   Mean time (ms)   Peak memory (MB)
Before    3.52             8.00
After     1.01             8.04

Benchmarking is automated via pytest-benchmark using a fixed NumPy seed. Hard time and memory thresholds guard against major regressions.
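The repository's actual rosenbrock_hess may differ in detail, but the following self-contained sketch shows what such a vectorisation can look like: the Rosenbrock Hessian is tridiagonal, so the per-index Python loop can be replaced by three diagonal fills using array indexing.

```python
import numpy as np

def rosenbrock_hess_loop(x):
    # Per-index construction of the Hessian of the chained Rosenbrock
    # function f(x) = sum_i 100*(x[i+1] - x[i]**2)**2 + (1 - x[i])**2.
    n = x.size
    H = np.zeros((n, n))
    for i in range(n - 1):
        H[i, i] += 1200 * x[i] ** 2 - 400 * x[i + 1] + 2
        H[i + 1, i + 1] += 200
        H[i, i + 1] = H[i + 1, i] = -400 * x[i]
    return H

def rosenbrock_hess_vec(x):
    # Vectorised version: fill the main diagonal and the two off-diagonals
    # with whole-array operations, avoiding the explicit Python loop.
    n = x.size
    H = np.zeros((n, n))
    diag = np.zeros(n)
    diag[:-1] += 1200 * x[:-1] ** 2 - 400 * x[1:] + 2
    diag[1:] += 200
    off = -400 * x[:-1]
    idx = np.arange(n)
    H[idx, idx] = diag
    H[idx[:-1], idx[1:]] = off
    H[idx[1:], idx[:-1]] = off
    return H

x = np.linspace(-2.0, 2.0, 50)
assert np.allclose(rosenbrock_hess_loop(x), rosenbrock_hess_vec(x))
```

Because the matrix is tridiagonal, a further step (not shown) would be to return a sparse or banded representation instead of a dense n-by-n array, which is where the remaining memory cost lives.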

Training with the PyTorch Optimiser

from psd_optimizer import PSDOptimizer

model = ...  # any torch.nn.Module
opt = PSDOptimizer(model.parameters(), lr=1e-3)

def closure():
    # x, y and criterion are assumed to be defined by the training loop
    opt.zero_grad()
    output = model(x)
    loss = criterion(output, y)
    loss.backward()
    return loss

opt.step(closure)

Example scripts using this API are available in the notebooks/ directory.
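The closure contract above is the same one torch.optim.LBFGS uses: step() may invoke the closure one or more times, and each call must re-evaluate the loss and recompute gradients from scratch. The toy optimizer below is a hypothetical stand-in (not the real PSDOptimizer, and framework-free) that illustrates just that contract on a one-parameter quadratic.

```python
class ToyClosureOptimizer:
    """Hypothetical stand-in illustrating the closure protocol;
    not the real PSDOptimizer."""

    def __init__(self, params, lr=0.1):
        self.params = params  # dict: name -> float parameter value
        self.grads = {}       # filled in by the closure on each call
        self.lr = lr

    def zero_grad(self):
        self.grads = {k: 0.0 for k in self.params}

    def step(self, closure):
        # The optimizer drives the closure: it asks for a fresh loss and
        # fresh gradients, then applies its update rule to them.
        loss = closure()
        for k in self.params:
            self.params[k] -= self.lr * self.grads[k]
        return loss

opt = ToyClosureOptimizer({"w": 5.0})

def closure():
    opt.zero_grad()
    w = opt.params["w"]
    loss = (w - 2.0) ** 2             # toy "model": minimise (w - 2)^2
    opt.grads["w"] = 2.0 * (w - 2.0)  # hand-computed gradient
    return loss

for _ in range(100):
    opt.step(closure)
```

Writing the training step as a closure is what lets an optimizer re-evaluate the loss after a trial update, which line-search and perturbation-based methods need; a plain `loss.backward(); opt.step()` loop cannot offer that.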

Training a Small Language Model

An illustrative example of fine-tuning a compact transformer with PSDOptimizer is provided in scripts/train_small_language_model.py. The script downloads a tiny GPT-style model from the Hugging Face Hub and optimises it on a short dummy corpus.

Run the example with default settings:

python scripts/train_small_language_model.py

Specify a different pretrained model and number of epochs:

python scripts/train_small_language_model.py --model distilgpt2 --epochs 5

Documentation

Full API documentation and guides are available in the docs/ directory. Additional materials include:

  • notebooks/10_minute_start.ipynb – an interactive notebook showcasing the optimiser.
  • docs/section_1_5_extension.md – theoretical notes on extending PSD to stochastic settings.
  • notebooks/navigation.ipynb – links to all example notebooks including advanced_usage.ipynb.

Testing

After installing the repository in editable mode, run the test suite to verify that everything works:

pytest

The current suite is small but helps prevent regressions.

Repository Structure

psd/              # Reference implementations and framework-specific optimisers
    algorithms.py # PSD and baseline algorithms
    functions.py  # Analytic test functions and registry
psd_optimizer/    # PyTorch optimiser package
experiments.py    # Synthetic data generation

Contributing

Contributions are welcome! Please open an issue or pull request on GitHub and see CONTRIBUTING.md for guidelines. By participating you agree to abide by the CODE_OF_CONDUCT.md.

Citation

If you use PSD in your research, please cite the following:

@misc{alpay2025escapingsaddlepointscurvaturecalibrated,
      title={Escaping Saddle Points via Curvature-Calibrated Perturbations: A Complete Analysis with Explicit Constants and Empirical Validation},
      author={Faruk Alpay and Hamdi Alakkad},
      year={2025},
      eprint={2508.16540},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2508.16540},
}

License

This project is released under the MIT License. See LICENSE for details.

Download files

Download the file for your platform.

Source Distribution

psd_optimizer-0.1.2.tar.gz (31.2 kB)

Uploaded Source

Built Distribution

psd_optimizer-0.1.2-py3-none-any.whl (28.8 kB)

Uploaded Python 3

File details

Details for the file psd_optimizer-0.1.2.tar.gz.

File metadata

  • Download URL: psd_optimizer-0.1.2.tar.gz
  • Upload date:
  • Size: 31.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for psd_optimizer-0.1.2.tar.gz
Algorithm     Hash digest
SHA256        1e3348c5dccc635253866e56fa231203a6f902f5e2b6f872628cf50a368b6799
MD5           f54e07b9c550bfbe4b4cdb9bc6ec9695
BLAKE2b-256   23f966c326269f9cfd3c19e8825859d147f10fccc5f8e780d5d324d79c9c6186

File details

Details for the file psd_optimizer-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: psd_optimizer-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 28.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for psd_optimizer-0.1.2-py3-none-any.whl
Algorithm     Hash digest
SHA256        f1ce255190db8a143bb77b968be12c3fbbeb600402cbd20525108338901b2578
MD5           3086e1740c86f12879799739c127a9c7
BLAKE2b-256   0ec83cfceadf213ce2e09585e3f02ee754a6a226d5c12cc041a1f6c81acf48dc
