
Specular Differentiation


The Python package specular implements specular differentiation, which generalizes classical differentiation. The implementation strictly follows the definitions, notation, and results in [1] and [2].

A specular derivative (the red line) can be understood through inclination angles: its inclination angle is the average of the inclination angles of the right and left tangent lines. In contrast, a symmetric derivative (the purple line) is the average of the right and left derivatives themselves. The difference is illustrated in the following figure.

specular-derivative-animation


Installation

Requirements

specular-differentiation requires:

  • Python >= 3.11
  • ipython >= 8.12.3
  • matplotlib >= 3.10.8
  • numpy >= 2.4.0
  • pandas >= 2.3.3
  • tqdm >= 4.67.1

User installation

Standard Installation (NumPy backend)

pip install specular-differentiation

Advanced Installation (JAX backend)

pip install "specular-differentiation[jax]"

See the documentation for advanced installation (JAX backend, Pytest).

Quick start

The following simple example computes the specular derivative of the ReLU function $f(x) = \max(0, x)$ at the origin.

import specular

ReLU = lambda x: max(x, 0)
print(specular.derivative(ReLU, x=0))
# 0.41421356237309515
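The value returned above is $\tan(\pi/8) = \sqrt{2} - 1$: the one-sided derivatives of ReLU at the origin are 0 and 1, their inclination angles are 0 and $\pi/4$, and the specular derivative has the average inclination angle $\pi/8$. A quick check using only the standard library, independent of the package:

```python
import math

# One-sided derivatives of ReLU at the origin and their inclination angles
left, right = 0.0, 1.0
theta = (math.atan(left) + math.atan(right)) / 2  # average angle = pi/8

value = math.tan(theta)
print(value)  # ~0.4142, i.e. sqrt(2) - 1
```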

Applications

Specular differentiation is defined in normed vector spaces, allowing for applications in higher-dimensional Euclidean spaces. The specular package includes the following applications.

Ordinary differential equation

  • Directory: examples/ode/
  • References: [1], [3], [4]

In [1], seven schemes are proposed for solving ODEs numerically:

  • the specular Euler schemes of Types 1–6
  • the specular trigonometric scheme

The following example shows that the specular Euler schemes of Types 5 and 6 yield more accurate numerical solutions than the classical schemes: the explicit Euler, implicit Euler, and Crank-Nicolson schemes.
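For orientation, the classical explicit Euler baseline against which the specular schemes are compared can be sketched as follows. This is a generic textbook implementation, not code from the package, and the test problem $y' = -y$, $y(0) = 1$ is chosen only for illustration:

```python
import math

def explicit_euler(f, y0, t0, t1, n):
    """Classical explicit Euler: y_{k+1} = y_k + h * f(t_k, y_k)."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

# Test problem y' = -y with exact solution y(1) = exp(-1)
approx = explicit_euler(lambda t, y: -y, 1.0, 0.0, 1.0, 1000)
print(approx, math.exp(-1.0))  # first-order accuracy: error is O(h)
```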

ODE-example

Optimization

  • Directory: examples/optimization/
  • References: [2], [5]

In [2], three methods are proposed for optimizing nonsmooth convex objective functions:

  • the specular gradient (SPEG) method
  • the stochastic specular gradient (S-SPEG) method
  • the hybrid specular gradient (H-SPEG) method

The following example compares the three proposed methods with the classical methods: gradient descent (GD), Adaptive Moment Estimation (Adam), and Broyden-Fletcher-Goldfarb-Shanno (BFGS).
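As a toy one-dimensional illustration of the underlying idea (not the package's SPEG implementation, which is defined in normed vector spaces), a descent step can use a numerical specular derivative, i.e. the tangent of the mean inclination angle of the one-sided difference quotients. The objective $f(x) = |x|$ and the diminishing step sizes below are illustrative assumptions borrowed from the classical subgradient method:

```python
import math

def specular_derivative_1d(f, x, h=1e-7):
    """Numerical specular derivative: tangent of the mean inclination
    angle of the right and left difference quotients."""
    right = (f(x + h) - f(x)) / h
    left = (f(x) - f(x - h)) / h
    return math.tan((math.atan(left) + math.atan(right)) / 2)

# Descent on the nonsmooth convex f(x) = |x| with diminishing steps
f = abs
x = 3.0
for k in range(1, 51):
    x -= (1.0 / k) * specular_derivative_1d(f, x)
print(x)  # ends near the minimizer x = 0
```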

optimization-example

Documentation

Getting Started

API Reference

Examples

LaTeX macro

To use the specular differentiation symbol in your LaTeX document, add the following code to your preamble (before \begin{document}):

% Required packages
\usepackage{graphicx}
\usepackage{bm}

% Definition of Specular Differentiation symbol
\newcommand\sd[1][.5]{\mathbin{\vcenter{\hbox{\scalebox{#1}{\,$\bm{\wedge}$}}}}}

Usage examples

Use the symbol in your document (after \begin{document}):

% A specular derivative in the one-dimensional Euclidean space
$f^{\sd}(x)$

% A specular directional derivative in normed vector spaces
$\partial^{\sd}_v f(x)$
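Putting the preamble macro and the usage examples together, a minimal compilable document looks like this (the sentence content is a placeholder):

```latex
\documentclass{article}
\usepackage{graphicx}
\usepackage{bm}

% Definition of the specular differentiation symbol
\newcommand\sd[1][.5]{\mathbin{\vcenter{\hbox{\scalebox{#1}{\,$\bm{\wedge}$}}}}}

\begin{document}
A specular derivative $f^{\sd}(x)$ and a specular directional
derivative $\partial^{\sd}_v f(x)$.
\end{document}
```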

Citing specular-differentiation

To cite this repository:

@software{Jung_specular-differentiation_2026,
  author = {Jung, Kiyuob},
  doi = {10.5281/zenodo.18246734},
  license = {MIT},
  month = jan,
  title = {{specular-differentiation}},
  url = {https://github.com/kyjung2357/specular-differentiation},
  version = {1.0.0},
  year = {2026},
}

References

[1] K. Jung. Nonlinear numerical schemes using specular differentiation for initial value problems of first-order ordinary differential equations. arXiv preprint arXiv:2601.09900, 2026.

[2] K. Jung. Specular differentiation in normed vector spaces and its applications to nonsmooth convex optimization. arXiv preprint arXiv:2601.10950, 2026.

[3] K. Jung and J. Oh. The specular derivative. arXiv preprint arXiv:2210.06062, 2022.

[4] K. Jung and J. Oh. The wave equation with specular derivatives. arXiv preprint arXiv:2210.06933, 2022.

[5] K. Jung and J. Oh. Nonsmooth convex optimization using the specular gradient method with root-linear convergence. arXiv preprint arXiv:2210.06933, 2024.
