
whsmooth

Whittaker-Henderson smoothing for P&C ratemaking and actuarial graduation --- 1D, 2D, automatic lambda selection.


What is whsmooth?

Whittaker-Henderson smoothing is a workhorse technique across actuarial practice: smoothing claim severity by exposure cell, graduating loss development factors, stabilizing rate relativities, and graduating mortality tables. In use since the 1920s, the method minimizes a combination of squared deviation from the data (fidelity) and roughness of the smoothed curve, with the tradeoff controlled by a single parameter lambda.

Despite its centrality, there has been no dedicated Python implementation. R offers MortalitySmooth and ungroup, both oriented toward life-actuarial work. Python practitioners --- especially in P&C --- are left writing ad-hoc scripts.

whsmooth fills this gap with:

  • P&C-first design --- built for severity, frequency, LDF, and rate relativity smoothing, with mortality graduation as a natural special case.
  • Sparse-matrix solver --- banded penalty system solved with scipy.sparse for O(n) performance.
  • Principled lambda selection --- REML, GCV, and AIC, so you stop guessing your smoothing parameter.
  • 1D and 2D --- smooth a vector by vehicle age, or an entire age × territory grid in one call.
  • Sklearn-style API --- fit(y).fitted_, no surprises.

Installation

pip install whsmooth

# With plotting helpers
pip install "whsmooth[plot]"

# With pandas integration
pip install "whsmooth[pandas]"

Quickstart --- Smoothing claim severity by vehicle age

import numpy as np
from whsmooth import WH1D

# Observed mean severity by vehicle age (0-25), with claim counts as weights
ages = np.arange(0, 26)
severity_obs = np.array([
    8050, 8120, 8210, 8290, 8390, 8520, 8680, 8850, 9050, 9270,
    9510, 9770, 10050, 10350, 10670, 11010, 11380, 11760, 12300, 12800,
    13100, 14200, 13400, 15100, 14000, 16500,
])
counts = np.array([
    5200, 4500, 3900, 3400, 2950, 2550, 2200, 1900, 1640, 1420,
    1230, 1060, 920, 790, 685, 590, 510, 440, 380, 330,
    285, 245, 210, 180, 155, 135,
])

# Fit with GCV-selected lambda; weights = exposure
wh = WH1D(lam='gcv', order=2).fit(severity_obs, weights=counts)

print(f"Selected lambda: {wh.lambda_:.2f}")
print(f"Effective degrees of freedom: {wh.edf_:.2f}")
smoothed = wh.fitted_

The high-credibility cells (low ages, large counts) are tracked closely; the sparse tail (older vehicles) is stabilized toward the underlying trend instead of chasing noise.

Quickstart --- 2D smoothing on age × territory

from whsmooth import WH2D

# Pure premium grid: 10 age bands × 8 territories
pure_premium = ...  # shape (10, 8)
exposure = ...      # shape (10, 8)

wh2 = WH2D(lam=('gcv', 'gcv'), order=(2, 2)).fit(pure_premium, weights=exposure)
pp_smooth = wh2.fitted_

Other use cases

  • LDF smoothing: stabilize loss development factors before extrapolation.
  • Frequency relativities: smooth claim frequency by exposure class.
  • Mortality graduation (life): graduate qx by age --- same algorithm, drop in your qx_crude and exposure.

# Mortality graduation example
wh = WH1D(lam='reml', order=2).fit(qx_crude, weights=exposure)
qx_grad = wh.fitted_

Features

  • WhittakerHenderson1D / WH1D --- 1D smoothing, any difference order, weighted
  • WhittakerHenderson2D / WH2D --- 2D smoothing on grids (Kronecker penalties)
  • lam='gcv' / 'reml' / 'aic' --- automatic lambda selection
  • whsmooth.diagnostics --- gcv_score, reml_score, aic_score, edf
  • whsmooth._penalties --- difference penalty matrices, composite penalties
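For intuition on what lam='gcv' optimizes: GCV(lambda) = n * RSS / (n - edf)^2, where edf is the trace of the hat matrix. The following is a standalone dense sketch (O(n^3), illustrative only --- not whsmooth's implementation; the function name gcv_score here is a local helper, not the library's):

```python
import numpy as np

def gcv_score(y, weights, lam, order=2):
    """GCV(lam) = n * RSS / (n - edf)^2; returns (score, edf)."""
    n = len(y)
    # d-th order difference matrix, shape (n - order, n)
    D = np.diff(np.eye(n), n=order, axis=0)
    W = np.diag(np.asarray(weights, dtype=float))
    # Hat matrix: fitted = H @ y, where H = (W + lam * D'D)^{-1} W
    H = np.linalg.solve(W + lam * (D.T @ D), W)
    fitted = H @ y
    edf = np.trace(H)  # effective degrees of freedom
    rss = float(np.sum(weights * (y - fitted) ** 2))
    return n * rss / (n - edf) ** 2, edf
```

Selecting lambda then amounts to minimizing this score over a log-spaced grid (or by 1D optimization). As lambda grows, edf falls from n toward the difference order.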

How it works

The Whittaker-Henderson smoother solves:

min_a   sum_i w_i (y_i - a_i)^2  +  lambda * ||D_d a||^2

where D_d is the d-th order difference matrix and lambda controls the smoothness-fidelity tradeoff. The system has a banded coefficient matrix and is solved via sparse Cholesky factorization --- O(n) instead of O(n^3).
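The mechanics can be sketched in a few lines of standalone NumPy/SciPy (illustrative, not whsmooth's internal code; wh_smooth_1d is a local name):

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def wh_smooth_1d(y, weights, lam=10.0, order=2):
    """Solve the normal equations (W + lam * D'D) a = W y for the smoothed a."""
    n = len(y)
    # d-th order difference matrix D, shape (n - order, n)
    D = sparse.csc_matrix(np.diff(np.eye(n), n=order, axis=0))
    W = sparse.diags(np.asarray(weights, dtype=float))
    A = (W + lam * (D.T @ D)).tocsc()
    return spsolve(A, W @ np.asarray(y, dtype=float))
```

With lam near zero the fit reproduces the data; with order=2 and large lam it approaches a straight line, the null space of D.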

For 2D problems, the penalty extends to Kronecker products of row and column difference matrices, following Currie, Durban & Eilers (2004).
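A standalone sketch of the 2D extension (again illustrative, not the library's internals): with the grid raveled in C order, an operator acting along rows becomes kron(A, I_c) and along columns kron(I_r, B).

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def wh_smooth_2d(Y, weights, lam_row=10.0, lam_col=10.0, order=2):
    """2D Whittaker-Henderson via Kronecker-product penalties."""
    nr, nc = Y.shape
    Dr = sparse.csc_matrix(np.diff(np.eye(nr), n=order, axis=0))
    Dc = sparse.csc_matrix(np.diff(np.eye(nc), n=order, axis=0))
    # Penalize roughness along rows (axis 0) and columns (axis 1) separately
    P = (lam_row * sparse.kron(Dr.T @ Dr, sparse.eye(nc))
         + lam_col * sparse.kron(sparse.eye(nr), Dc.T @ Dc))
    W = sparse.diags(weights.ravel().astype(float))
    a = spsolve((W + P).tocsc(), W @ Y.ravel())
    return a.reshape(nr, nc)
```

Note that with order=2 any plane lies in the penalty null space, so planar surfaces pass through untouched no matter how large the lambdas are.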

References

P&C / general

  • Werner, G. & Modlin, C. (2016). Basic Ratemaking (5th ed.). Casualty Actuarial Society.
  • Verrall, R. J. (1996). "Claims Reserving and Generalised Additive Models." Insurance: Mathematics and Economics, 19(1), 31-43.
  • Eilers, P. H. C. & Marx, B. D. (1996). "Flexible Smoothing with B-splines and Penalties." Statistical Science, 11(2), 89-121.

Original

  • Whittaker, E. T. (1922). "On a New Method of Graduation." Proc. Edinburgh Math. Soc., 41, 63-75.
  • Henderson, R. (1924). "A New Method of Graduation." Trans. Actuarial Society of America, 25, 29-40.

Life / mortality

  • Currie, I. D., Durban, M. & Eilers, P. H. C. (2004). "Smoothing and Forecasting Mortality Rates." Statistical Modelling, 4(4), 279-298.
  • Camarda, C. G. (2012). "MortalitySmooth: An R Package for Smoothing Poisson Counts with P-Splines." J. Statistical Software, 50(1), 1-24.

Part of the actuarial Python ecosystem

whsmooth is part of a 6-library P&C actuarial stack:

  • actudist --- severity & frequency distributions
  • burncost --- burning cost analysis
  • actuarcredibility --- credibility methods
  • whsmooth --- smoothing (this library)
  • pyratemaking --- rating tables (coming)
  • pyinsurancerating --- rating engine (coming)

Contributing

See CONTRIBUTING.md. PRs welcome --- open an issue first to discuss substantive changes.

Citation

If you use whsmooth in academic work, see CITATION.cff or:

López, I. (2026). whsmooth: Whittaker-Henderson smoothing for P&C ratemaking in Python (v0.1.0). https://github.com/CosmikArt/whsmooth

Author

Isaac López

License

MIT

Download files


Source Distribution

whsmooth-0.1.0.tar.gz (15.3 kB)

Built Distribution

whsmooth-0.1.0-py3-none-any.whl (14.5 kB)

File details

Details for the file whsmooth-0.1.0.tar.gz.

File metadata

  • Download URL: whsmooth-0.1.0.tar.gz
  • Size: 15.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.9

File hashes

Hashes for whsmooth-0.1.0.tar.gz

  • SHA256: a6e68aa2a2d97213a88af1dcd07d4135d41bc61b115a3e5a7fbcd000581702d1
  • MD5: 4ee03a4757579f6143f03938340dbabb
  • BLAKE2b-256: 9522868b2dce41b21b04462d748c075aca74dc53c216d54b3c0ece7bc7526c20

File details

Details for the file whsmooth-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: whsmooth-0.1.0-py3-none-any.whl
  • Size: 14.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.9

File hashes

Hashes for whsmooth-0.1.0-py3-none-any.whl

  • SHA256: c027a8f5413dbac97680c2d895894aa88f0553bdea07992135d0d117066bb7fb
  • MD5: e624020983fad3b19a9a7597854920b7
  • BLAKE2b-256: 59ece89f837d3e193bb04864164575cbff431a7e14ab9e9894e5f0ade4977c25
