
High-performance LOWESS smoothing for Python.


LOWESS Project


One LOWESS to Rule Them All

The fastest, most robust, and most feature-complete LOWESS (Locally Weighted Scatterplot Smoothing) implementation, available for Rust, Python, R, Julia, JavaScript, C++, and WebAssembly.

[!IMPORTANT]

The lowess project provides a complete ecosystem for LOWESS smoothing: lowess/fastLowess (Rust), fastlowess (Python), rfastlowess (R), libfastlowess (C++), and bindings for Julia, Node.js, and WebAssembly.

Installation

[!NOTE]

Currently available for R, Python, Rust, Julia, Node.js, WebAssembly, and C++. See INSTALLATION.md for detailed installation instructions.

Documentation

[!NOTE]

📚 View the full documentation


LOESS vs. LOWESS

| Feature | LOESS | LOWESS (this project) |
|---|---|---|
| Polynomial degree | Linear, quadratic, cubic, or quartic | Linear (degree 1) |
| Dimensions | Multivariate (n-D support) | Univariate (1-D only) |
| Flexibility | High (configurable distance metrics) | Standard |
| Complexity | Higher (matrix inversion) | Lower (weighted average/slope) |

[!TIP]

For a LOESS implementation, use the loess-project.
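To make the LOWESS column above concrete (one tricube-weighted, degree-1 fit per point on univariate data), here is a deliberately naive NumPy sketch of the core algorithm. All names are illustrative; it omits the delta optimization, robustness iterations, and boundary handling that this package implements.

```python
import numpy as np

def lowess_naive(x, y, fraction=0.5):
    """Unoptimized LOWESS: one tricube-weighted linear fit per point."""
    n = len(x)
    k = max(2, int(np.ceil(fraction * n)))   # neighborhood size
    smoothed = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]              # k nearest neighbors of x[i]
        h = d[idx].max()                     # local bandwidth
        h = h if h > 0 else 1.0
        w = (1.0 - (d[idx] / h) ** 3) ** 3   # tricube weights in [0, 1]
        # Closed-form weighted least-squares line through the neighborhood
        sw = w.sum()
        sx = (w * x[idx]).sum()
        sy = (w * y[idx]).sum()
        sxx = (w * x[idx] ** 2).sum()
        sxy = (w * x[idx] * y[idx]).sum()
        denom = sw * sxx - sx * sx
        if abs(denom) < 1e-12 * max(sxx, 1.0):
            smoothed[i] = sy / sw            # degenerate: weighted mean
        else:
            slope = (sw * sxy - sx * sy) / denom
            intercept = (sy - slope * sx) / sw
            smoothed[i] = intercept + slope * x[i]
    return smoothed
```

On exactly linear data each local fit reproduces the line, so this sketch returns the input unchanged; its value is checking behavior, not speed.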


Why this package?

Speed

The lowess project outperforms existing implementations in both single-threaded and multi-threaded (parallel) execution: on average it is 200-327x faster than Python's statsmodels.lowess and 2-3x faster than R's lowess.

For more details on the performance comparison, see the BENCHMARKS file.

Robustness

This implementation is more robust than R's lowess and Python's statsmodels due to two key design choices:

MAD-Based Scale Estimation:

For robustness weight calculations, this crate uses Median Absolute Deviation (MAD) for scale estimation:

s = median(|r_i - median(r)|)

In contrast, statsmodels and R's lowess use the median of absolute residuals (MAR):

s = median(|r_i|)
  • MAD is a breakdown-point-optimal estimator—it remains valid even when up to 50% of data are outliers.
  • The median-centering step removes asymmetric bias from residual distributions.
  • MAD provides consistent outlier detection regardless of whether residuals are centered around zero.
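The practical difference between the two scale estimates is easy to demonstrate. A minimal NumPy sketch, with made-up residual values chosen to show the effect of an off-center residual distribution:

```python
import numpy as np

# Residuals whose distribution is not centered at zero (asymmetric bias)
r = np.array([0.9, 1.0, 1.1, 1.2, 1.3])

# MAR (statsmodels, R's lowess): the offset inflates the "scale"
mar = np.median(np.abs(r))                 # 1.1

# MAD (this package): median-centering removes the offset first
mad = np.median(np.abs(r - np.median(r)))  # 0.1
```

Here MAR reports a scale eleven times larger than MAD, purely because the residuals are shifted away from zero; the spread itself is only 0.1.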

Boundary Padding:

This crate supports several configurable boundary policies at dataset edges:

  • Extend: Repeats edge values to maintain local neighborhood size.
  • Reflect: Mirrors data symmetrically around boundaries.
  • Zero: Pads with zeros (useful for signal processing).
  • NoBoundary: Original Cleveland behavior (no padding).

statsmodels and R's lowess do not apply boundary padding, which can lead to:

  • Biased estimates near boundaries due to asymmetric local neighborhoods.
  • Increased variance at the edges of the smoothed curve.
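The first three policies above correspond to standard array-padding modes. A small NumPy illustration (the pad width here is an arbitrary choice for display, not this package's internal rule):

```python
import numpy as np

y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
pad = 2  # hypothetical number of padded points per side

extend  = np.pad(y, pad, mode="edge")      # [1 1 1 2 3 4 5 5 5]
reflect = np.pad(y, pad, mode="reflect")   # [3 2 1 2 3 4 5 4 3]
zeros   = np.pad(y, pad, mode="constant")  # [0 0 1 2 3 4 5 0 0]
```

With padding, the local neighborhood around an edge point stays symmetric, which is exactly what removes the boundary bias described above.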

Features

This package supports a wider range of features than statsmodels and R's stats, covering a broader set of use cases:

| Feature | This package | statsmodels | R (stats) |
|---|---|---|---|
| Kernel | 7 options | Tricube only | Tricube only |
| Robustness weighting | 3 options | Huber only | Huber only |
| Scale estimation | 2 options | MAR only | MAR only |
| Boundary padding | 4 options | No padding | No padding |
| Zero-weight fallback | 3 options | No | No |
| Auto convergence | Yes | No | No |
| Online mode | Yes | No | No |
| Streaming mode | Yes | No | No |
| Confidence intervals | Yes | No | No |
| Prediction intervals | Yes | No | No |
| Cross-validation | 2 options | No | No |
| Parallel execution | Yes | No | No |
| GPU acceleration | Yes* | No | No |
| no-std support | Yes | No | No |

* GPU acceleration is currently in beta and may not be available on all platforms.
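For reference, the tricube kernel and bisquare robustness weighting that appear as defaults in the API sections below can be written down directly. This is a generic sketch of Cleveland's standard formulas, not this package's internal code:

```python
import numpy as np

def tricube(u):
    """Tricube kernel: w(u) = (1 - |u|^3)^3 for |u| < 1, else 0."""
    u = np.abs(np.asarray(u, dtype=float))
    return np.where(u < 1, (1 - u**3) ** 3, 0.0)

def bisquare(residuals, scale):
    """Bisquare robustness weights: (1 - (r/(6s))^2)^2, zero outside |r| < 6s."""
    u = np.asarray(residuals, dtype=float) / (6.0 * scale)
    return np.where(np.abs(u) < 1, (1 - u**2) ** 2, 0.0)
```

Both functions map distance (or residual size) smoothly to a weight in [0, 1] and cut off entirely at the edge of their support, which is what makes large outliers drop out of the fit.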

Validation

All implementations are validated to be numerically equivalent to R's lowess:

| Aspect | Status | Details |
|---|---|---|
| Accuracy | ✅ Exact match | Max diff < 1e-12 across all scenarios |
| Consistency | ✅ Perfect | Multiple scenarios pass with strict tolerance |
| Robustness | ✅ Verified | Robust smoothing matches R exactly |

API Reference

R:

```r
result <- Lowess(
    fraction = 0.5,
    iterations = 3L,
    delta = 0.01,
    weight_function = "tricube",
    robustness_method = "bisquare",
    zero_weight_fallback = "use_local_mean",
    boundary_policy = "extend",
    confidence_intervals = 0.95,
    prediction_intervals = 0.95,
    return_diagnostics = TRUE,
    return_residuals = TRUE,
    return_robustness_weights = TRUE,
    cv_fractions = c(0.3, 0.5, 0.7),
    cv_method = "kfold",
    cv_k = 5L,
    auto_converge = 1e-4,
    parallel = TRUE
)$fit(x, y)

# Result fields:
# result$x, result$y, result$standard_errors,
# result$confidence_lower, result$confidence_upper,
# result$prediction_lower, result$prediction_upper,
# result$residuals, result$robustness_weights, result$diagnostics,
# result$iterations_used, result$fraction_used, result$cv_scores
```

Python:

```python
from fastlowess import Lowess

model = Lowess(
    fraction=0.5,
    iterations=3,
    delta=0.01,
    weight_function="tricube",
    robustness_method="bisquare",
    zero_weight_fallback="use_local_mean",
    boundary_policy="extend",
    confidence_intervals=0.95,
    prediction_intervals=0.95,
    return_diagnostics=True,
    return_residuals=True,
    return_robustness_weights=True,
    cv_fractions=[0.3, 0.5, 0.7],
    cv_method="kfold",
    cv_k=5,
    auto_converge=1e-4,
    parallel=True
)
result = model.fit(x, y)

# Result fields:
# result.x, result.y, result.standard_errors,
# result.confidence_lower, result.confidence_upper,
# result.prediction_lower, result.prediction_upper,
# result.residuals, result.robustness_weights, result.diagnostics,
# result.iterations_used, result.fraction_used, result.cv_scores
```

Rust:

```rust
let model = Lowess::new()
    .fraction(0.5)
    .iterations(3)
    .delta(0.01)
    .weight_function(Tricube)
    .robustness_method(Bisquare)
    .zero_weight_fallback(UseLocalMean)
    .boundary_policy(Extend)
    .confidence_intervals(0.95)
    .prediction_intervals(0.95)
    .return_diagnostics()
    .return_residuals()
    .return_robustness_weights()
    .cross_validate(KFold(5, &[0.3, 0.5, 0.7]).seed(123))
    .auto_converge(1e-4)
    .adapter(Batch)
    .parallel(true)             // fastLowess only
    .backend(CPU)               // fastLowess only: CPU or GPU
    .build()?;

let result = model.fit(x, y);

// Result structure:
pub struct LowessResult<T> {
    pub x: Vec<T>,                           // Sorted x values
    pub y: Vec<T>,                           // Smoothed y values
    pub standard_errors: Option<Vec<T>>,
    pub confidence_lower: Option<Vec<T>>,
    pub confidence_upper: Option<Vec<T>>,
    pub prediction_lower: Option<Vec<T>>,
    pub prediction_upper: Option<Vec<T>>,
    pub residuals: Option<Vec<T>>,
    pub robustness_weights: Option<Vec<T>>,
    pub diagnostics: Option<Diagnostics<T>>,
    pub iterations_used: Option<usize>,
    pub fraction_used: T,
    pub cv_scores: Option<Vec<T>>,
}
```

Julia:

```julia
model = Lowess(;
    fraction=0.5,
    iterations=3,
    delta=NaN,  # NaN for auto
    weight_function="tricube",
    robustness_method="bisquare",
    zero_weight_fallback="use_local_mean",
    boundary_policy="extend",
    confidence_intervals=NaN,
    prediction_intervals=NaN,
    return_diagnostics=true,
    return_residuals=true,
    return_robustness_weights=true,
    cv_fractions=Float64[],  # e.g. [0.3, 0.5]
    cv_method="kfold",
    cv_k=5,
    auto_converge=NaN,
    parallel=true
)

# Result fields:
# result.x, result.y, result.standard_errors,
# result.confidence_lower, result.confidence_upper,
# result.prediction_lower, result.prediction_upper,
# result.residuals, result.robustness_weights, result.diagnostics,
# result.iterations_used, result.fraction_used, result.cv_scores
```

Node.js:

```javascript
const result = new Lowess({
    fraction: 0.5,
    iterations: 3,
    delta: 0.01,
    weightFunction: "tricube",
    robustnessMethod: "bisquare",
    zeroWeightFallback: "use_local_mean",
    boundaryPolicy: "extend",
    confidenceIntervals: 0.95,
    predictionIntervals: 0.95,
    returnDiagnostics: true,
    returnResiduals: true,
    returnRobustnessWeights: true,
    cvFractions: [0.3, 0.5, 0.7],
    cvMethod: "kfold",
    cvK: 5,
    autoConverge: 1e-4,
    parallel: true
}).fit(x, y);

// Result fields:
// result.x, result.y, result.standardErrors,
// result.confidenceLower, result.confidenceUpper,
// result.predictionLower, result.predictionUpper,
// result.residuals, result.robustnessWeights, result.diagnostics,
// result.iterationsUsed, result.fractionUsed, result.cvScores
```

WebAssembly:

```javascript
const result = smooth(x, y, {
    fraction: 0.5,
    iterations: 3,
    delta: 0.01,
    weightFunction: "tricube",
    robustnessMethod: "bisquare",
    zeroWeightFallback: "use_local_mean",
    boundaryPolicy: "extend",
    confidenceIntervals: 0.95,
    predictionIntervals: 0.95,
    returnDiagnostics: true,
    returnResiduals: true,
    returnRobustnessWeights: true,
    cvFractions: [0.3, 0.5, 0.7],
    cvMethod: "kfold",
    cvK: 5,
    autoConverge: 1e-4,
    parallel: true
});

// Result fields:
// result.x, result.y, result.standardErrors,
// result.confidenceLower, result.confidenceUpper,
// result.predictionLower, result.predictionUpper,
// result.residuals, result.robustnessWeights, result.diagnostics,
// result.iterationsUsed, result.fractionUsed, result.cvScores
```

C++:

```cpp
fastlowess::LowessOptions options;
options.fraction = 0.5;
options.iterations = 3;
options.delta = 0.01;
options.weight_function = "tricube";
options.robustness_method = "bisquare";
options.zero_weight_fallback = "use_local_mean";
options.boundary_policy = "extend";
options.confidence_intervals = 0.95;
options.prediction_intervals = 0.95;
options.return_diagnostics = true;
options.return_residuals = true;
options.return_robustness_weights = true;
options.cv_fractions = {0.3, 0.5, 0.7};
options.cv_method = "kfold";
options.cv_k = 5;
options.auto_converge = 1e-4;
options.parallel = true;

fastlowess::Lowess model(options);
auto result = model.fit(x, y);

// Result accessors:
// result.xVector(), result.yVector(), result.standardErrors(),
// result.confidenceLower(), result.confidenceUpper(),
// result.predictionLower(), result.predictionUpper(),
// result.residuals(), result.robustnessWeights(), result.diagnostics(),
// result.iterationsUsed(), result.fractionUsed(), result.cvScores()
```

Contributing

Contributions are welcome! Please see the CONTRIBUTING.md file for more information.

License

Licensed under MIT or Apache-2.0.

Citation

If you use this software in your research, please cite it using the CITATION.cff file or the BibTeX entry below:

```bibtex
@software{lowess_project,
  author = {Valizadeh, Amir},
  title = {LOWESS Project: High-Performance Locally Weighted Scatterplot Smoothing},
  year = {2026},
  url = {https://github.com/thisisamirv/lowess-project},
  license = {MIT OR Apache-2.0}
}
```

Download files

Source distribution:

  • fastlowess-1.3.0.tar.gz (39.3 kB)

Built distributions:

  • fastlowess-1.3.0-cp38-abi3-win_amd64.whl (330.2 kB): CPython 3.8+, Windows x86-64
  • fastlowess-1.3.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (457.0 kB): CPython 3.8+, glibc 2.17+, x86-64
  • fastlowess-1.3.0-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (441.7 kB): CPython 3.8+, glibc 2.17+, ARM64
  • fastlowess-1.3.0-cp38-abi3-macosx_11_0_arm64.whl (406.1 kB): CPython 3.8+, macOS 11.0+, ARM64
  • fastlowess-1.3.0-cp38-abi3-macosx_10_12_x86_64.whl (428.6 kB): CPython 3.8+, macOS 10.12+, x86-64

File hashes

All 1.3.0 files were uploaded via Trusted Publishing (twine/6.1.0, CPython/3.13.12), with attestation bundles published by release-pypi.yml on thisisamirv/lowess-project. SHA256 digests:

| File | SHA256 |
|---|---|
| fastlowess-1.3.0.tar.gz | 8e4406ddb491a83e970fa7b4b2181d332738fb2bdad06f41413cc1d891478746 |
| fastlowess-1.3.0-cp38-abi3-win_amd64.whl | 100a8912b634b887e575298649e14e1628f27e75cd1ee9d7e0c572c99534b7bc |
| fastlowess-1.3.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl | 4f231506ecba5189ec55102a3c02a93f70bece156cfc37539f20f697e6ffe820 |
| fastlowess-1.3.0-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl | 5b106f8b68d537b9d1885af77ad18934c5d5facaf4bb20be67e65f85b1cd9efe |
| fastlowess-1.3.0-cp38-abi3-macosx_11_0_arm64.whl | 784285ccb3f2d39551beab1a0038718aa92cf6e33b838c4464db0d1cea0ac170 |
| fastlowess-1.3.0-cp38-abi3-macosx_10_12_x86_64.whl | efdbdd714025180c456e5def90e0d9e516e732394a8af1a976a67a3afd14e28f |
