
High-performance LOWESS smoothing for Python.


LOWESS Project


One LOWESS to Rule Them All

The fastest, most robust, and most feature-complete language-agnostic LOWESS (Locally Weighted Scatterplot Smoothing) implementation for Rust, Python, R, Julia, JavaScript, C++, and WebAssembly.

[!IMPORTANT]

The lowess-project provides a complete ecosystem for LOWESS smoothing.

LOESS vs. LOWESS

| Feature | LOWESS (This Crate) | LOESS |
|---|---|---|
| Polynomial Degree | Linear (Degree 1) | Linear, Quadratic, Cubic, Quartic |
| Dimensions | Univariate (1-D only) | Multivariate (n-D support) |
| Flexibility | Standard | High (Distance metrics) |
| Complexity | Lower (Weighted average/slope) | Higher (Matrix inversion) |

[!TIP] For a LOESS implementation, use loess-project.
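For intuition about what every binding in this project computes, here is a minimal, didactic LOWESS pass in pure Python: one tricube-weighted linear fit per point. This is a sketch of the textbook algorithm (assuming distinct x values), not this project's optimized implementation; it omits robustness iterations, delta interpolation, and boundary handling:

```python
import math

def tricube(u):
    """Tricube kernel: smooth weight that vanishes for |u| >= 1."""
    u = abs(u)
    return (1.0 - u ** 3) ** 3 if u < 1.0 else 0.0

def lowess_sketch(x, y, fraction=0.5):
    """One LOWESS pass: a weighted linear fit in a moving window around each point."""
    n = len(x)
    k = max(2, math.ceil(fraction * n))  # number of neighbours in each window
    fitted = []
    for xi in x:
        # Bandwidth = distance to the k-th nearest neighbour (x values assumed distinct)
        h = sorted(abs(xj - xi) for xj in x)[k - 1]
        w = [tricube((xj - xi) / h) for xj in x]
        sw = sum(w)
        mx = sum(wi * xj for wi, xj in zip(w, x)) / sw
        my = sum(wi * yj for wi, yj in zip(w, y)) / sw
        sxx = sum(wi * (xj - mx) ** 2 for wi, xj in zip(w, x))
        sxy = sum(wi * (xj - mx) * (yj - my) for wi, xj, yj in zip(w, x, y))
        slope = sxy / sxx if sxx > 1e-12 else 0.0
        fitted.append(my + slope * (xi - mx))
    return fitted

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
print(lowess_sketch(x, [2.0 * v for v in x], fraction=0.5))
```

Because each local fit is a weighted least-squares line, exactly linear input is reproduced (up to floating-point error), which is a handy sanity check.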


Documentation

[!NOTE]

📚 View the full documentation

Why this package?

Speed

The lowess project is substantially faster than existing implementations, whether running single-threaded or in multi-threaded parallel execution.

Speedup relative to Python's statsmodels.lowess (higher is better):

| Category | statsmodels (baseline) | R (stats) | Serial | Parallel | GPU |
|---|---|---|---|---|---|
| Clustered | 163ms | 83× | 203× | 433× | 32× |
| Constant Y | 134ms | 92× | 212× | 410× | 18× |
| Delta (large–none) | 105ms | 16× | | | |
| Extreme Outliers | 489ms | 106× | 201× | 388× | 29× |
| Financial (500–10K) | 106ms | 105× | 252× | 293× | 12× |
| Fraction (0.05–0.67) | 221ms | 104× | 228× | 391× | 22× |
| Genomic (1K–50K) | 1833ms | 20× | 95× | | |
| High Noise | 435ms | 133× | 134× | 375× | 32× |
| Iterations (0–10) | 204ms | 115× | 224× | 386× | 18× |
| Scale (1K–50K) | 1841ms | 264× | 487× | 581× | 98× |
| Scientific (500–10K) | 167ms | 109× | 205× | 314× | 15× |
| Scale Large* (100K–2M) | | | | 1.4× | 0.3× |

*Scale Large benchmarks are relative to Serial (statsmodels cannot handle these sizes)

The numbers are averages across a range of scenarios within each category (e.g., Delta from none to small, medium, and large).

Robustness

This implementation is more robust than R's lowess and Python's statsmodels due to two key design choices:

MAD-Based Scale Estimation:

For robustness weight calculations, this crate uses Median Absolute Deviation (MAD) for scale estimation:

s = median(|r_i - median(r)|)

In contrast, statsmodels and R's lowess use the median of absolute residuals (MAR):

s = median(|r_i|)
  • MAD is a breakdown-point-optimal estimator: it remains valid even when up to 50% of the data are outliers.
  • The median-centering step removes asymmetric bias from residual distributions.
  • MAD provides consistent outlier detection regardless of whether residuals are centered around zero.
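The practical difference is easy to see on a toy residual vector whose distribution is shifted away from zero. This is an illustrative sketch using Python's standard library, not the crate's internal code:

```python
import statistics

def mar_scale(residuals):
    # statsmodels/R-style scale: median of absolute residuals
    return statistics.median(abs(r) for r in residuals)

def mad_scale(residuals):
    # MAD: median absolute deviation about the median of the residuals
    med = statistics.median(residuals)
    return statistics.median(abs(r - med) for r in residuals)

# Residuals with an asymmetric bias: all offset by roughly +5
residuals = [4.9, 5.0, 5.1, 5.2, 4.8]
print(mar_scale(residuals))  # 5.0 -- inflated by the offset, not a measure of spread
print(mad_scale(residuals))  # about 0.1 -- measures spread regardless of centering
```

With MAR, the robustness weights would treat every point as a gross outlier; the median-centering step in MAD avoids that.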

Boundary Padding:

This crate applies a range of different boundary policies at dataset edges:

  • Extend: Repeats edge values to maintain local neighborhood size.
  • Reflect: Mirrors data symmetrically around boundaries.
  • Zero: Pads with zeros (useful for signal processing).
  • NoBoundary: Original Cleveland behavior (no padding).

statsmodels and R's lowess do not apply boundary padding, which can lead to:

  • Biased estimates near boundaries due to asymmetric local neighborhoods.
  • Increased variance at the edges of the smoothed curve.
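To make the policies concrete, here is an illustrative sketch of what each one does to the ends of a series. The `pad` helper is hypothetical and only demonstrates the idea, not this crate's API:

```python
def pad(values, k, policy="extend"):
    """Pad k points on each side of a series (illustrative only; assumes k < len(values))."""
    if policy == "extend":        # repeat edge values
        left, right = [values[0]] * k, [values[-1]] * k
    elif policy == "reflect":     # mirror the series around each boundary point
        left = [values[i] for i in range(k, 0, -1)]
        right = [values[-1 - i] for i in range(1, k + 1)]
    elif policy == "zero":        # zero padding, as in signal processing
        left, right = [0.0] * k, [0.0] * k
    else:                         # "none": original Cleveland behavior
        left, right = [], []
    return left + values + right

print(pad([1.0, 2.0, 3.0, 4.0], 2, "extend"))   # [1.0, 1.0, 1.0, 2.0, 3.0, 4.0, 4.0, 4.0]
print(pad([1.0, 2.0, 3.0, 4.0], 2, "reflect"))  # [3.0, 2.0, 1.0, 2.0, 3.0, 4.0, 3.0, 2.0]
```

The padded points give boundary observations a full-sized, more symmetric neighborhood, which is what reduces edge bias and variance.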

Features

This package offers a variety of features, supporting a broad range of use cases:

| Feature | This package | statsmodels | R (stats) |
|---|---|---|---|
| Kernel | 7 options | Tricube only | Tricube only |
| Robustness Weighting | 3 options | Huber only | Huber only |
| Scale Estimation | 2 options | MAR only | MAR only |
| Boundary Padding | 4 options | no padding | no padding |
| Zero Weight Fallback | 3 options | no | no |
| Auto Convergence | yes | no | no |
| Online Mode | yes | no | no |
| Streaming Mode | yes | no | no |
| Confidence Intervals | yes | no | no |
| Prediction Intervals | yes | no | no |
| Cross-Validation | 2 options | no | no |
| Parallel Execution | yes | no | no |
| GPU Acceleration | yes* | no | no |
| no-std Support | yes | no | no |

* GPU acceleration is currently in beta and may not be available on all platforms.
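As background on the robustness-weighting options above, here is a sketch of the classic bisquare rule paired with a MAD scale estimate: points whose residuals are large relative to the scale are down-weighted, to zero in the extreme, before the next smoothing iteration. This is a didactic sketch, not the package's exact implementation:

```python
import statistics

def bisquare_weights(residuals):
    """Bisquare: w = (1 - (r / (6s))^2)^2 for |r| < 6s, else 0, with s a MAD scale."""
    med = statistics.median(residuals)
    s = statistics.median(abs(r - med) for r in residuals)
    cutoff = 6.0 * s
    return [
        (1.0 - (r / cutoff) ** 2) ** 2 if abs(r) < cutoff else 0.0
        for r in residuals
    ]

residuals = [0.1, -0.2, 0.05, 8.0, -0.1]   # one gross outlier
w = bisquare_weights(residuals)
print(w[3])  # 0.0 -- the outlier is excluded from the next fit
```

Auto convergence then simply stops iterating once the fitted values (or weights) change by less than a tolerance between iterations, instead of always running a fixed count.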

Validation

All implementations are numerical twins of R's lowess:

| Aspect | Status | Details |
|---|---|---|
| Accuracy | ✅ EXACT MATCH | Max diff < 1e-12 across all scenarios |
| Consistency | ✅ PERFECT | Multiple scenarios pass with strict tolerance |
| Robustness | ✅ VERIFIED | Robust smoothing matches R exactly |

Installation

Currently available for R, Python, Julia, Rust, Node.js, WebAssembly, and C++:

R (from R-universe, recommended):

```r
install.packages("rfastlowess", repos = "https://thisisamirv.r-universe.dev")
```

Python (from PyPI):

```shell
pip install fastlowess
```

Or from conda-forge:

```shell
conda install -c conda-forge fastlowess
```

Rust (lowess, no_std compatible):

```toml
[dependencies]
lowess = "0.99"
```

Rust (fastLowess, parallel + GPU):

```toml
[dependencies]
fastLowess = { version = "0.99", features = ["cpu"] }
```

Julia (from Julia General Registry):

```julia
using Pkg
Pkg.add("fastLowess")
```

Node.js (from npm):

```shell
npm install fastlowess
```

WebAssembly (from npm):

```shell
npm install fastlowess-wasm
```

Or via CDN:

```html
<script type="module">
  import init, { smooth } from 'https://unpkg.com/fastlowess-wasm@latest';
  await init();
</script>
```

C++ (build from source):

```shell
make cpp
# Links against libfastlowess_cpp.so
```

Quick Example

R:

```r
library(rfastlowess)

x <- c(1, 2, 3, 4, 5)
y <- c(2.0, 4.1, 5.9, 8.2, 9.8)

model <- Lowess(fraction = 0.5, iterations = 3)
result <- model$fit(x, y)
print(result$y)
```

Python:

```python
from fastlowess import Lowess
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

model = Lowess(fraction=0.5, iterations=3)
result = model.fit(x, y)
print(result.y)
```

Rust:

```rust
use lowess::prelude::*;

let x = vec![1.0, 2.0, 3.0, 4.0, 5.0];
let y = vec![2.0, 4.1, 5.9, 8.2, 9.8];

let model = Lowess::new()
    .fraction(0.5)
    .iterations(3)
    .adapter(Batch)
    .build()?;

let result = model.fit(&x, &y)?;
println!("{}", result);
```

Julia:

```julia
using fastlowess

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.1, 5.9, 8.2, 9.8]

result = fit(Lowess(fraction=0.5, iterations=3), x, y)
println(result.y)
```

Node.js:

```javascript
const { Lowess } = require('fastlowess');

const x = [1.0, 2.0, 3.0, 4.0, 5.0];
const y = [2.0, 4.1, 5.9, 8.2, 9.8];

const model = new Lowess({ fraction: 0.5, iterations: 3 });
const result = model.fit(x, y);
console.log(result.y);
```

WebAssembly:

```javascript
import init, { smooth } from 'fastlowess-wasm';

await init();

const x = new Float64Array([1.0, 2.0, 3.0, 4.0, 5.0]);
const y = new Float64Array([2.0, 4.1, 5.9, 8.2, 9.8]);

const result = smooth(x, y, { fraction: 0.5, iterations: 3 });
console.log(result.y);
```

C++:

```cpp
#include <fastlowess.hpp>
#include <iostream>
#include <vector>

int main() {
    std::vector<double> x = {1.0, 2.0, 3.0, 4.0, 5.0};
    std::vector<double> y = {2.0, 4.1, 5.9, 8.2, 9.8};

    fastlowess::LowessOptions options;
    options.fraction = 0.5;
    options.iterations = 3;

    fastlowess::Lowess model(options);
    auto result = model.fit(x, y);

    for (double val : result.y_vector()) std::cout << val << " ";
}
```

API Reference

R:

```r
Lowess(
    fraction = 0.5,
    iterations = 3L,
    delta = 0.01,
    weight_function = "tricube",
    robustness_method = "bisquare",
    zero_weight_fallback = "use_local_mean",
    boundary_policy = "extend",
    confidence_intervals = 0.95,
    prediction_intervals = 0.95,
    return_diagnostics = TRUE,
    return_residuals = TRUE,
    return_robustness_weights = TRUE,
    cv_fractions = c(0.3, 0.5, 0.7),
    cv_method = "kfold",
    cv_k = 5L,
    auto_converge = 1e-4,
    parallel = TRUE
)$fit(x, y)
```

Python:

```python
from fastlowess import Lowess

model = Lowess(
    fraction=0.5,
    iterations=3,
    delta=0.01,
    weight_function="tricube",
    robustness_method="bisquare",
    zero_weight_fallback="use_local_mean",
    boundary_policy="extend",
    confidence_intervals=0.95,
    prediction_intervals=0.95,
    return_diagnostics=True,
    return_residuals=True,
    return_robustness_weights=True,
    cv_fractions=[0.3, 0.5, 0.7],
    cv_method="kfold",
    cv_k=5,
    auto_converge=1e-4,
    parallel=True
)
result = model.fit(x, y)
```

Rust:

```rust
Lowess::new()
    .fraction(0.5)              // Smoothing span (0, 1]
    .iterations(3)              // Robustness iterations
    .delta(0.01)                // Interpolation threshold
    .weight_function(Tricube)   // Kernel selection
    .robustness_method(Bisquare)
    .zero_weight_fallback(UseLocalMean)
    .boundary_policy(Extend)
    .confidence_intervals(0.95)
    .prediction_intervals(0.95)
    .return_diagnostics()
    .return_residuals()
    .return_robustness_weights()
    .cross_validate(KFold(5, &[0.3, 0.5, 0.7]).seed(123))
    .auto_converge(1e-4)
    .adapter(Batch)             // or Streaming, Online
    .parallel(true)             // fastLowess only
    .backend(CPU)               // fastLowess only: CPU or GPU
    .build()?;
```

Julia:

```julia
Lowess(;
    fraction=0.5,
    iterations=3,
    delta=NaN,  # NaN for auto
    weight_function="tricube",
    robustness_method="bisquare",
    zero_weight_fallback="use_local_mean",
    boundary_policy="extend",
    confidence_intervals=NaN,
    prediction_intervals=NaN,
    return_diagnostics=true,
    return_residuals=true,
    return_robustness_weights=true,
    cv_fractions=Float64[], # e.g. [0.3, 0.5]
    cv_method="kfold",
    cv_k=5,
    auto_converge=NaN,
    parallel=true
)
```

Node.js:

```javascript
new Lowess({
    fraction: 0.5,
    iterations: 3,
    delta: 0.01,
    weightFunction: "tricube",
    robustnessMethod: "bisquare",
    zeroWeightFallback: "use_local_mean",
    boundaryPolicy: "extend",
    confidenceIntervals: 0.95,
    predictionIntervals: 0.95,
    returnDiagnostics: true,
    returnResiduals: true,
    returnRobustnessWeights: true,
    cvFractions: [0.3, 0.5, 0.7],
    cvMethod: "kfold",
    cvK: 5,
    autoConverge: 1e-4,
    parallel: true
}).fit(x, y)
```

WebAssembly:

```javascript
smooth(x, y, {
    fraction: 0.5,
    iterations: 3,
    delta: 0.01,
    weightFunction: "tricube",
    robustnessMethod: "bisquare",
    zeroWeightFallback: "use_local_mean",
    boundaryPolicy: "extend",
    confidenceIntervals: 0.95,
    predictionIntervals: 0.95,
    returnDiagnostics: true,
    returnResiduals: true,
    returnRobustnessWeights: true,
    cvFractions: [0.3, 0.5, 0.7],
    cvMethod: "kfold",
    cvK: 5,
    autoConverge: 1e-4
})
```

C++:

```cpp
fastlowess::LowessOptions options;
options.fraction = 0.5;
options.iterations = 3;
options.delta = 0.01;
options.weight_function = "tricube";
options.robustness_method = "bisquare";
options.zero_weight_fallback = "use_local_mean";
options.boundary_policy = "extend";
options.confidence_intervals = 0.95;
options.prediction_intervals = 0.95;
options.return_diagnostics = true;
options.return_residuals = true;
options.return_robustness_weights = true;
options.cv_fractions = {0.3, 0.5, 0.7};
options.cv_method = "kfold";
options.cv_k = 5;
options.auto_converge = 1e-4;
options.parallel = true;

fastlowess::Lowess model(options);
auto result = model.fit(x, y);
```

Result Structure

R:

```r
result$x, result$y, result$standard_errors
result$confidence_lower, result$confidence_upper
result$prediction_lower, result$prediction_upper
result$residuals, result$robustness_weights
result$diagnostics, result$iterations_used
result$fraction_used, result$cv_scores
```

Python:

```python
result.x, result.y, result.standard_errors
result.confidence_lower, result.confidence_upper
result.prediction_lower, result.prediction_upper
result.residuals, result.robustness_weights
result.diagnostics, result.iterations_used
result.fraction_used, result.cv_scores
```

Rust:

```rust
pub struct LowessResult<T> {
    pub x: Vec<T>,                           // Sorted x values
    pub y: Vec<T>,                           // Smoothed y values
    pub standard_errors: Option<Vec<T>>,
    pub confidence_lower: Option<Vec<T>>,
    pub confidence_upper: Option<Vec<T>>,
    pub prediction_lower: Option<Vec<T>>,
    pub prediction_upper: Option<Vec<T>>,
    pub residuals: Option<Vec<T>>,
    pub robustness_weights: Option<Vec<T>>,
    pub diagnostics: Option<Diagnostics<T>>,
    pub iterations_used: Option<usize>,
    pub fraction_used: T,
    pub cv_scores: Option<Vec<T>>,
}
```

Julia:

```julia
result.x, result.y, result.standard_errors
result.confidence_lower, result.confidence_upper
result.prediction_lower, result.prediction_upper
result.residuals, result.robustness_weights
result.diagnostics, result.iterations_used
result.fraction_used
```

Node.js:

```javascript
result.x, result.y, result.standardErrors
result.confidenceLower, result.confidenceUpper
result.predictionLower, result.predictionUpper
result.residuals, result.robustnessWeights
result.diagnostics, result.iterationsUsed
result.fractionUsed, result.cvScores
```

WebAssembly:

```javascript
result.x, result.y, result.standardErrors
result.confidenceLower, result.confidenceUpper
result.predictionLower, result.predictionUpper
result.residuals, result.robustnessWeights
result.diagnostics, result.iterationsUsed
result.fractionUsed, result.cvScores
```

C++:

```cpp
result.y_vector()              // std::vector<double>
result.confidence_lower()      // std::vector<double>
result.confidence_upper()      // std::vector<double>
result.prediction_lower()      // std::vector<double>
result.prediction_upper()      // std::vector<double>
result.residuals()             // std::vector<double>
result.robustness_weights()    // std::vector<double>
result.diagnostics()           // Diagnostics struct
result.iterations_used()       // size_t
result.fraction_used()         // double
```

Contributing

Contributions are welcome! Please see the CONTRIBUTING.md file for more information.

License

Licensed under either of:

  • MIT License
  • Apache License, Version 2.0

at your option.

References

  • Cleveland, W.S. (1979). "Robust Locally Weighted Regression and Smoothing Scatterplots". JASA.
  • Cleveland, W.S. (1981). "LOWESS: A Program for Smoothing Scatterplots". The American Statistician.

Citation

If you use this software in your research, please cite it using the CITATION.cff file or the BibTeX entry below:

@software{lowess_project,
  author = {Valizadeh, Amir},
  title = {LOWESS Project: High-Performance Locally Weighted Scatterplot Smoothing},
  year = {2026},
  url = {https://github.com/thisisamirv/lowess-project},
  license = {MIT OR Apache-2.0}
}
