
High-performance LOWESS smoothing for Python.


LOWESS Project

fastlowess (Python) libfastlowess (C++) rfastlowess (R)

One LOWESS to Rule Them All

The fastest, most robust, and most feature-complete LOWESS (Locally Weighted Scatterplot Smoothing) implementation available, with bindings for Rust, Python, R, Julia, JavaScript, C++, and WebAssembly.

[!IMPORTANT]

The lowess-project provides a complete ecosystem for LOWESS smoothing.


Installation

[!NOTE]

Currently available for R, Python, Rust, Julia, Node.js, WebAssembly, and C++. See INSTALLATION.md for detailed installation instructions.

Documentation

[!NOTE]

📚 View the full documentation


LOESS vs. LOWESS

| Feature | LOESS | LOWESS (this crate) |
|---|---|---|
| Polynomial degree | Linear, quadratic, cubic, quartic | Linear (degree 1) |
| Dimensions | Multivariate (n-D support) | Univariate (1-D only) |
| Flexibility | High (distance metrics) | Standard |
| Complexity | Higher (matrix inversion) | Lower (weighted average/slope) |

[!TIP] For a LOESS implementation, use loess-project.


Why this package?

Speed

The lowess project outperforms comparable implementations in both single-threaded and multi-threaded parallel execution: on average it is 200-327x faster than Python's statsmodels.lowess and 2-3x faster than R's lowess.

For more details on the performance comparison, see the BENCHMARKS file.

Robustness

This implementation is more robust than R's lowess and Python's statsmodels due to two key design choices:

MAD-Based Scale Estimation:

For robustness weight calculations, this crate uses Median Absolute Deviation (MAD) for scale estimation:

s = median(|r_i - median(r)|)

In contrast, statsmodels and R's lowess use the median of absolute residuals (MAR):

s = median(|r_i|)
  • MAD has the optimal breakdown point: it remains valid even when up to 50% of the data are outliers.
  • The median-centering step removes asymmetric bias from residual distributions.
  • MAD provides consistent outlier detection regardless of whether residuals are centered around zero.
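The difference between the two scale estimates is easy to see in a few lines of plain Python. This is an illustrative sketch of the two formulas above, not this package's internals:

```python
from statistics import median

def mar_scale(residuals):
    """Median of absolute residuals (the statsmodels / R approach)."""
    return median(abs(r) for r in residuals)

def mad_scale(residuals):
    """Median absolute deviation: center on the median first."""
    m = median(residuals)
    return median(abs(r - m) for r in residuals)

# Asymmetric residuals: all positive, with one outlier.
residuals = [0.9, 1.0, 1.1, 1.2, 8.0]
print(mar_scale(residuals))  # 1.1 -- inflated by the asymmetric offset
print(mad_scale(residuals))  # ~0.1 -- spread around the median
```

Because MAR conflates the residuals' offset with their spread, it overestimates the scale here by an order of magnitude, which in turn weakens outlier downweighting.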

Boundary Padding:

This crate supports several boundary policies at dataset edges:

  • Extend: Repeats edge values to maintain local neighborhood size.
  • Reflect: Mirrors data symmetrically around boundaries.
  • Zero: Pads with zeros (useful for signal processing).
  • NoBoundary: Original Cleveland behavior (no padding).

statsmodels and R's lowess do not apply boundary padding, which can lead to:

  • Biased estimates near boundaries due to asymmetric local neighborhoods.
  • Increased variance at the edges of the smoothed curve.
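The first three policies correspond to familiar array-padding modes. The NumPy sketch below illustrates what each policy does to the data near an edge; the mapping to NumPy's mode names is ours, for illustration only:

```python
import numpy as np

y = np.array([1.0, 2.0, 3.0, 4.0])

# Extend: repeat edge values (NumPy's "edge" mode).
print(np.pad(y, 2, mode="edge"))      # [1. 1. 1. 2. 3. 4. 4. 4.]

# Reflect: mirror data around the boundary sample (NumPy's "reflect").
print(np.pad(y, 2, mode="reflect"))   # [3. 2. 1. 2. 3. 4. 3. 2.]

# Zero: pad with zeros (NumPy's "constant" mode).
print(np.pad(y, 2, mode="constant"))  # [0. 0. 1. 2. 3. 4. 0. 0.]
```

With padding, the first and last points see a full-width local neighborhood instead of a one-sided one, which is what reduces edge bias and variance.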

Features

A variety of features, supporting a range of use cases:

| Feature | This package | statsmodels | R (stats) |
|---|---|---|---|
| Kernel | 7 options | Tricube only | Tricube only |
| Robustness weighting | 3 options | Huber only | Huber only |
| Scale estimation | 2 options | MAR only | MAR only |
| Boundary padding | 4 options | No padding | No padding |
| Zero-weight fallback | 3 options | No | No |
| Auto convergence | Yes | No | No |
| Online mode | Yes | No | No |
| Streaming mode | Yes | No | No |
| Confidence intervals | Yes | No | No |
| Prediction intervals | Yes | No | No |
| Cross-validation | 2 options | No | No |
| Parallel execution | Yes | No | No |
| GPU acceleration | Yes* | No | No |
| no-std support | Yes | No | No |

* GPU acceleration is currently in beta and may not be available on all platforms.

Validation

All implementations are numerical twins of R's lowess:

| Aspect | Status | Details |
|---|---|---|
| Accuracy | ✅ EXACT MATCH | Max diff < 1e-12 across all scenarios |
| Consistency | ✅ PERFECT | Multiple scenarios pass with strict tolerance |
| Robustness | ✅ VERIFIED | Robust smoothing matches R exactly |

API Reference

R:

Lowess(
    fraction = 0.5,
    iterations = 3L,
    delta = 0.01,
    weight_function = "tricube",
    robustness_method = "bisquare",
    zero_weight_fallback = "use_local_mean",
    boundary_policy = "extend",
    confidence_intervals = 0.95,
    prediction_intervals = 0.95,
    return_diagnostics = TRUE,
    return_residuals = TRUE,
    return_robustness_weights = TRUE,
    cv_fractions = c(0.3, 0.5, 0.7),
    cv_method = "kfold",
    cv_k = 5L,
    auto_converge = 1e-4,
    parallel = TRUE
)$fit(x, y)

# Result structure:
result$x,
result$y,
result$standard_errors,
result$confidence_lower,
result$confidence_upper,
result$prediction_lower,
result$prediction_upper,
result$residuals,
result$robustness_weights,
result$diagnostics,
result$iterations_used,
result$fraction_used,
result$cv_scores

Python:

from fastlowess import Lowess

model = Lowess(
    fraction=0.5,
    iterations=3,
    delta=0.01,
    weight_function="tricube",
    robustness_method="bisquare",
    zero_weight_fallback="use_local_mean",
    boundary_policy="extend",
    confidence_intervals=0.95,
    prediction_intervals=0.95,
    return_diagnostics=True,
    return_residuals=True,
    return_robustness_weights=True,
    cv_fractions=[0.3, 0.5, 0.7],
    cv_method="kfold",
    cv_k=5,
    auto_converge=1e-4,
    parallel=True
)
result = model.fit(x, y)

# Result structure:
result.x,
result.y,
result.standard_errors,
result.confidence_lower,
result.confidence_upper,
result.prediction_lower,
result.prediction_upper,
result.residuals,
result.robustness_weights,
result.diagnostics,
result.iterations_used,
result.fraction_used,
result.cv_scores
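For intuition about what the parameters above control, the core of a single non-robust LOWESS pass (fraction-based neighborhoods, tricube weights, degree-1 local fits) can be sketched in plain NumPy. This is a simplified teaching sketch, not the package's actual implementation, and it omits robustness iterations, delta-based interpolation, and boundary handling:

```python
import numpy as np

def lowess_pass(x, y, fraction=0.5):
    """One non-robust LOWESS pass: tricube-weighted local linear fits.

    Illustrative only; the real package adds robustness iterations,
    delta interpolation, boundary policies, intervals, etc.
    """
    n = len(x)
    k = max(2, int(np.ceil(fraction * n)))     # neighborhood size
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    out = np.empty(n)
    for i in range(n):
        d = np.abs(xs - xs[i])
        idx = np.argsort(d)[:k]                # k nearest neighbors in x
        h = d[idx].max() or 1.0                # local bandwidth
        w = (1 - (d[idx] / h) ** 3) ** 3       # tricube kernel weights
        # Weighted least-squares line through the neighborhood
        # (np.polyfit's w multiplies residuals, hence sqrt).
        b = np.polyfit(xs[idx], ys[idx], 1, w=np.sqrt(w))
        out[i] = np.polyval(b, xs[i])
    return xs, out
```

A quick sanity check: smoothing noiseless linear data reproduces it exactly, since every local degree-1 fit recovers the line regardless of the weights.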

Rust:

let model = Lowess::new()
    .fraction(0.5)
    .iterations(3)
    .delta(0.01)
    .weight_function(Tricube)
    .robustness_method(Bisquare)
    .zero_weight_fallback(UseLocalMean)
    .boundary_policy(Extend)
    .confidence_intervals(0.95)
    .prediction_intervals(0.95)
    .return_diagnostics()
    .return_residuals()
    .return_robustness_weights()
    .cross_validate(KFold(5, &[0.3, 0.5, 0.7]).seed(123))
    .auto_converge(1e-4)
    .adapter(Batch)
    .parallel(true)             // fastLowess only
    .backend(CPU)               // fastLowess only: CPU or GPU
    .build()?;

let result = model.fit(x, y);

// Result structure:
pub struct LowessResult<T> {
    pub x: Vec<T>,                           // Sorted x values
    pub y: Vec<T>,                           // Smoothed y values
    pub standard_errors: Option<Vec<T>>,
    pub confidence_lower: Option<Vec<T>>,
    pub confidence_upper: Option<Vec<T>>,
    pub prediction_lower: Option<Vec<T>>,
    pub prediction_upper: Option<Vec<T>>,
    pub residuals: Option<Vec<T>>,
    pub robustness_weights: Option<Vec<T>>,
    pub diagnostics: Option<Diagnostics<T>>,
    pub iterations_used: Option<usize>,
    pub fraction_used: T,
    pub cv_scores: Option<Vec<T>>,
}

Julia:

Lowess(;
    fraction=0.5,
    iterations=3,
    delta=NaN,  # NaN for auto
    weight_function="tricube",
    robustness_method="bisquare",
    zero_weight_fallback="use_local_mean",
    boundary_policy="extend",
    confidence_intervals=NaN,
    prediction_intervals=NaN,
    return_diagnostics=true,
    return_residuals=true,
    return_robustness_weights=true,
    cv_fractions=Float64[], # e.g. [0.3, 0.5]
    cv_method="kfold",
    cv_k=5,
    auto_converge=NaN,
    parallel=true
)

# Result structure:
result.x,
result.y,
result.standard_errors,
result.confidence_lower,
result.confidence_upper,
result.prediction_lower,
result.prediction_upper,
result.residuals,
result.robustness_weights,
result.diagnostics,
result.iterations_used,
result.fraction_used,
result.cv_scores

Node.js:

new Lowess({
    fraction: 0.5,
    iterations: 3,
    delta: 0.01,
    weightFunction: "tricube",
    robustnessMethod: "bisquare",
    zeroWeightFallback: "use_local_mean",
    boundaryPolicy: "extend",
    confidenceIntervals: 0.95,
    predictionIntervals: 0.95,
    returnDiagnostics: true,
    returnResiduals: true,
    returnRobustnessWeights: true,
    cvFractions: [0.3, 0.5, 0.7],
    cvMethod: "kfold",
    cvK: 5,
    autoConverge: 1e-4,
    parallel: true
}).fit(x, y)

// Result structure:
result.x,
result.y,
result.standardErrors,
result.confidenceLower,
result.confidenceUpper,
result.predictionLower,
result.predictionUpper,
result.residuals,
result.robustnessWeights,
result.diagnostics,
result.iterationsUsed,
result.fractionUsed,
result.cvScores

WebAssembly:

smooth(x, y, {
    fraction: 0.5,
    iterations: 3,
    delta: 0.01,
    weightFunction: "tricube",
    robustnessMethod: "bisquare",
    zeroWeightFallback: "use_local_mean",
    boundaryPolicy: "extend",
    confidenceIntervals: 0.95,
    predictionIntervals: 0.95,
    returnDiagnostics: true,
    returnResiduals: true,
    returnRobustnessWeights: true,
    cvFractions: [0.3, 0.5, 0.7],
    cvMethod: "kfold",
    cvK: 5,
    autoConverge: 1e-4,
    parallel: true
})

// Result structure:
result.x,
result.y,
result.standardErrors,
result.confidenceLower,
result.confidenceUpper,
result.predictionLower,
result.predictionUpper,
result.residuals,
result.robustnessWeights,
result.diagnostics,
result.iterationsUsed,
result.fractionUsed,
result.cvScores

C++:

fastlowess::LowessOptions options;
options.fraction = 0.5;
options.iterations = 3;
options.delta = 0.01;
options.weight_function = "tricube";
options.robustness_method = "bisquare";
options.zero_weight_fallback = "use_local_mean";
options.boundary_policy = "extend";
options.confidence_intervals = 0.95;
options.prediction_intervals = 0.95;
options.return_diagnostics = true;
options.return_residuals = true;
options.return_robustness_weights = true;
options.cv_fractions = {0.3, 0.5, 0.7};
options.cv_method = "kfold";
options.cv_k = 5;
options.auto_converge = 1e-4;
options.parallel = true;

fastlowess::Lowess model(options);
auto result = model.fit(x, y);

// Result structure:
result.x_vector(),
result.y_vector(),
result.standard_errors(),
result.confidence_lower(),
result.confidence_upper(),
result.prediction_lower(),
result.prediction_upper(),
result.residuals(),
result.robustness_weights(),
result.diagnostics(),
result.iterations_used(),
result.fraction_used(),
result.cv_scores()

Contributing

Contributions are welcome! Please see the CONTRIBUTING.md file for more information.

License

Licensed under MIT or Apache-2.0.

Citation

If you use this software in your research, please cite it using the CITATION.cff file or the BibTeX entry below:

@software{lowess_project,
  author = {Valizadeh, Amir},
  title = {LOWESS Project: High-Performance Locally Weighted Scatterplot Smoothing},
  year = {2026},
  url = {https://github.com/thisisamirv/lowess-project},
  license = {MIT OR Apache-2.0}
}

Download files

Download the file for your platform.

Source Distribution

  • fastlowess-1.0.0.tar.gz (39.5 kB): Source

Built Distributions

  • fastlowess-1.0.0-cp38-abi3-win_amd64.whl (329.6 kB): CPython 3.8+, Windows x86-64
  • fastlowess-1.0.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (456.5 kB): CPython 3.8+, manylinux glibc 2.17+, x86-64
  • fastlowess-1.0.0-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (439.7 kB): CPython 3.8+, manylinux glibc 2.17+, ARM64
  • fastlowess-1.0.0-cp38-abi3-macosx_11_0_arm64.whl (406.8 kB): CPython 3.8+, macOS 11.0+, ARM64
  • fastlowess-1.0.0-cp38-abi3-macosx_10_12_x86_64.whl (428.5 kB): CPython 3.8+, macOS 10.12+, x86-64

File details

All files were uploaded via Trusted Publishing (publisher: release-pypi.yml on thisisamirv/lowess-project) using twine/6.1.0 on CPython 3.13.7, with signed attestation bundles.

| File | SHA256 |
|---|---|
| fastlowess-1.0.0.tar.gz | 6ee95afe8d4de2a8afbc4a218708d7f9f36b6483743a7f705116d1a0306c610e |
| fastlowess-1.0.0-cp38-abi3-win_amd64.whl | e3d49e2b8658d7ca2f44145c286a5b1a3f9a6ed33e85ce1609cd55f9373d6797 |
| fastlowess-1.0.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl | 532feac2656a1de6809622b972b10ea58ae4347220c92648fd15fc08a39acf2f |
| fastlowess-1.0.0-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl | 5159f62a71a1a46e8a3e20ba398288e4d2ae4e0ca2a736ad6f7ebde5595561e0 |
| fastlowess-1.0.0-cp38-abi3-macosx_11_0_arm64.whl | 2d44eedc124550efcc05a3ca2ddbc74dc1a180f054f7f837dd6030b6e288c32a |
| fastlowess-1.0.0-cp38-abi3-macosx_10_12_x86_64.whl | 400d37d9c396a82c54ebfd3fdcaa4c7b7998b7ab3d86eeecce96fcc336b1ebe6 |
