
Price Contour

High-performance insurance price optimisation via Lagrangian dual decomposition.


Python 3.10+ | Rust | Polars | AGPL-3.0


Price Contour finds optimal price scenario values across a portfolio of insurance risks subject to business constraints. Give it a scored dataset with objective and constraint values at discrete price points, and it returns the scenario value per quote that maximises your objective while respecting every constraint.

The core algorithm is Lagrangian dual decomposition, implemented in Rust for speed and exposed to Python via zero-copy Polars DataFrames. A portfolio of 1M+ risks solves in seconds.


Quick start

uv add price-contour

import polars as pl
import price_contour as pc

# Long-format DataFrame: one row per (quote, price_scenario)
# with pre-computed objective and constraint values
df = pl.read_parquet("scored_quotes.parquet")

optimiser = pc.OnlineOptimiser(
    objective="income",
    constraints={"volume": {"min": 0.90}},  # retain at least 90% of baseline volume
    quote_id="quote_id",
    scenario_index="scenario_index",
    scenario_value="scenario_value",
)

result = optimiser.solve(df)

print(result.converged)        # True
print(result.iterations)       # 23
print(result.lambdas)          # {'volume': 0.147}
print(result.total_objective)  # 1_284_302.5

# Per-quote optimal scenario values as a Polars DataFrame
out = result.dataframe
print(out.head())
# ┌──────────┬──────────────┬────────────────────────┬────────────────┬────────────────┐
# │ quote_id │ optimal_step │ optimal_scenario_value │ optimal_income │ optimal_volume │
# ╞══════════╪══════════════╪════════════════════════╪════════════════╪════════════════╡
# │ Q001     │ 14           │ 1.07                   │ 42.30          │ 0.82           │
# │ Q002     │ 11           │ 0.98                   │ 18.55          │ 0.91           │
# └──────────┴──────────────┴────────────────────────┴────────────────┴────────────────┘

What it does

Price Contour operates on pre-computed scenario data. It does not fit models or generate demand curves. Upstream, your pricing pipeline scores every quote at a grid of price scenario values (e.g. 0.8, 0.85, 0.9, ..., 1.2) and computes what the expected income, volume, loss ratio, etc. would be at each point. Price Contour then selects the optimal scenario value per quote across the portfolio.

The input is a long-format Polars DataFrame:

quote_id  scenario_index  scenario_value  income  volume  loss_ratio
Q001      0               0.80            85.2    0.95    0.62
Q001      1               0.90            92.1    0.88    0.59
Q001      2               1.00            100.0   0.80    0.60
Q002      0               0.80            42.0    0.97    0.58
...       ...             ...             ...     ...     ...

The output is one optimal scenario value per quote, chosen to maximise portfolio-level income while keeping portfolio-level volume above 90% of baseline (or whatever constraints you set).


Three optimisation modes

Online optimisation

Find the optimal scenario value per individual quote. Each quote independently picks its best price point, coordinated by shared Lagrange multipliers that enforce portfolio-level constraints.

optimiser = pc.OnlineOptimiser(
    objective="income",
    constraints={"volume": {"min": 0.90}},
)
result = optimiser.solve(df)

Ratebook optimisation

Find optimal rating factors across rating dimensions. Rather than picking a scenario value per quote, this mode finds the best factor value for each level of each rating factor (e.g. age band, region, vehicle power), applied uniformly to all quotes sharing that level.

optimiser = pc.RatebookOptimiser(
    objective="income",
    constraints={"volume": {"min": 0.90}},
    factor_columns=[["age_band"], ["region"], ["vehicle_power"]],
)

result = optimiser.solve(df, factors=factor_df)

print(result.factor_tables)
# {'age_band': {'18-25': 1.15, '26-35': 1.02, '36-50': 0.95, '51+': 0.98},
#  'region': {'London': 1.08, 'South East': 1.01, 'North': 0.93},
#  'vehicle_power': {'Low': 0.97, 'Medium': 1.0, 'High': 1.06}}

# Save to disk
result.save("parameters/")

# Convert to rating-step DataFrames
tables = result.to_rating_entries()

Live scoring with stored lambdas

Apply pre-computed Lagrange multipliers to new quotes in a single forward pass, with no iteration. Use this in production to score individual quotes using lambdas learned from a batch solve.

# Batch solve (offline)
result = optimiser.solve(df_portfolio)
lambdas = result.lambdas

# Live scoring (per-quote, no iteration)
applier = pc.ApplyOptimiser(
    lambdas=lambdas,
    objective="income",
    constraints={"volume": {"min": 0.90}},
)
applier.save("config/applier.json")

# Later, in production:
applier = pc.ApplyOptimiser.load("config/applier.json")
live_result = applier.apply(df_single_quote)
optimal_scenario_value = live_result.dataframe["optimal_scenario_value"][0]

Efficient frontier

Sweep constraint thresholds to generate the Pareto frontier - the trade-off curve between your objective and constraints. Each point on the frontier is a full portfolio solve at a different constraint target.

frontier = optimiser.frontier(
    df,
    threshold_ranges={"volume": (0.85, 1.0)},
    n_points_per_dim=20,
)

# DataFrame with one row per frontier point
print(frontier.points)
# ┌──────────────────┬─────────────────┬──────────────┬───────────────┬────────────┬───────────┬─────────┬─────────────────┐
# │ threshold_volume │ total_objective │ total_volume │ lambda_volume │ iterations │ converged │ sv_mean │ sv_pct_increase │
# ╞══════════════════╪═════════════════╪══════════════╪═══════════════╪════════════╪═══════════╪═════════╪═════════════════╡
# │ 0.85             │ 1_350_102       │ 0.851        │ 0.089         │ 18         │ true      │ 1.04    │ 0.62            │
# │ 0.86             │ 1_342_891       │ 0.861        │ 0.102         │ 21         │ true      │ 1.03    │ 0.58            │
# │ ...              │ ...             │ ...          │ ...           │ ...        │ ...       │ ...     │ ...             │
# └──────────────────┴─────────────────┴──────────────┴───────────────┴────────────┴───────────┴─────────┴─────────────────┘

Adjacent points are warm-started from each other (nearest-neighbour traversal of the threshold grid), so the full frontier solves much faster than running each point independently. Each point also includes scenario value distribution statistics (sv_mean, sv_std, percentiles, sv_pct_increase/sv_pct_decrease).
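Choosing an operating point from the frontier is then ordinary filtering. A hedged sketch with made-up frontier rows, using the column names from the table above:

```python
# Hypothetical frontier points (a subset of the columns shown above).
points = [
    {"threshold_volume": 0.85, "total_objective": 1_350_102, "converged": True},
    {"threshold_volume": 0.90, "total_objective": 1_284_302, "converged": True},
    {"threshold_volume": 0.95, "total_objective": 1_170_450, "converged": False},
]

# Keep converged solves only, then take the most demanding volume target
# whose objective stays within 5% of the best converged point.
converged = [p for p in points if p["converged"]]
best = max(p["total_objective"] for p in converged)
acceptable = [p for p in converged if p["total_objective"] >= 0.95 * best]
chosen = max(acceptable, key=lambda p: p["threshold_volume"])
print(chosen["threshold_volume"])  # 0.9
```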


Constraint format

Constraints are specified as a dictionary. Keys are column names in your DataFrame, values specify the direction and threshold relative to the baseline (the portfolio totals at scenario_value = 1.0):

constraints = {
    "volume": {"min": 0.90},            # portfolio volume >= 90% of baseline
    "loss_ratio": {"max": 1.05},        # portfolio loss ratio <= 105% of baseline
    "premium": {"min_abs": 1_000_000},  # absolute: portfolio premium >= 1M
}
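To make the baseline-relative semantics concrete, here is a toy check with illustrative numbers (min_abs would compare against the absolute threshold directly instead):

```python
# Hypothetical portfolio totals: at scenario_value = 1.0 (the baseline)
# and at the optimised prices.
baseline_volume = 10_000.0
solved_volume = 9_300.0

def satisfies_min(total: float, baseline: float, threshold: float) -> bool:
    """Relative 'min' constraint: total must be >= threshold * baseline."""
    return total >= threshold * baseline

print(satisfies_min(solved_volume, baseline_volume, 0.90))  # True
print(satisfies_min(8_900.0, baseline_volume, 0.90))        # False
```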

Direct Parquet loading

For large datasets, build the internal grid directly from a Parquet file without materialising a DataFrame in Python memory:

grid = pc.build_grid_from_parquet(
    "scored_quotes.parquet",
    constraint_columns=["volume", "loss_ratio"],
    objective="income",
)
result = optimiser.solve(grid)

Incremental grid building

For large datasets that don't fit in memory at once, build the internal grid incrementally:

builder = pc.QuoteGridBuilder(
    ["volume", "loss_ratio"],
    quote_id="quote_id",
    scenario_index="scenario_index",
    scenario_value_col="scenario_value",
    objective="income",
)

for chunk in data_source.iter_chunks(100_000):
    builder.append(chunk)

grid = builder.build()
result = optimiser.solve(grid)

MLflow integration

Both OnlineOptimiser and RatebookOptimiser produce MLflow-ready summaries:

result = optimiser.solve(df)
summary = optimiser.summary(result)

import mlflow
mlflow.log_params(summary["params"])
mlflow.log_metrics(summary["metrics"])
mlflow.log_dict(summary["artifacts"]["lambdas"], "lambdas.json")
mlflow.log_dict(summary["artifacts"]["config"], "config.json")

How it works

The algorithm

Price Contour solves the constrained optimisation problem:

Maximise    sum_i  objective(quote_i, scenario_value_i)
Subject to  sum_i  constraint_k(quote_i, scenario_value_i) >= threshold_k   for all k
            scenario_value_i in {discrete grid}

This is a combinatorial problem (each quote picks from M discrete scenario values). Lagrangian dual decomposition relaxes the coupling constraints into the objective using dual variables (lambdas), decomposing it into N independent per-quote subproblems:

For fixed lambdas:
    Each quote picks:  argmax_m [ objective(i, m) + sum_k lambda_k * constraint_k(i, m) ]

These are independent and embarrassingly parallel.

The outer loop updates lambdas via the subgradient method with adaptive step sizes, iterating until all constraints are satisfied and lambdas converge.
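A toy version of the whole loop, in pure Python on a tiny made-up grid (the real solver is the parallel Rust core, with adaptive step scaling and lambda averaging on top of this basic scheme):

```python
# Toy grid: 2 quotes x 3 scenario steps, one 'volume'-style constraint.
objective = [[85.2, 92.1, 100.0], [42.0, 45.3, 48.0]]  # objective(i, m)
volume = [[0.95, 0.88, 0.80], [0.97, 0.90, 0.83]]      # constraint(i, m)
threshold = 1.70                                       # portfolio volume floor

lam, step = 0.0, 50.0
for _ in range(200):
    # Inner step: per-quote argmax of the Lagrangian, independent across quotes.
    picks = [
        max(range(3), key=lambda m: objective[i][m] + lam * volume[i][m])
        for i in range(2)
    ]
    total_volume = sum(volume[i][picks[i]] for i in range(2))
    # Outer step: subgradient update on lambda; lambda stays non-negative.
    lam = max(0.0, lam + step * (threshold - total_volume))
    step *= 0.95

print(picks)  # [2, 1] -- the second quote trades income for volume to meet the floor
```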

Performance

The Rust core uses:

  • Quote-major memory layout - each quote's M scenario values are contiguous, optimising the per-quote argmax inner loop for cache locality
  • Rayon parallelism - the argmax across quotes is parallelised within chunks of 4096 quotes
  • Chunked processing - large portfolios are processed in chunks (default 500K quotes) to bound memory usage
  • Adaptive step scaling - per-constraint scale factors normalise for differing magnitudes, so the algorithm works equally well for constraints ranging from 0.1 to 1,000,000
  • Lambda averaging - smooths the oscillations inherent in discrete Lagrangian relaxation where all quotes can flip simultaneously

Ratebook mode

For ratebook optimisation, coordinate descent iterates over rating factors. For each factor, a grouped Lagrangian solve finds the best discrete factor value per group (e.g. per age band), with the individual quote scenario value computed as the product of all factor values times a per-quote residual. The inner grouped solve uses the same Lagrangian machinery with remapping to the nearest grid point.
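The composition and remapping step can be sketched as follows (hypothetical factor values and grid; this mirrors the nearest-grid-point behaviour described above, not the library's internal code):

```python
from bisect import bisect_left

scenario_grid = [0.80, 0.85, 0.90, 0.95, 1.00, 1.05, 1.10, 1.15, 1.20]

def nearest_grid_point(x: float, grid: list[float]) -> float:
    """Remap a composed scenario value to the nearest discrete grid point."""
    i = bisect_left(grid, x)
    if i == 0:
        return grid[0]
    if i == len(grid):
        return grid[-1]
    return min(grid[i - 1], grid[i], key=lambda g: abs(g - x))

# One quote: scenario value = product of its factor values times a residual.
factors = {"age_band": 1.15, "region": 0.93, "vehicle_power": 1.00}
residual = 1.01
composed = residual
for value in factors.values():
    composed *= value

print(nearest_grid_point(composed, scenario_grid))  # 1.1
```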


Architecture

price-contour/
├── crates/
│   ├── price-contour-core/        # Pure Rust: algorithms, data structures, solver
│   │   └── src/
│   │       ├── data.rs            # QuoteGrid, SolverConfig, SolveResult, GroupMapping
│   │       ├── solver/
│   │       │   ├── online.rs      # Lagrangian dual decomposition
│   │       │   ├── grouped.rs     # Grouped solve (ratebook inner loop)
│   │       │   ├── argmax.rs      # Per-quote Lagrangian argmax (parallel)
│   │       │   ├── lambda.rs      # Subgradient lambda updates
│   │       │   └── apply.rs       # Fixed-lambda forward pass
│   │       ├── frontier.rs        # Efficient frontier sweeping
│   │       ├── constants.rs       # Solver defaults
│   │       └── error.rs           # Error types
│   └── price-contour/             # PyO3 bindings (thin wrappers)
│       └── src/
│           ├── solver_py.rs       # DataFrame ingestion + solve
│           ├── grouped_py.rs      # Grouped solve bindings
│           ├── apply_py.rs        # Apply bindings
│           ├── frontier_py.rs     # Frontier bindings
│           ├── builder_py.rs      # QuoteGridBuilder bindings
│           ├── grid_py.rs         # QuoteGrid bindings
│           └── parquet_grid_py.rs # Parquet → QuoteGrid loader
├── python/
│   └── price_contour/
│       ├── solver.py              # OnlineOptimiser
│       ├── ratebook.py            # RatebookOptimiser + RatebookResult
│       ├── apply.py               # ApplyOptimiser + apply_from_grid
│       ├── frontier.py            # FrontierResult helpers + frontier_summary
│       └── builder.py             # QuoteGridBuilder wrapper
├── tests/
│   └── python/                    # Integration tests
├── notebooks/                     # Demo notebooks
├── docs/                          # Design documentation
└── scripts/                       # Utility scripts

The pure-Rust core (price-contour-core) has no Python dependencies and can be tested independently with cargo test. The PyO3 crate (price-contour) is a thin binding layer that converts between Polars DataFrames and the internal QuoteGrid representation with zero-copy where possible.


Development

# Clone
git clone https://github.com/PricingFrontier/price-contour.git
cd price-contour

# Install in development mode (compiles Rust, links Python)
uv sync --all-groups
maturin develop

# Run Rust tests
cargo test

# Run Python tests
pytest

# Rebuild after Rust changes
maturin develop

Requirements: Rust toolchain (stable), Python 3.10+, maturin.


API reference

OnlineOptimiser

solve(df_or_grid, *, lambdas=None) - Run the full optimisation. Returns SolveResult.
frontier(df_or_grid, *, threshold_ranges, n_points_per_dim=10, initial_lambdas=None) - Sweep the efficient frontier. Returns FrontierResult.
summary(result) - Package a result into MLflow-ready params, metrics, and artifacts dicts.
config_dict() - Return the serialisable solver configuration.

RatebookOptimiser

solve(df_or_grid, factors, *, factor_columns=None, lambdas=None) - Run ratebook optimisation via coordinate descent. Returns RatebookResult.
frontier(df_or_grid, factors, *, threshold_ranges, n_points_per_dim=5, factor_columns=None, initial_lambdas=None) - Sweep the efficient frontier via coordinate descent at each threshold. Returns FrontierResult.
summary(result) - Package a result into MLflow-ready dicts.

ApplyOptimiser

apply(df) - Single-pass scoring with fixed lambdas. Returns ApplyResult.
save(path) - Save config and lambdas to JSON.
ApplyOptimiser.load(path) - Load from a saved JSON file.

QuoteGridBuilder

append(df) - Add a chunk of quotes.
build() - Finalise and return a QuoteGrid.

SolveResult

converged (bool) - Whether the solver converged.
iterations (int) - Number of iterations taken.
lambdas (dict[str, float]) - Final Lagrange multipliers (shadow prices) per constraint.
total_objective (float) - Portfolio-level objective at the optimal solution.
total_constraints (dict[str, float]) - Portfolio-level constraint totals.
baseline_objective (float) - Objective at scenario_value = 1.0.
baseline_constraints (dict[str, float]) - Constraints at scenario_value = 1.0.
dataframe (pl.DataFrame) - Per-quote results with optimal scenario values.
history (list[dict] | None) - Per-iteration convergence records (if record_history=True).
n_quotes (int) - Number of quotes in the grid.
n_steps (int) - Number of scenario value steps.
scenario_values (list[float]) - The scenario value grid.
grid (QuoteGrid) - The internal grid (reusable for subsequent solves or apply).

ApplyResult

total_objective (float) - Portfolio-level objective.
total_constraints (dict[str, float]) - Portfolio-level constraint totals.
baseline_objective (float) - Objective at scenario_value = 1.0.
baseline_constraints (dict[str, float]) - Constraints at scenario_value = 1.0.
lambdas (dict[str, float]) - Applied Lagrange multipliers.
dataframe (pl.DataFrame) - Per-quote results with optimal scenario values.

FrontierResult

points (pl.DataFrame) - One row per frontier point with threshold_*, total_objective, total_*, lambda_*, iterations, converged, and scenario value statistics (sv_mean, sv_std, sv_min, sv_p5, sv_p95, sv_max, sv_pct_increase, sv_pct_decrease).
n_points (int) - Number of frontier points.

RatebookResult

factor_tables (dict[str, dict[str, float]]) - Factor name to level-value mapping.
lambdas (dict[str, float]) - Final Lagrange multipliers.
total_objective (float) - Portfolio-level objective at the optimal solution.
total_constraints (dict[str, float]) - Portfolio-level constraint totals.
baseline_objective (float) - Objective at scenario_value = 1.0.
baseline_constraints (dict[str, float]) - Constraints at scenario_value = 1.0.
converged (bool) - Whether coordinate descent converged.
cd_iterations (int) - Coordinate descent iterations.
clamp_rate (float) - Fraction of remappings that hit a grid boundary.
per_factor_results (list[GroupedSolveResult]) - Per-factor inner solve results.
save(path) - Save factor tables to a directory (one JSON per factor).
to_rating_entries() - Convert to rating-step DataFrames (dict[str, pl.DataFrame]).

Utility functions

build_grid_from_parquet(path, *, constraint_columns, ...) - Build a QuoteGrid directly from a Parquet file without materialising a DataFrame in Python.
apply_from_grid(grid, lambdas, constraints, *, chunk_size=500_000) - Single-pass Lagrangian apply on an existing QuoteGrid. Returns ApplyResult.
frontier_summary(frontier_result, selected_index) - Package a frontier result into MLflow-ready params, metrics, and artifacts dicts.

License

Price Contour is licensed under the GNU Affero General Public License v3.0.
