
Price Contour

High-performance insurance price optimisation via Lagrangian dual decomposition.


Python 3.10+ Rust Polars AGPL-3.0


Price Contour finds optimal price scenario values across a portfolio of insurance risks subject to business constraints. Give it a scored dataset with objective and constraint values at discrete price points, and it returns the scenario value per quote that maximises your objective while respecting every constraint.

The core algorithm is Lagrangian dual decomposition, implemented in Rust for speed and exposed to Python via zero-copy Polars DataFrames. A portfolio of 1M+ risks solves in seconds.


Quick start

uv add price-contour
import polars as pl
import price_contour as pc

# Long-format DataFrame: one row per (quote, price_scenario)
# with pre-computed objective and constraint values
df = pl.read_parquet("scored_quotes.parquet")

optimiser = pc.OnlineOptimiser(
    objective="income",
    constraints={"volume": {"min_pct": 0.90}},  # retain at least 90% of baseline volume
    quote_id="quote_id",
    scenario_index="scenario_index",
    scenario_value="scenario_value",
)

result = optimiser.solve(df)

print(result.converged)        # True
print(result.iterations)       # 23
print(result.lambdas)          # {'volume': 0.147}
print(result.total_objective)  # 1_284_302.5

# Per-quote optimal scenario values as a Polars DataFrame
out = result.dataframe
print(out.head())
# ┌──────────┬──────────────┬────────────────────────┬────────────────┬────────────────┐
# │ quote_id │ optimal_step │ optimal_scenario_value │ optimal_income │ optimal_volume │
# ╞══════════╪══════════════╪════════════════════════╪════════════════╪════════════════╡
# │ Q001     │ 14           │ 1.07                   │ 42.30          │ 0.82           │
# │ Q002     │ 11           │ 0.98                   │ 18.55          │ 0.91           │
# └──────────┴──────────────┴────────────────────────┴────────────────┴────────────────┘

What it does

Price Contour operates on pre-computed scenario data. It does not fit models or generate demand curves. Upstream, your pricing pipeline scores every quote at a grid of price scenario values (e.g. 0.8, 0.85, 0.9, ..., 1.2) and computes what the expected income, volume, loss ratio, etc. would be at each point. Price Contour then selects the optimal scenario value per quote across the portfolio.

The input is a long-format Polars DataFrame:

quote_id  scenario_index  scenario_value  income  volume  loss_ratio
Q001      0               0.80            85.2    0.95    0.62
Q001      1               0.90            92.1    0.88    0.59
Q001      2               1.00            100.0   0.80    0.60
Q002      0               0.80            42.0    0.97    0.58
...       ...             ...             ...     ...     ...

The output is one optimal scenario value per quote, chosen to maximise portfolio-level income while keeping portfolio-level volume above 90% of baseline (or whatever constraints you set).
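A minimal sketch of what such an upstream scoring step produces (the demand model and column names here are purely illustrative; any pipeline that emits one row per (quote, scenario) works):

```python
# Illustrative upstream scoring: expand each quote across a scenario grid.
# The toy retention model max(0, 1.8 - sv) is an assumption for this sketch,
# not anything Price Contour computes for you.
def score_quotes(quotes, grid=(0.8, 0.9, 1.0, 1.1, 1.2)):
    rows = []
    for q in quotes:
        for idx, sv in enumerate(grid):
            retention = max(0.0, 1.8 - sv)  # toy demand: volume decays with price
            rows.append({
                "quote_id": q["quote_id"],
                "scenario_index": idx,
                "scenario_value": sv,
                "income": q["base_premium"] * sv * retention,
                "volume": retention,
            })
    return rows

rows = score_quotes([{"quote_id": "Q001", "base_premium": 100.0},
                     {"quote_id": "Q002", "base_premium": 42.0}])
# One row per (quote, scenario): 2 quotes x 5 scenario values = 10 rows
```

Feeding `pl.from_dicts(rows)` (or an equivalent long-format frame) into the optimiser is then all that remains.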


Three optimisation modes

Online optimisation

Find the optimal scenario value per individual quote. Each quote independently picks its best price point, coordinated by shared Lagrange multipliers that enforce portfolio-level constraints.

optimiser = pc.OnlineOptimiser(
    objective="income",
    constraints={
        "volume": {"min_pct": 0.90},                # sum constraint
        "loss_ratio": {                              # ratio constraint
            "numerator":   "incurred",
            "denominator": "premium",
            "max":         0.65,
        },
    },
)
result = optimiser.solve(df)
print(result.lambdas)            # {'volume': 0.147, 'loss_ratio': 1.21}
print(result.total_constraints)  # {'volume': 5400.0, 'loss_ratio': 0.6498}

Both sum and ratio constraints work in all three optimisation modes (online, ratebook, apply) and in the efficient-frontier sweep.

Ratebook optimisation

Find optimal rating factors across rating dimensions. Rather than a free scenario value per quote, the optimiser finds the best factor value for each level of each rating factor (e.g. age band, region, vehicle power), applied uniformly to all quotes sharing that level.

optimiser = pc.RatebookOptimiser(
    objective="income",
    constraints={"volume": {"min_pct": 0.90}},
    factor_columns=[["age_band"], ["region"], ["vehicle_power"]],
)

result = optimiser.solve(df, factors=factor_df)

print(result.factor_tables)
# {'age_band': {'18-25': 1.15, '26-35': 1.02, '36-50': 0.95, '51+': 0.98},
#  'region': {'London': 1.08, 'South East': 1.01, 'North': 0.93},
#  'vehicle_power': {'Low': 0.97, 'Medium': 1.0, 'High': 1.06}}

# Save to disk
result.save("parameters/")

# Convert to rating-step DataFrames
tables = result.to_rating_entries()

Live scoring with stored lambdas

Apply pre-computed Lagrange multipliers to new quotes in a single forward pass, with no iteration. Use this in production to score individual quotes using lambdas learned from a batch solve.

# Batch solve (offline)
result = optimiser.solve(df_portfolio)
lambdas = result.lambdas

# Live scoring (per-quote, no iteration)
applier = pc.ApplyOptimiser(
    lambdas=lambdas,
    objective="income",
    constraints={"volume": {"min_pct": 0.90}},
)
applier.save("config/applier.json")

# Later, in production:
applier = pc.ApplyOptimiser.load("config/applier.json")
live_result = applier.apply(df_single_quote)
optimal_scenario_value = live_result.dataframe["optimal_scenario_value"][0]

Efficient frontier

Sweep constraint thresholds to generate the Pareto frontier - the trade-off curve between your objective and constraints. Each point on the frontier is a full portfolio solve at a different constraint target.

frontier = optimiser.frontier(
    df,
    threshold_ranges={"volume": (0.85, 1.0)},
    n_points_per_dim=20,
)

# DataFrame with one row per frontier point
print(frontier.points)
# ┌──────────────────┬─────────────────┬──────────────┬───────────────┬────────────┬───────────┬─────────┬─────────────────┐
# │ threshold_volume │ total_objective │ total_volume │ lambda_volume │ iterations │ converged │ sv_mean │ sv_pct_increase │
# ╞══════════════════╪═════════════════╪══════════════╪═══════════════╪════════════╪═══════════╪═════════╪═════════════════╡
# │ 0.85             │ 1_350_102       │ 0.851        │ 0.089         │ 18         │ true      │ 1.04    │ 0.62            │
# │ 0.86             │ 1_342_891       │ 0.861        │ 0.102         │ 21         │ true      │ 1.03    │ 0.58            │
# │ ...              │ ...             │ ...          │ ...           │ ...        │ ...       │ ...     │ ...             │
# └──────────────────┴─────────────────┴──────────────┴───────────────┴────────────┴───────────┴─────────┴─────────────────┘

Adjacent points are warm-started from each other (nearest-neighbour traversal of the threshold grid), so the full frontier solves much faster than running each point independently. Each point also includes scenario value distribution statistics (sv_mean, sv_std, percentiles, sv_pct_increase/sv_pct_decrease).
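Once the sweep completes, picking an operating point is just a filter over the points table. A toy sketch with plain dicts standing in for rows of frontier.points (numbers copied from the table above; the 0.86 floor is an arbitrary business rule for illustration):

```python
# Illustrative operating-point selection from a frontier sweep.
points = [
    {"threshold_volume": 0.85, "total_objective": 1_350_102.0, "converged": True},
    {"threshold_volume": 0.86, "total_objective": 1_342_891.0, "converged": True},
    {"threshold_volume": 0.90, "total_objective": 1_284_302.5, "converged": True},
]
# Tightening the volume floor costs objective, so among acceptable points
# the loosest one wins on objective.
acceptable = [p for p in points if p["converged"] and p["threshold_volume"] >= 0.86]
chosen = max(acceptable, key=lambda p: p["total_objective"])
```

With the real FrontierResult the same selection is a Polars filter/sort over frontier.points.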

Sweeping a ratio target — declare the constraint with None so the constructor doesn't fix it, then supply the range to frontier():

optimiser = pc.OnlineOptimiser(
    objective="income",
    constraints={
        "loss_ratio": {
            "numerator":   "incurred",
            "denominator": "premium",
            "max":         None,       # frontier supplies the target
        },
    },
)
frontier = optimiser.frontier(
    df,
    threshold_ranges={"loss_ratio": (0.55, 0.75)},
    n_points_per_dim=10,
)
# points["threshold_loss_ratio"] = [0.55, 0.572, ..., 0.75]  (user units, verbatim)
# points["total_loss_ratio"]     = actual Σ incurred / Σ premium at each optimum

Mixed sweep — sweep multiple constraints at once via the cartesian product:

frontier = optimiser.frontier(
    df,
    threshold_ranges={
        "volume":     (8000, 12000),    # absolute units
        "loss_ratio": (0.55, 0.75),     # absolute ratio targets
    },
    n_points_per_dim=10,
)
# 10 × 10 = 100 frontier points

Constraints with numeric thresholds may be omitted from threshold_ranges — they are held fixed at the constructor value across the sweep. None thresholds must have a range entry.


Constraint format

Constraints are specified as a dictionary. There are two shapes:

Sum constraints apply to a single column. The dict key is the column name in your DataFrame, the value specifies direction and threshold. Use min / max for absolute thresholds and min_pct / max_pct for thresholds expressed as a fraction of baseline (the portfolio totals at scenario_value = 1.0):

constraints = {
    "volume":  {"min_pct": 0.90},     # portfolio volume >= 90% of baseline
    "premium": {"min": 1_000_000},    # absolute: portfolio premium >= 1M
    "claims":  {"max_pct": 1.05},     # portfolio claims <= 105% of baseline
}

Ratio constraints apply to a ratio of two summed columns (e.g. loss ratio = Σ incurred / Σ premium). The dict key is a display label (does NOT need to be a column); numerator and denominator name the columns:

constraints = {
    "loss_ratio": {
        "numerator":   "incurred",
        "denominator": "premium",
        "max":         0.65,           # portfolio loss ratio <= 0.65
    },
    "combined_ratio": {
        "numerator":   "claims_plus_expenses",
        "denominator": "premium",
        "max_pct":     1.10,           # <= 110% of baseline combined ratio
    },
}

Internally, ratio constraints are linearised as Σ (num − L·denom) ≤ 0 and handed to the same Lagrangian solver. If the baseline denominator total is zero, _pct modes raise ValueError (the baseline ratio is undefined). If the denominator total at the chosen optimum is zero, the ratio reported in total_constraints[label] and summary() is nan, a sentinel: the division is undefined, not silently zero.
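A quick numeric check of that equivalence (toy figures; it holds whenever the total denominator is positive):

```python
# Toy check: sum(num)/sum(denom) <= L  <=>  sum(num - L*denom) <= 0
# when the total denominator is positive (all figures illustrative).
incurred = [60.0, 70.0, 80.0]      # per-quote numerators
premium = [100.0, 110.0, 120.0]    # per-quote denominators

for L in (0.65, 0.60):             # one feasible target, one infeasible
    ratio_ok = sum(incurred) / sum(premium) <= L      # 210/330 ~ 0.636
    linear_ok = sum(n - L * d for n, d in zip(incurred, premium)) <= 0.0
    assert ratio_ok == linear_ok
```

The linearised form is a plain sum over quotes, which is what lets ratio constraints reuse the per-quote decomposition.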

None thresholds mark frontier-only constraints — the threshold is supplied by the sweep range:

constraints = {
    "loss_ratio": {
        "numerator":   "incurred",
        "denominator": "premium",
        "max":         None,           # frontier supplies the target
    },
}

frontier = optimiser.frontier(
    df,
    threshold_ranges={"loss_ratio": (0.55, 0.75)},
    n_points_per_dim=10,
)

solve() rejects None thresholds; frontier() requires a threshold_ranges entry for every None constraint. Numeric-threshold constraints are optional in threshold_ranges — omitted ones are held fixed at their constructor value across the sweep.

points["threshold_<name>"] reports the user-supplied range value verbatim (absolute units for min/max, fractions of baseline for min_pct/max_pct); points["total_<name>"] reports the actual aggregate at the optimum (the actual ratio for ratio constraints).


Direct Parquet loading

For large datasets, build the internal grid directly from a Parquet file without materialising a DataFrame in Python memory:

grid = pc.build_grid_from_parquet(
    "scored_quotes.parquet",
    constraint_columns=["volume", "loss_ratio"],
    objective="income",
)
result = optimiser.solve(grid)

Incremental grid building

For large datasets that don't fit in memory at once, build the internal grid incrementally:

builder = pc.QuoteGridBuilder(
    ["volume", "loss_ratio"],
    quote_id="quote_id",
    scenario_index="scenario_index",
    scenario_value_col="scenario_value",
    objective="income",
)

for chunk in data_source.iter_chunks(100_000):
    builder.append(chunk)

grid = builder.build()
result = optimiser.solve(grid)

MLflow integration

Both OnlineOptimiser and RatebookOptimiser produce MLflow-ready summaries:

result = optimiser.solve(df)
summary = optimiser.summary(result)

import mlflow
mlflow.log_params(summary["params"])
mlflow.log_metrics(summary["metrics"])
mlflow.log_dict(summary["artifacts"]["lambdas"], "lambdas.json")
mlflow.log_dict(summary["artifacts"]["config"], "config.json")

How it works

The algorithm

Price Contour solves the constrained optimisation problem:

Maximise    sum_i  objective(quote_i, scenario_value_i)
Subject to  sum_i  constraint_k(quote_i, scenario_value_i) >= threshold_k   for all k
            scenario_value_i in {discrete grid}

This is a combinatorial problem (each quote picks from M discrete scenario values). Lagrangian dual decomposition relaxes the coupling constraints into the objective using dual variables (lambdas), decomposing it into N independent per-quote subproblems:

For fixed lambdas:
    Each quote picks:  argmax_m [ objective(i, m) + sum_k lambda_k * constraint_k(i, m) ]

These are independent and embarrassingly parallel.

The outer loop updates lambdas via the subgradient method with adaptive step sizes, iterating until all constraints are satisfied and lambdas converge.
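The inner argmax and outer subgradient loop can be sketched in pure Python on a toy portfolio (three quotes, one volume constraint; the data, step schedule, and best-feasible bookkeeping are illustrative, not the library's internals):

```python
import math

# Toy grid: 3 quotes, scenario values [0.9, 1.0, 1.1].
# objective: income_i(sv) = base_i * sv; constraint: total volume >= 2.85,
# with volume(sv) = 2.0 - sv per quote (all numbers illustrative).
scenario_values = [0.9, 1.0, 1.1]
bases = [10.0, 20.0, 30.0]
threshold = 2.85

def income(i, m):
    return bases[i] * scenario_values[m]

def volume(m):
    return 2.0 - scenario_values[m]

lam = 0.0      # dual variable (lambda) for the volume constraint
best = None    # best feasible (total_income, picks) seen so far
for t in range(200):
    # Inner step: each quote independently maximises its own Lagrangian.
    picks = [max(range(len(scenario_values)),
                 key=lambda m, i=i: income(i, m) + lam * volume(m))
             for i in range(len(bases))]
    total_vol = sum(volume(m) for m in picks)
    total_inc = sum(income(i, m) for i, m in enumerate(picks))
    if total_vol >= threshold and (best is None or total_inc > best[0]):
        best = (total_inc, picks)
    # Outer step: subgradient update on lambda, projected to lambda >= 0.
    step = 100.0 / math.sqrt(t + 1)
    lam = max(0.0, lam + step * (threshold - total_vol))
```

With heterogeneous quotes the dual iterates bracket the optimum; tracking the best feasible primal solution (as above) is one simple way to cope with the oscillation that the real solver damps via lambda averaging.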

Performance

The Rust core uses:

  • Quote-major memory layout - each quote's M scenario values are contiguous, optimising the per-quote argmax inner loop for cache locality
  • Rayon parallelism - the argmax across quotes is parallelised with a grain size of 4096 quotes
  • Adaptive step scaling - per-constraint scale factors normalise for differing magnitudes, so the algorithm works equally well for constraints ranging from 0.1 to 1,000,000
  • Lambda averaging - smooths the oscillations inherent in discrete Lagrangian relaxation where all quotes can flip simultaneously
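The last bullet can be illustrated in a few lines (toy iterates; the real solver's averaging schedule may differ):

```python
# Lambda averaging sketch: a running mean of the raw subgradient iterates
# damps the oscillation when many quotes flip between steps simultaneously.
lambdas = [0.0, 0.30, 0.05, 0.25, 0.10, 0.20]  # toy raw dual iterates
avg = []
s = 0.0
for t, lam in enumerate(lambdas, start=1):
    s += lam
    avg.append(s / t)
# The averaged sequence settles near the centre of the oscillation (~0.15)
# while the raw iterates keep bouncing between 0.05 and 0.30.
```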

Ratebook mode

For ratebook optimisation, coordinate descent iterates over rating factors. For each factor, a grouped Lagrangian solve finds the best discrete factor value per group (e.g. per age band); each quote's scenario value is the product of all its factor values times a per-quote residual. The inner grouped solve reuses the same Lagrangian machinery, remapping to the nearest grid point.
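The factor composition can be sketched as follows (the shapes mirror the factor_tables output shown earlier; the values and residual default are illustrative):

```python
# Toy composition of a quote's scenario value from rating-factor tables.
factor_tables = {
    "age_band": {"18-25": 1.15, "26-35": 1.02},
    "region": {"London": 1.08, "North": 0.93},
}

def quote_scenario_value(quote, residual=1.0):
    # Product of the quote's factor values, times a per-quote residual.
    sv = residual
    for factor, table in factor_tables.items():
        sv *= table[quote[factor]]
    return sv

sv = quote_scenario_value({"age_band": "18-25", "region": "North"})
# 1.15 * 0.93 = 1.0695
```

Because every quote in a group shares the same factor value, changing one level moves all of its quotes together, which is what the grouped inner solve optimises over.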


Architecture

price-contour/
├── crates/
│   ├── price-contour-core/        # Pure Rust: algorithms, data structures, solver
│   │   └── src/
│   │       ├── data.rs            # QuoteGrid, SolverConfig, SolveResult, GroupMapping
│   │       ├── solver/
│   │       │   ├── online.rs      # Lagrangian dual decomposition
│   │       │   ├── grouped.rs     # Grouped solve (ratebook inner loop)
│   │       │   ├── argmax.rs      # Per-quote Lagrangian argmax (parallel)
│   │       │   ├── lambda.rs      # Subgradient lambda updates
│   │       │   └── apply.rs       # Fixed-lambda forward pass
│   │       ├── frontier.rs        # Efficient frontier sweeping
│   │       ├── constants.rs       # Solver defaults
│   │       └── error.rs           # Error types
│   └── price-contour/             # PyO3 bindings (thin wrappers)
│       └── src/
│           ├── solver_py.rs       # DataFrame ingestion + solve
│           ├── grouped_py.rs      # Grouped solve bindings
│           ├── apply_py.rs        # Apply bindings
│           ├── frontier_py.rs     # Frontier bindings
│           ├── builder_py.rs      # QuoteGridBuilder bindings
│           ├── grid_py.rs         # QuoteGrid bindings
│           └── parquet_grid_py.rs # Parquet → QuoteGrid loader
├── python/
│   └── price_contour/
│       ├── solver.py              # OnlineOptimiser, ratio linearisation, validation
│       ├── ratebook.py            # RatebookOptimiser + RatebookResult
│       ├── apply.py               # ApplyOptimiser + apply_from_grid
│       ├── frontier.py            # FrontierResult helpers + frontier_summary
│       ├── builder.py             # QuoteGridBuilder wrapper
│       ├── _ratio_results.py      # Shared ratio reporting (actual ratios + column stitching)
│       └── _frontier_helpers.py   # Shared frontier orchestrator (used by online + ratebook)
├── tests/
│   └── python/                    # Integration tests
├── notebooks/                     # Demo notebooks
├── docs/                          # Design documentation
└── scripts/                       # Utility scripts

The pure-Rust core (price-contour-core) has no Python dependencies and can be tested independently with cargo test. The PyO3 crate (price-contour) is a thin binding layer that converts between Polars DataFrames and the internal QuoteGrid representation with zero-copy where possible.


Development

# Clone
git clone https://github.com/PricingFrontier/price-contour.git
cd price-contour

# Install in development mode (compiles Rust, links Python)
uv sync --all-groups
maturin develop

# Run Rust tests
cargo test

# Run Python tests
pytest

# Rebuild after Rust changes
maturin develop

Requirements: Rust toolchain (stable), Python 3.10+, maturin.


API reference

OnlineOptimiser

Method Description
solve(df_or_grid, *, lambdas=None) Run full optimisation. Returns SolveResult. Ratio constraints require a DataFrame (the linearisation needs raw numerator/denominator columns); a pre-built QuoteGrid with ratio constraints raises ValueError.
frontier(df_or_grid, *, threshold_ranges, n_points_per_dim=10, initial_lambdas=None) Sweep the efficient frontier. Returns FrontierResult. Numeric thresholds are optional in threshold_ranges (held fixed if omitted); None thresholds require a range.
summary(result) Package result into MLflow-ready params, metrics, artifacts dicts.
config_dict() Serialisable solver configuration.

RatebookOptimiser

Method Description
solve(df_or_grid, factors, *, factor_columns=None, lambdas=None) Run ratebook optimisation via coordinate descent. Returns RatebookResult.
frontier(df_or_grid, factors, *, threshold_ranges, n_points_per_dim=5, factor_columns=None, initial_lambdas=None) Sweep the efficient frontier via coordinate descent at each threshold. Returns FrontierResult.
summary(result) Package result into MLflow-ready dicts.

ApplyOptimiser

Method Description
apply(df) Single-pass scoring with fixed lambdas. Returns ApplyResult. For ratio constraints, min_pct/max_pct resolve L = pct × baseline_LR from the apply-time DataFrame (live-scoring contract), not the solve-time baseline.
save(path) Save config + lambdas to JSON. Ratio specs round-trip verbatim.
ApplyOptimiser.load(path) Load from saved JSON. Rejects unknown keys.

QuoteGridBuilder

Method Description
append(df) Add a chunk of quotes.
build() Finalise and return a QuoteGrid.

SolveResult

Property Type Description
converged bool Whether the solver converged.
iterations int Number of iterations taken.
lambdas dict[str, float] Final Lagrange multipliers (shadow prices) per constraint.
total_objective float Portfolio-level objective at optimal solution.
total_constraints dict[str, float] Portfolio-level constraint totals.
baseline_objective float Objective at scenario_value = 1.0.
baseline_constraints dict[str, float] Constraints at scenario_value = 1.0.
dataframe pl.DataFrame Per-quote results with optimal scenario values.
history list[dict] | None Per-iteration convergence records (if record_history=True).
n_quotes int Number of quotes in the grid.
n_steps int Number of scenario value steps.
scenario_values list[float] The scenario value grid.
grid QuoteGrid The internal grid (reusable for subsequent solves or apply).

ApplyResult

Property Type Description
total_objective float Portfolio-level objective.
total_constraints dict[str, float] Portfolio-level constraint totals.
baseline_objective float Objective at scenario_value = 1.0.
baseline_constraints dict[str, float] Constraints at scenario_value = 1.0.
lambdas dict[str, float] Applied Lagrange multipliers.
dataframe pl.DataFrame Per-quote results with optimal scenario values.

FrontierResult

Property Type Description
points pl.DataFrame One row per frontier point with threshold_*, total_objective, total_*, lambda_*, iterations, converged, and scenario value statistics (sv_mean, sv_std, sv_min, sv_p5, sv_p95, sv_max, sv_pct_increase, sv_pct_decrease).
n_points int Number of frontier points.

RatebookResult

Property Type Description
factor_tables dict[str, dict[str, float]] Factor name to level-value mapping.
lambdas dict[str, float] Final Lagrange multipliers.
total_objective float Portfolio-level objective at optimal solution.
total_constraints dict[str, float] Portfolio-level constraint totals.
baseline_objective float Objective at scenario_value = 1.0.
baseline_constraints dict[str, float] Constraints at scenario_value = 1.0.
converged bool Whether coordinate descent converged.
cd_iterations int Coordinate descent iterations.
clamp_rate float Fraction of remappings that hit a grid boundary.
per_factor_results list[GroupedSolveResult] Per-factor inner solve results.
save(path) Save factor tables to a directory (one JSON per factor).
to_rating_entries() dict[str, pl.DataFrame] Convert to rating-step DataFrames.

Utility functions

Function Description
build_grid_from_parquet(path, *, constraint_columns, ...) Build a QuoteGrid directly from a Parquet file without materialising a DataFrame in Python. Sum constraints only — ratio constraints require a DataFrame.
apply_from_grid(grid, lambdas, constraints) Single-pass Lagrangian apply on an existing QuoteGrid. Returns ApplyResult. Sum constraints only; ratio constraints raise ValueError (use ApplyOptimiser.apply(df) on a DataFrame instead — the grid path can't carry numerator/denominator columns for linearisation).
frontier_summary(frontier_result, selected_index) Package a frontier result into MLflow-ready params, metrics, artifacts dicts.

License

Price Contour is licensed under the GNU Affero General Public License v3.0.


