Price Contour

High-performance insurance price optimisation via Lagrangian dual decomposition.

Price Contour finds optimal price scenario values across a portfolio of insurance risks subject to business constraints. Give it a scored dataset with objective and constraint values at discrete price points, and it returns the scenario value per quote that maximises your objective while respecting every constraint.

The core algorithm is Lagrangian dual decomposition, implemented in Rust for speed and exposed to Python via zero-copy Polars DataFrames. A portfolio of 1M+ risks solves in seconds.


Quick start

uv add price-contour

import polars as pl
import price_contour as pc

# Long-format DataFrame: one row per (quote, price_scenario)
# with pre-computed objective and constraint values
df = pl.read_parquet("scored_quotes.parquet")

optimiser = pc.OnlineOptimiser(
    objective="income",
    constraints={"volume": {"min": 0.90}},  # retain at least 90% of baseline volume
    quote_id="quote_id",
    scenario_index="scenario_index",
    scenario_value="scenario_value",
)

result = optimiser.solve(df)

print(result.converged)        # True
print(result.iterations)       # 23
print(result.lambdas)          # {'volume': 0.147}
print(result.total_objective)  # 1_284_302.5

# Per-quote optimal scenario values as a Polars DataFrame
out = result.dataframe
print(out.head())
# ┌──────────┬──────────────┬────────────────────────┬────────────────┬────────────────┐
# │ quote_id │ optimal_step │ optimal_scenario_value │ optimal_income │ optimal_volume │
# ╞══════════╪══════════════╪════════════════════════╪════════════════╪════════════════╡
# │ Q001     │ 14           │ 1.07                   │ 42.30          │ 0.82           │
# │ Q002     │ 11           │ 0.98                   │ 18.55          │ 0.91           │
# └──────────┴──────────────┴────────────────────────┴────────────────┴────────────────┘

What it does

Price Contour operates on pre-computed scenario data. It does not fit models or generate demand curves. Upstream, your pricing pipeline scores every quote at a grid of price scenario values (e.g. 0.8, 0.85, 0.9, ..., 1.2) and computes what the expected income, volume, loss ratio, etc. would be at each point. Price Contour then selects the optimal scenario value per quote across the portfolio.

The input is a long-format Polars DataFrame:

| quote_id | scenario_index | scenario_value | income | volume | loss_ratio |
|----------|----------------|----------------|--------|--------|------------|
| Q001     | 0              | 0.80           | 85.2   | 0.95   | 0.62       |
| Q001     | 1              | 0.90           | 92.1   | 0.88   | 0.59       |
| Q001     | 2              | 1.00           | 100.0  | 0.80   | 0.60       |
| Q002     | 0              | 0.80           | 42.0   | 0.97   | 0.58       |
| ...      | ...            | ...            | ...    | ...    | ...        |

The output is one optimal scenario value per quote, chosen to maximise portfolio-level income while keeping portfolio-level volume above 90% of baseline (or whatever constraints you set).
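
To make the input shape concrete, here is a sketch that builds a toy version of it in Polars. The retention curve and premium figures are invented purely for illustration; a real pipeline would produce these from its demand and pricing models.

import polars as pl

# Toy upstream scoring step: two quotes, five scenario values,
# a linear retention curve, and income = premium x scenario x retention.
scenario_values = [0.80, 0.90, 1.00, 1.10, 1.20]

rows = []
for quote_id, base_premium in [("Q001", 100.0), ("Q002", 55.0)]:
    for idx, sv in enumerate(scenario_values):
        retention = max(0.0, 1.0 - (sv - 0.80))  # illustrative demand model
        rows.append({
            "quote_id": quote_id,
            "scenario_index": idx,
            "scenario_value": sv,
            "income": base_premium * sv * retention,
            "volume": retention,
        })

df = pl.DataFrame(rows)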


Three optimisation modes

Online optimisation

Find the optimal scenario value per individual quote. Each quote independently picks its best price point, coordinated by shared Lagrange multipliers that enforce portfolio-level constraints.

optimiser = pc.OnlineOptimiser(
    objective="income",
    constraints={"volume": {"min": 0.90}},
)
result = optimiser.solve(df)

Ratebook optimisation

Find optimal rating factors across rating dimensions. Instead of individual scenario values, find the best factor value for each level of each rating factor (e.g. age band, region, vehicle power), applied uniformly to all quotes sharing that level.

optimiser = pc.RatebookOptimiser(
    objective="income",
    constraints={"volume": {"min": 0.90}},
    factor_columns=[["age_band"], ["region"], ["vehicle_power"]],
)

result = optimiser.solve(df, factors=factor_df)

print(result.factor_tables)
# {'age_band': {'18-25': 1.15, '26-35': 1.02, '36-50': 0.95, '51+': 0.98},
#  'region': {'London': 1.08, 'South East': 1.01, 'North': 0.93},
#  'vehicle_power': {'Low': 0.97, 'Medium': 1.0, 'High': 1.06}}

# Save to disk
result.save("parameters/")

# Convert to rating-step DataFrames
tables = result.to_rating_entries()
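
Ignoring the per-quote residual described under "Ratebook mode" below, applying a factor table by hand is just a product over the quote's levels. A sketch, assuming the factor_tables mapping shown above:

factor_tables = result.factor_tables

quote = {"age_band": "26-35", "region": "North", "vehicle_power": "Medium"}

# Quote-level scenario value is the product of its factor values
# (the per-quote residual is omitted here for simplicity).
scenario_value = 1.0
for factor, level in quote.items():
    scenario_value *= factor_tables[factor][level]

print(scenario_value)  # 1.02 * 0.93 * 1.0 = 0.9486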

Live scoring with stored lambdas

Apply pre-computed Lagrange multipliers to new quotes in a single forward pass, with no iteration. Use this in production to score individual quotes using lambdas learned from a batch solve.

# Batch solve (offline)
result = optimiser.solve(df_portfolio)
lambdas = result.lambdas

# Live scoring (per-quote, no iteration)
applier = pc.ApplyOptimiser(
    lambdas=lambdas,
    objective="income",
    constraints={"volume": {"min": 0.90}},
)
applier.save("config/applier.json")

# Later, in production:
applier = pc.ApplyOptimiser.load("config/applier.json")
live_result = applier.apply(df_single_quote)
optimal_scenario_value = live_result.dataframe["optimal_scenario_value"][0]

Efficient frontier

Sweep constraint thresholds to generate the Pareto frontier - the trade-off curve between your objective and constraints. Each point on the frontier is a full portfolio solve at a different constraint target.

frontier = optimiser.frontier(
    df,
    threshold_ranges={"volume": (0.85, 1.0)},
    n_points_per_dim=20,
)

# DataFrame with one row per frontier point
print(frontier.points)
# ┌──────────────────┬─────────────────┬──────────────┬───────────────┬────────────┬───────────┬─────────┬─────────────────┐
# │ threshold_volume │ total_objective │ total_volume │ lambda_volume │ iterations │ converged │ sv_mean │ sv_pct_increase │
# ╞══════════════════╪═════════════════╪══════════════╪═══════════════╪════════════╪═══════════╪═════════╪═════════════════╡
# │ 0.85             │ 1_350_102       │ 0.851        │ 0.089         │ 18         │ true      │ 1.04    │ 0.62            │
# │ 0.86             │ 1_342_891       │ 0.861        │ 0.102         │ 21         │ true      │ 1.03    │ 0.58            │
# │ ...              │ ...             │ ...          │ ...           │ ...        │ ...       │ ...     │ ...             │
# └──────────────────┴─────────────────┴──────────────┴───────────────┴────────────┴───────────┴─────────┴─────────────────┘

Adjacent points are warm-started from each other (nearest-neighbour traversal of the threshold grid), so the full frontier solves much faster than running each point independently. Each point also includes scenario value distribution statistics (sv_mean, sv_std, percentiles, sv_pct_increase/sv_pct_decrease).
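
A common next step is to choose an operating point from the sweep. A minimal sketch, assuming the points columns shown above:

import polars as pl

# Converged points that keep at least 90% of baseline volume,
# ranked by portfolio objective.
candidates = frontier.points.filter(
    pl.col("converged") & (pl.col("total_volume") >= 0.90)
)
best = candidates.sort("total_objective", descending=True).head(1)
print(best)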


Constraint format

Constraints are specified as a dictionary. Keys are column names in your DataFrame; each value specifies the direction and threshold, interpreted relative to the baseline (the portfolio totals at scenario_value = 1.0):

constraints = {
    "volume": {"min": 0.90},            # portfolio volume >= 90% of baseline
    "loss_ratio": {"max": 1.05},        # portfolio loss ratio <= 105% of baseline
    "premium": {"min_abs": 1_000_000},  # absolute: portfolio premium >= 1M
}
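
Because relative thresholds are interpreted against the baseline, it can be worth computing the baseline totals yourself when choosing them. A minimal sketch, assuming the long-format input described above:

import polars as pl

df = pl.read_parquet("scored_quotes.parquet")

# Portfolio totals at the baseline price point (scenario_value == 1.0).
baseline = df.filter(pl.col("scenario_value") == 1.0).select(
    pl.col("income").sum().alias("baseline_income"),
    pl.col("volume").sum().alias("baseline_volume"),
)
print(baseline)

# {"volume": {"min": 0.90}} then means:
#   total optimised volume >= 0.90 * baseline_volume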

Direct Parquet loading

For large datasets, build the internal grid directly from a Parquet file without materialising a DataFrame in Python memory:

grid = pc.build_grid_from_parquet(
    "scored_quotes.parquet",
    constraint_columns=["volume", "loss_ratio"],
    objective="income",
)
result = optimiser.solve(grid)

Incremental grid building

For large datasets that don't fit in memory at once, build the internal grid incrementally:

builder = pc.QuoteGridBuilder(
    ["volume", "loss_ratio"],
    quote_id="quote_id",
    scenario_index="scenario_index",
    scenario_value_col="scenario_value",
    objective="income",
)

for chunk in data_source.iter_chunks(100_000):
    builder.append(chunk)

grid = builder.build()
result = optimiser.solve(grid)

MLflow integration

Both OnlineOptimiser and RatebookOptimiser produce MLflow-ready summaries:

result = optimiser.solve(df)
summary = optimiser.summary(result)

import mlflow
mlflow.log_params(summary["params"])
mlflow.log_metrics(summary["metrics"])
mlflow.log_dict(summary["artifacts"]["lambdas"], "lambdas.json")
mlflow.log_dict(summary["artifacts"]["config"], "config.json")

How it works

The algorithm

Price Contour solves the constrained optimisation problem:

Maximise    sum_i  objective(quote_i, scenario_value_i)
Subject to  sum_i  constraint_k(quote_i, scenario_value_i) >= threshold_k   for all k
            scenario_value_i in {discrete grid}

This is a combinatorial problem (each quote picks from M discrete scenario values). Lagrangian dual decomposition relaxes the coupling constraints into the objective using dual variables (lambdas), decomposing it into N independent per-quote subproblems:

For fixed lambdas:
    Each quote picks:  argmax_m [ objective(i, m) + sum_k lambda_k * constraint_k(i, m) ]

These are independent and embarrassingly parallel.

The outer loop updates lambdas via the subgradient method with adaptive step sizes, iterating until all constraints are satisfied and lambdas converge.
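
In pure-Python terms, one outer iteration looks roughly like the sketch below. This is an illustration of the idea in NumPy, not the Rust implementation; shapes, signs, and the step-size rule are simplified.

import numpy as np

def dual_iteration(objective, constraints, lambdas, thresholds, step):
    """One subgradient iteration for min-style (>=) constraints.

    objective:   (n_quotes, n_steps) array of objective values
    constraints: dict name -> (n_quotes, n_steps) constraint values
    lambdas:     dict name -> current multiplier (kept >= 0)
    thresholds:  dict name -> portfolio-level minimum
    """
    # Relax the constraints into the objective with the current lambdas.
    lagrangian = objective.copy()
    for name, lam in lambdas.items():
        lagrangian += lam * constraints[name]

    # Decomposed step: every quote independently picks its best column.
    best = lagrangian.argmax(axis=1)
    rows = np.arange(objective.shape[0])

    # Subgradient update: raise lambda where a constraint is violated,
    # lower it (never below zero) where there is slack.
    new_lambdas = {}
    for name, lam in lambdas.items():
        total = constraints[name][rows, best].sum()
        violation = thresholds[name] - total  # > 0 means violated
        new_lambdas[name] = max(0.0, lam + step * violation)
    return best, new_lambdas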

Performance

The Rust core uses:

  • Quote-major memory layout - each quote's M scenario values are contiguous, optimising the per-quote argmax inner loop for cache locality
  • Rayon parallelism - the argmax across quotes is parallelised within chunks of 4096 quotes
  • Chunked processing - large portfolios are processed in chunks (default 500K quotes) to bound memory usage
  • Adaptive step scaling - per-constraint scale factors normalise for differing magnitudes, so the algorithm works equally well for constraints ranging from 0.1 to 1,000,000
  • Lambda averaging - smooths the oscillations inherent in discrete Lagrangian relaxation where all quotes can flip simultaneously
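
Of these, lambda averaging is simple enough to show inline (an illustrative sketch; the solver's actual window and weighting may differ):

# Average recent lambda iterates instead of using the raw subgradient
# sequence, which can oscillate when many quotes flip at once.
def averaged(history, new_lambdas, window=10):
    history.append(new_lambdas)
    recent = history[-window:]
    return {
        name: sum(step[name] for step in recent) / len(recent)
        for name in new_lambdas
    }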

Ratebook mode

For ratebook optimisation, coordinate descent iterates over rating factors. For each factor, a grouped Lagrangian solve finds the best discrete factor value per group (e.g. per age band), with the individual quote scenario value computed as the product of all factor values times a per-quote residual. The inner grouped solve uses the same Lagrangian machinery with remapping to the nearest grid point.
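
As a skeleton, the coordinate descent loop looks roughly like this (illustrative only; solve_one_factor stands in for the library's grouped Lagrangian solve):

def coordinate_descent(factor_names, solve_one_factor, max_sweeps=20):
    # Start from empty factor tables and sweep until nothing changes.
    tables = {name: None for name in factor_names}
    for _ in range(max_sweeps):
        changed = False
        for name in factor_names:
            # Re-solve this factor with every other factor held fixed.
            new_table = solve_one_factor(name, tables)
            if new_table != tables[name]:
                tables[name] = new_table
                changed = True
        if not changed:
            break
    return tables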


Architecture

price-contour/
├── crates/
│   ├── price-contour-core/        # Pure Rust: algorithms, data structures, solver
│   │   └── src/
│   │       ├── data.rs            # QuoteGrid, SolverConfig, SolveResult, GroupMapping
│   │       ├── solver/
│   │       │   ├── online.rs      # Lagrangian dual decomposition
│   │       │   ├── grouped.rs     # Grouped solve (ratebook inner loop)
│   │       │   ├── argmax.rs      # Per-quote Lagrangian argmax (parallel)
│   │       │   ├── lambda.rs      # Subgradient lambda updates
│   │       │   └── apply.rs       # Fixed-lambda forward pass
│   │       ├── frontier.rs        # Efficient frontier sweeping
│   │       ├── constants.rs       # Solver defaults
│   │       └── error.rs           # Error types
│   └── price-contour/             # PyO3 bindings (thin wrappers)
│       └── src/
│           ├── solver_py.rs       # DataFrame ingestion + solve
│           ├── grouped_py.rs      # Grouped solve bindings
│           ├── apply_py.rs        # Apply bindings
│           ├── frontier_py.rs     # Frontier bindings
│           ├── builder_py.rs      # QuoteGridBuilder bindings
│           ├── grid_py.rs         # QuoteGrid bindings
│           └── parquet_grid_py.rs # Parquet → QuoteGrid loader
├── python/
│   └── price_contour/
│       ├── solver.py              # OnlineOptimiser
│       ├── ratebook.py            # RatebookOptimiser + RatebookResult
│       ├── apply.py               # ApplyOptimiser + apply_from_grid
│       ├── frontier.py            # FrontierResult helpers + frontier_summary
│       └── builder.py             # QuoteGridBuilder wrapper
├── tests/
│   └── python/                    # Integration tests
├── notebooks/                     # Demo notebooks
├── docs/                          # Design documentation
└── scripts/                       # Utility scripts

The pure-Rust core (price-contour-core) has no Python dependencies and can be tested independently with cargo test. The PyO3 crate (price-contour) is a thin binding layer that converts between Polars DataFrames and the internal QuoteGrid representation with zero-copy where possible.


Development

# Clone
git clone https://github.com/PricingFrontier/price-contour.git
cd price-contour

# Install in development mode (compiles Rust, links Python)
uv sync --all-groups
maturin develop

# Run Rust tests
cargo test

# Run Python tests
pytest

# Rebuild after Rust changes
maturin develop

Requirements: Rust toolchain (stable), Python 3.10+, maturin.


API reference

OnlineOptimiser

| Method | Description |
|--------|-------------|
| solve(df_or_grid, *, lambdas=None) | Run full optimisation. Returns SolveResult. |
| frontier(df_or_grid, *, threshold_ranges, n_points_per_dim=10, initial_lambdas=None) | Sweep the efficient frontier. Returns FrontierResult. |
| summary(result) | Package result into MLflow-ready params, metrics, artifacts dicts. |
| config_dict() | Serialisable solver configuration. |

RatebookOptimiser

| Method | Description |
|--------|-------------|
| solve(df_or_grid, factors, *, factor_columns=None, lambdas=None) | Run ratebook optimisation via coordinate descent. Returns RatebookResult. |
| frontier(df_or_grid, factors, *, threshold_ranges, n_points_per_dim=5, factor_columns=None, initial_lambdas=None) | Sweep the efficient frontier via coordinate descent at each threshold. Returns FrontierResult. |
| summary(result) | Package result into MLflow-ready dicts. |

ApplyOptimiser

| Method | Description |
|--------|-------------|
| apply(df) | Single-pass scoring with fixed lambdas. Returns ApplyResult. |
| save(path) | Save config + lambdas to JSON. |
| ApplyOptimiser.load(path) | Load from saved JSON. |

QuoteGridBuilder

| Method | Description |
|--------|-------------|
| append(df) | Add a chunk of quotes. |
| build() | Finalise and return a QuoteGrid. |

SolveResult

| Property | Type | Description |
|----------|------|-------------|
| converged | bool | Whether the solver converged. |
| iterations | int | Number of iterations taken. |
| lambdas | dict[str, float] | Final Lagrange multipliers (shadow prices) per constraint. |
| total_objective | float | Portfolio-level objective at the optimal solution. |
| total_constraints | dict[str, float] | Portfolio-level constraint totals. |
| baseline_objective | float | Objective at scenario_value = 1.0. |
| baseline_constraints | dict[str, float] | Constraints at scenario_value = 1.0. |
| dataframe | pl.DataFrame | Per-quote results with optimal scenario values. |
| history | list[dict] \| None | Per-iteration convergence records (if record_history=True). |
| n_quotes | int | Number of quotes in the grid. |
| n_steps | int | Number of scenario value steps. |
| scenario_values | list[float] | The scenario value grid. |
| grid | QuoteGrid | The internal grid (reusable for subsequent solves or apply). |
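
The grid property makes follow-up solves cheap, since the grid can be reused without rebuilding it from the DataFrame. A sketch, re-solving with a tighter threshold and warm-started lambdas:

import price_contour as pc

tighter = pc.OnlineOptimiser(
    objective="income",
    constraints={"volume": {"min": 0.95}},
)
result2 = tighter.solve(result.grid, lambdas=result.lambdas)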

ApplyResult

| Property | Type | Description |
|----------|------|-------------|
| total_objective | float | Portfolio-level objective. |
| total_constraints | dict[str, float] | Portfolio-level constraint totals. |
| baseline_objective | float | Objective at scenario_value = 1.0. |
| baseline_constraints | dict[str, float] | Constraints at scenario_value = 1.0. |
| lambdas | dict[str, float] | Applied Lagrange multipliers. |
| dataframe | pl.DataFrame | Per-quote results with optimal scenario values. |

FrontierResult

| Property | Type | Description |
|----------|------|-------------|
| points | pl.DataFrame | One row per frontier point with threshold_*, total_objective, total_*, lambda_*, iterations, converged, and scenario value statistics (sv_mean, sv_std, sv_min, sv_p5, sv_p95, sv_max, sv_pct_increase, sv_pct_decrease). |
| n_points | int | Number of frontier points. |

RatebookResult

| Property / method | Type | Description |
|-------------------|------|-------------|
| factor_tables | dict[str, dict[str, float]] | Factor name to level-value mapping. |
| lambdas | dict[str, float] | Final Lagrange multipliers. |
| total_objective | float | Portfolio-level objective at the optimal solution. |
| total_constraints | dict[str, float] | Portfolio-level constraint totals. |
| baseline_objective | float | Objective at scenario_value = 1.0. |
| baseline_constraints | dict[str, float] | Constraints at scenario_value = 1.0. |
| converged | bool | Whether coordinate descent converged. |
| cd_iterations | int | Coordinate descent iterations. |
| clamp_rate | float | Fraction of remappings that hit a grid boundary. |
| per_factor_results | list[GroupedSolveResult] | Per-factor inner solve results. |
| save(path) | | Save factor tables to a directory (one JSON per factor). |
| to_rating_entries() | dict[str, pl.DataFrame] | Convert to rating-step DataFrames. |

Utility functions

| Function | Description |
|----------|-------------|
| build_grid_from_parquet(path, *, constraint_columns, ...) | Build a QuoteGrid directly from a Parquet file without materialising a DataFrame in Python. |
| apply_from_grid(grid, lambdas, constraints, *, chunk_size=500_000) | Single-pass Lagrangian apply on an existing QuoteGrid. Returns ApplyResult. |
| frontier_summary(frontier_result, selected_index) | Package a frontier result into MLflow-ready params, metrics, artifacts dicts. |
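
Together these support a batch pipeline that never materialises the scored DataFrame in Python. A sketch using the signatures above, with the lambdas taken from any earlier solve:

import price_contour as pc

grid = pc.build_grid_from_parquet(
    "scored_quotes.parquet",
    constraint_columns=["volume"],
    objective="income",
)
apply_result = pc.apply_from_grid(
    grid,
    lambdas={"volume": 0.147},
    constraints={"volume": {"min": 0.90}},
)
print(apply_result.total_objective)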

License

Price Contour is licensed under the GNU Affero General Public License v3.0.
