
A GPU-accelerated gradient boosting library using Conditional Inference Trees.

Project description

CTBoost

CTBoost is a gradient boosting library built around Conditional Inference Trees, with a native C++17 core, Python bindings via pybind11, optional CUDA support for source builds, and an optional scikit-learn style API.

The current codebase supports end-to-end training and prediction for regression, classification, grouped ranking, and survival. On top of that it provides pandas and SciPy sparse ingestion without dense expansion, row weights and class-imbalance controls, explicit missing-value handling, configurable validation metrics, stable JSON model persistence, staged prediction, warm-start continuation, a native C++ feature pipeline for categorical/text/embedding transforms with thin Python wrappers, and a built-in cross-validation helper.

Current Status

  • Language mix: Python + C++17, with optional CUDA
  • Python support: 3.8 through 3.14
  • Packaging: scikit-build-core
  • CI/CD: GitHub Actions for CMake validation and cibuildwheel release builds
  • Repository version: 0.1.25
  • Status: actively evolving native + Python package

What Works Today

  • Native gradient boosting backend exposed as ctboost._core
  • Pool abstraction for dense tabular data, SciPy sparse input, categorical feature indices, and optional group_id
  • Native pandas DataFrame and Series support
  • Automatic categorical detection for pandas category and object columns
  • Regression training with ctboost.train(...), including raw array/DataFrame inputs plus optional preprocessing and external-memory staging
  • scikit-learn compatible CTBoostClassifier, CTBoostRegressor, and CTBoostRanker when scikit-learn is installed
  • Binary and multiclass classification
  • Grouped ranking with PairLogit and NDCG
  • Row weights through Pool(..., weight=...) and sample_weight on sklearn estimators
  • Class imbalance controls through class_weight, class_weights, auto_class_weights="balanced", and scale_pos_weight
  • Explicit missing-value handling through nan_mode
  • Quantization controls through max_bins, max_bin_by_feature, border_selection_method, feature_borders, and nan_mode_by_feature
  • Row subsampling through subsample plus bootstrap_type="No"|"Bernoulli"|"Poisson"
  • Bayesian bagging through bootstrap_type="Bayesian" plus bagging_temperature
  • boosting_type="RandomForest" on top of the existing conditional-inference tree learner
  • boosting_type="DART" with dropout-style tree normalization on top of the existing conditional-inference tree learner
  • Monotonic constraints through monotone_constraints
  • Path-level interaction constraints through interaction_constraints
  • Additional generic regularization and tree-growth controls through feature_weights, first_feature_use_penalties, random_strength, grow_policy, min_samples_split, and max_leaf_weight
  • GPU tree growth now also supports monotonic constraints, interaction constraints, feature_weights, first_feature_use_penalties, random_strength, and grow_policy="LeafWise" without replacing the conditional-inference split gate
  • Survival objectives: Cox, SurvivalExponential
  • Survival evaluation through CIndex
  • Early stopping with eval_set and early_stopping_rounds
  • Separate eval_metric support for validation history and early stopping
  • Validation loss/metric history and evals_result_
  • Per-iteration prediction through staged prediction and num_iteration
  • Stable JSON and pickle model persistence for low-level boosters and scikit-learn style estimators
  • Cross-validation with ctboost.cv(...) when scikit-learn is installed
  • Regression objectives: RMSE, MAE, Huber, Quantile, Poisson, Tweedie
  • Generic eval metrics including RMSE, MAE, Poisson, Tweedie, Accuracy, Precision, Recall, F1, AUC, NDCG, MAP, MRR, and CIndex
  • Native ctboost.FeaturePipeline logic in _core.NativeFeaturePipeline, with low-level and sklearn integration for ordered CTRs, frequency-style CTRs, categorical crosses, low-cardinality one-hot expansion, rare-category bucketing, text hashing, and embedding-stat expansion
  • Generic categorical controls around the existing conditional tree learner: one_hot_max_size / max_cat_to_onehot, max_cat_threshold, simple_ctr, combinations_ctr, and per_feature_ctr
  • ctboost.prepare_pool(...) for low-level raw-data preparation, optional feature-pipeline fitting, and disk-backed external-memory pool staging
  • Native CPU out-of-core fit through ctboost.train(..., external_memory=True), which now spills quantized feature-bin columns to disk instead of keeping the full histogram matrix resident in RAM
  • Multi-host distributed training through distributed_world_size, distributed_rank, distributed_root, and distributed_run_id, with a native per-node histogram reduction path and a TCP collective backend available through distributed_root="tcp://host:port"
  • Distributed eval_set, early_stopping_rounds, init_model, grouped ranking shards, and sklearn-estimator wrappers on the TCP collective backend
  • Distributed GPU training when CUDA is available and distributed_root uses the TCP collective backend
  • Feature importance reporting
  • Leaf-index introspection and path-based prediction contributions
  • Continued training through init_model and estimator warm_start
  • Build metadata reporting through ctboost.build_info()
  • CPU builds on standard CI runners
  • Optional CUDA compilation when building from source with a suitable toolkit
  • GPU source builds now keep fit-scoped histogram data resident on device, support shared-memory histogram accumulation, and expose GPU raw-score prediction for regression, binary classification, and multiclass models
  • Histogram building now writes directly into final-width compact storage when the fitted schema permits <=256 bins, avoiding the old transient uint16 -> uint8 duplication spike
  • Fitted models now store quantization metadata once per booster instead of duplicating the same schema in every tree
  • Low-level boosters can export reusable fitted borders through Booster.get_borders() and expose the full shared quantization schema through Booster.get_quantization_schema()
  • GPU fit now drops the host training histogram bin matrix immediately after the device histogram workspace has been created and warm-start predictions have been seeded
  • GPU tree building now uses histogram subtraction in the device path as well, so only one child histogram is built explicitly after each split
  • GPU node search now keeps best-feature selection on device and returns a compact winner instead of copying the full per-feature search buffer back to host each node
  • Training can emit native histogram/tree timing via verbose=True or CTBOOST_PROFILE=1
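The compact-storage bullet above describes a generic trick that is easy to sketch outside the native code: choose the final bin dtype before quantizing, so no transient wide buffer is materialized. This is a plain-NumPy illustration, not the CTBoost internals:

```python
import numpy as np

def quantize_compact(values, borders):
    # Number of bins is len(borders) + 1. Choosing the final storage width up
    # front avoids the transient wide copy when the schema fits in <= 256 bins.
    n_bins = len(borders) + 1
    dtype = np.uint8 if n_bins <= 256 else np.uint16
    bins = np.searchsorted(borders, values, side="right")
    return bins.astype(dtype, copy=False)

x = np.array([0.1, 0.4, 0.9, 0.6], dtype=np.float32)
borders = np.array([0.25, 0.5, 0.75], dtype=np.float32)
binned = quantize_compact(x, borders)
print(binned, binned.dtype)  # [0 1 3 2] uint8
```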

Current Limitations

  • Ordered CTRs, frequency-style CTRs, categorical crosses, low-cardinality one-hot expansion, rare-category bucketing, text hashing, and embedding expansion now run through a native C++ pipeline, while pandas extraction, raw-data routing, and Pool orchestration remain thin Python glue
  • There is now a native sparse training path plus disk-backed quantized-bin staging through ctboost.train(..., external_memory=True) on both CPU and GPU, and distributed training can also use a standalone TCP collective coordinator through distributed_root="tcp://host:port"
  • The legacy filesystem-based distributed path still exists for basic CPU shard training, but distributed eval_set support and distributed GPU execution require the TCP backend
  • Distributed multi-host training still expects already-prepared numeric features when distributed_world_size > 1; fitting the raw feature pipeline itself is not yet coordinated across ranks
  • Distributed grouped/ranking training requires each group_id to live entirely on one worker shard; cross-rank query groups are rejected
  • Dedicated GPU wheel automation covers only Linux x86_64 CPython 3.10 through 3.14 release assets, aimed at Kaggle-style environments
  • CUDA wheel builds in CI depend on container-side toolkit provisioning
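The grouped-ranking shard rule can be validated before launching a distributed run. A minimal sketch with a hypothetical helper (not part of the CTBoost API):

```python
def check_group_shards(shards):
    # shards: one sequence of group_ids per worker rank.
    owner = {}
    for rank, group_ids in enumerate(shards):
        for gid in set(group_ids):
            if gid in owner and owner[gid] != rank:
                raise ValueError(
                    f"group_id {gid!r} appears on ranks {owner[gid]} and {rank}; "
                    "cross-rank query groups are rejected"
                )
            owner[gid] = rank
    return owner

# Every query group lives entirely on one shard: accepted.
print(check_group_shards([[0, 0, 1], [2, 2, 3]]))

# group_id 1 spans ranks 0 and 1: rejected.
try:
    check_group_shards([[0, 1], [1, 2]])
except ValueError as err:
    print(err)
```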

Resolved Fold-Memory Hotspots

The older v0.1.15 GPU fit-memory bottleneck list is now closed in the current tree:

  • Quantization metadata is stored once per fitted booster and shared by all trees instead of being duplicated per tree
  • GPU fit releases the host training histogram bin matrix immediately after device workspace creation and warm-start seeding
  • GPU tree growth uses histogram subtraction, so only one child histogram is built explicitly after a split
  • GPU split search keeps best-feature selection on device and copies back only the winning feature summary

That means the old per-node GPU bin-materialization issue is no longer the main resident-memory problem in the current codebase. The remaining generic backlog is now in broader distributed runtime ergonomics, full GPU parity for every control surface, and production tooling.
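Histogram subtraction itself is a generic trick and easy to verify outside CUDA: given the parent histogram and one child's histogram, the sibling is their difference, so only one child needs an explicit accumulation pass:

```python
import numpy as np

rng = np.random.default_rng(0)
grad = rng.normal(size=1000)            # per-row gradients
bins = rng.integers(0, 32, size=1000)   # per-row feature-bin indices
go_left = rng.random(1000) < 0.6        # split assignment

parent = np.bincount(bins, weights=grad, minlength=32)
left = np.bincount(bins[go_left], weights=grad[go_left], minlength=32)
right = parent - left  # sibling histogram by subtraction: no second pass

# Matches the explicitly accumulated right-child histogram.
explicit_right = np.bincount(bins[~go_left], weights=grad[~go_left], minlength=32)
assert np.allclose(right, explicit_right)
```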

Benchmark Snapshot

The heavy ordered-target-encoding playground-series-s6e4 replay was last measured on April 12, 2026 with the v0.1.11 source tree. The one-fold Kaggle source-build replay completed successfully with:

  • build 55.41s
  • fold preprocess 57.17s
  • fold fit 2107.10s
  • fold predict 5.89s
  • fold total 2170.17s
  • validation score 0.973213

Since that replay, the source tree has removed additional fit-memory overhead by sharing quantization schema per model, building compact train bins without a second host copy, releasing host train-bin storage after GPU upload, and adding GPU histogram subtraction plus device-side best-feature reduction.

Installation

For local development or source builds:

pip install .

Install development dependencies:

pip install -e .[dev]

Install the optional scikit-learn wrappers and ctboost.cv(...) support:

pip install -e .[sklearn]

Wheels vs Source Builds

pip install ctboost works without a compiler only when PyPI has a prebuilt wheel for your exact Python/OS tag. If no matching wheel exists, pip falls back to the source distribution and has to compile the native extension locally.

The release workflow is configured to publish CPU wheels for current CPython releases on Windows and Linux, plus macOS x86_64 CPU wheels for CPython 3.10 through 3.14, so standard pip install ctboost usage does not depend on a local compiler.
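Whether pip finds a matching wheel is decided by interpreter and platform tags. A rough stdlib-only approximation of the primary tag for the running interpreter (pip derives the full compatible-tag list itself, so treat this as an approximation):

```python
import sys
import sysconfig

# Primary interpreter tag (e.g. cp312) plus a rough platform tag.
py_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
plat_tag = sysconfig.get_platform().replace("-", "_").replace(".", "_")
print(f"a matching wheel looks like: ctboost-0.1.25-{py_tag}-{py_tag}-{plat_tag}.whl")
```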

Each tagged GitHub release also attaches the CPU wheels, the source distribution, and dedicated Linux x86_64 CUDA wheels for CPython 3.10 through 3.14. The GPU wheel filenames carry a 1gpu build tag so the release can publish CPU and GPU artifacts for the same Python and platform tags without filename collisions.

The GPU release job installs the CUDA compiler plus the CUDA runtime development package, exports the toolkit paths into the build environment, and sets CTBOOST_REQUIRE_CUDA=ON so the wheel build fails instead of silently degrading to a CPU-only artifact. The release smoke test also checks that ctboost.build_info()["cuda_enabled"] is True before the GPU wheel is uploaded.

Kaggle GPU Install

pip install ctboost still resolves to the CPU wheel on PyPI. On Kaggle, install the matching GPU release wheel from GitHub instead:

import json
import subprocess
import sys
import urllib.request

tag = "v0.1.25"
py_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
api_url = f"https://api.github.com/repos/captnmarkus/ctboost/releases/tags/{tag}"

with urllib.request.urlopen(api_url) as response:
    release = json.load(response)

asset = next(
    item
    for item in release["assets"]
    if item["name"].endswith(".whl") and f"-1gpu-{py_tag}-{py_tag}-" in item["name"]
)

subprocess.check_call(
    [sys.executable, "-m", "pip", "install", "-U", asset["browser_download_url"]]
)
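The asset-selection step above can be factored into a small helper and exercised against sample names before touching the GitHub API; pick_gpu_wheel is a hypothetical name, not part of CTBoost:

```python
def pick_gpu_wheel(asset_names, py_tag):
    # GPU wheels carry the 1gpu build tag between the version and Python tags.
    marker = f"-1gpu-{py_tag}-{py_tag}-"
    matches = [n for n in asset_names if n.endswith(".whl") and marker in n]
    if len(matches) != 1:
        raise LookupError(f"expected exactly one GPU wheel for {py_tag}, got {matches}")
    return matches[0]

names = [
    "ctboost-0.1.25-cp311-cp311-manylinux_2_28_x86_64.whl",
    "ctboost-0.1.25-1gpu-cp311-cp311-manylinux_2_28_x86_64.whl",
    "ctboost-0.1.25.tar.gz",
]
print(pick_gpu_wheel(names, "cp311"))  # the 1gpu wheel, never the CPU one
```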

After installation, confirm the wheel really contains CUDA support:

import ctboost

info = ctboost.build_info()
if not info["cuda_enabled"]:
    raise RuntimeError(f"Expected a CUDA-enabled CTBoost wheel, got: {info}")
print(info)

CPU-Only Source Build

To force a CPU-only native build:

CMAKE_ARGS="-DCTBOOST_ENABLE_CUDA=OFF" pip install .

On PowerShell:

$env:CMAKE_ARGS="-DCTBOOST_ENABLE_CUDA=OFF"
pip install .

Windows source builds require a working C++ toolchain. In practice that means Visual Studio Build Tools 2022 or a compatible MSVC environment, plus CMake. ninja is recommended, but it does not replace the compiler itself.

CUDA Source Build

CTBoost can compile a CUDA backend when the CUDA toolkit and compiler are available. CUDA is enabled by default in CMake, but the build automatically falls back to CPU-only when no toolkit is detected.

pip install .

You can inspect the compiled package after installation:

import ctboost
print(ctboost.build_info())

Quick Start

scikit-learn Style Classification

import pandas as pd
from sklearn.datasets import make_classification

from ctboost import CTBoostClassifier

X, y = make_classification(
    n_samples=256,
    n_features=8,
    n_informative=5,
    n_redundant=0,
    random_state=13,
)
X = pd.DataFrame(X.astype("float32"), columns=[f"f{i}" for i in range(X.shape[1])])
X["segment"] = pd.Categorical(["a" if i % 2 == 0 else "b" for i in range(len(X))])
y = y.astype("float32")

model = CTBoostClassifier(
    iterations=256,
    learning_rate=0.1,
    max_depth=3,
    alpha=1.0,
    lambda_l2=1.0,
    task_type="CPU",
)

model.fit(
    X.iloc[:200],
    y[:200],
    eval_set=[(X.iloc[200:], y[200:])],
    early_stopping_rounds=20,
)
proba = model.predict_proba(X)
pred = model.predict(X)
importance = model.feature_importances_
best_iteration = model.best_iteration_

Low-Level Training API

import numpy as np

import ctboost

X = np.array([[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]], dtype=np.float32)
y = np.array([0.0, 1.0, 0.5], dtype=np.float32)

pool = ctboost.Pool(X, y)
booster = ctboost.train(
    pool,
    {
        "objective": "Huber",
        "learning_rate": 0.2,
        "max_depth": 2,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "max_bins": 64,
        "huber_delta": 1.5,
        "eval_metric": "MAE",
        "nan_mode": "Min",
        "task_type": "CPU",
    },
    num_boost_round=10,
)

predictions = booster.predict(pool)
loss_history = booster.loss_history
eval_loss_history = booster.eval_loss_history
exported_borders = booster.get_borders()

Per-feature quantization controls are available on the same low-level API:

booster = ctboost.train(
    pool,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "max_bins": 128,
        "max_bin_by_feature": {0: 16, 1: 8},
        "border_selection_method": "Uniform",
        "feature_borders": {1: [-0.5, 0.0, 0.5]},
        "nan_mode_by_feature": {0: "Max"},
    },
    num_boost_round=32,
)

  • feature_borders lets selected numeric features reuse explicit cut values
  • max_bin_by_feature overrides the global max_bins budget per column
  • border_selection_method currently supports Quantile and Uniform
  • Booster.get_borders() returns an importable border bundle keyed by fitted feature index
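The two border_selection_method modes differ only in how the cut values are chosen. A plain-NumPy sketch of the idea (illustrative, not the native implementation):

```python
import numpy as np

def uniform_borders(x, max_bins):
    # Evenly spaced cuts across the observed value range.
    return np.linspace(x.min(), x.max(), max_bins + 1)[1:-1]

def quantile_borders(x, max_bins):
    # Cuts at evenly spaced quantiles, so bins hold roughly equal row counts.
    qs = np.linspace(0, 1, max_bins + 1)[1:-1]
    return np.unique(np.quantile(x, qs))

# Highly skewed column: 90 zeros plus a thin tail up to 100.
x = np.concatenate([np.zeros(90), np.linspace(0, 100, 10)])
print(uniform_borders(x, 4))   # spread over the range regardless of density
print(quantile_borders(x, 4))  # collapse toward 0, where the mass is
```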

The same low-level API also exposes generic regularization and growth controls around the existing conditional tree learner:

booster = ctboost.train(
    pool,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 4,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "bootstrap_type": "Bayesian",
        "bagging_temperature": 1.0,
        "feature_weights": {0: 2.0, 3: 0.5},
        "first_feature_use_penalties": {2: 1.5},
        "random_strength": 0.2,
        "grow_policy": "LeafWise",
        "max_leaves": 16,
        "min_samples_split": 8,
        "max_leaf_weight": 2.0,
    },
    num_boost_round=64,
)

  • feature_weights rescales feature preference without replacing the conditional test
  • first_feature_use_penalties discourages the first use of selected features at the model level
  • random_strength adds seeded noise to break near-ties in split gain after the conditional gate has already accepted a candidate
  • grow_policy="LeafWise" currently means a best-child-first heuristic under the existing max_leaves budget, not a separate split criterion
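The random_strength mechanism can be sketched in isolation: seeded noise perturbs candidate split gains so near-ties do not always resolve to the same feature. The noise scale and distribution below are assumptions for illustration, not the native formula:

```python
import numpy as np

def pick_split(gains, random_strength, seed):
    # Perturb candidate gains with seeded noise; strength 0.0 stays deterministic.
    rng = np.random.default_rng(seed)
    noise = rng.normal(scale=random_strength, size=len(gains)) if random_strength else 0.0
    return int(np.argmax(gains + noise))

gains = np.array([0.500, 0.499, 0.100])  # features 0 and 1 are near-tied
print(pick_split(gains, 0.0, seed=7))  # 0: without noise the first max always wins
picks = {pick_split(gains, 0.05, seed=s) for s in range(20)}
print(picks)  # the near-tied pair trades wins; the clearly worse feature does not
```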

The same low-level API can now prepare raw categorical/text/embedding inputs directly:

import numpy as np
import ctboost

X = np.empty((4, 4), dtype=object)
X[:, 0] = ["berlin", "paris", "berlin", "rome"]
X[:, 1] = [1.0, 2.0, 1.5, 3.0]
X[:, 2] = ["red fox", "blue fox", "red hare", "green fox"]
X[:, 3] = [
    np.array([0.1, 0.4, 0.2], dtype=np.float32),
    np.array([0.7, 0.1, 0.3], dtype=np.float32),
    np.array([0.2, 0.5, 0.6], dtype=np.float32),
    np.array([0.9, 0.2, 0.4], dtype=np.float32),
]
y = np.array([0.5, 1.2, 0.7, 1.6], dtype=np.float32)

booster = ctboost.train(
    X,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "ordered_ctr": True,
        "one_hot_max_size": 4,
        "max_cat_threshold": 16,
        "cat_features": [0],
        "simple_ctr": ["Mean", "Frequency"],
        "per_feature_ctr": {0: ["Mean"]},
        "text_features": [2],
        "embedding_features": [3],
    },
    label=y,
    num_boost_round=32,
)

raw_predictions = booster.predict(X)

For disk-backed pool staging on large folds:

pool = ctboost.prepare_pool(
    X_numeric,
    y,
    external_memory=True,
    external_memory_dir="ctboost-cache",
)

booster = ctboost.train(
    X_numeric,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "external_memory": True,
        "external_memory_dir": "ctboost-cache",
    },
    label=y,
    num_boost_round=64,
)

Working With Categorical Features

Categorical columns can still be marked manually through the Pool API:

import numpy as np
import ctboost

X = np.array([[0.0], [1.0], [2.0], [3.0]], dtype=np.float32)
y = np.array([1.0, 0.0, 1.0, 0.0], dtype=np.float32)

pool = ctboost.Pool(X, y, cat_features=[0])

For pandas inputs, categorical/object columns are detected automatically:

import pandas as pd
import ctboost

frame = pd.DataFrame(
    {
        "value": [1.0, 2.0, 3.0, 4.0],
        "city": pd.Categorical(["berlin", "paris", "berlin", "rome"]),
        "segment": ["retail", "enterprise", "retail", "enterprise"],
    }
)
label = pd.Series([0.0, 1.0, 0.0, 1.0], dtype="float32")

pool = ctboost.Pool(frame, label)
assert pool.cat_features == [1, 2]

For estimator-side ordered CTRs, categorical crosses, one-hot expansion, rare-category bucketing, text hashing, and embedding expansion, use the Python feature pipeline parameters:

import numpy as np
import pandas as pd

from ctboost import CTBoostRegressor

frame = pd.DataFrame(
    {
        "city": ["berlin", "paris", "berlin", "rome"],
        "headline": ["red fox", "blue fox", "red hare", "green fox"],
        "embedding": [
            np.array([0.1, 0.4, 0.2], dtype=np.float32),
            np.array([0.7, 0.1, 0.3], dtype=np.float32),
            np.array([0.2, 0.5, 0.6], dtype=np.float32),
            np.array([0.9, 0.2, 0.4], dtype=np.float32),
        ],
        "value": [1.0, 2.0, 1.5, 3.0],
    }
)
label = np.array([0.5, 1.2, 0.7, 1.6], dtype=np.float32)

model = CTBoostRegressor(
    iterations=32,
    learning_rate=0.1,
    max_depth=3,
    ordered_ctr=True,
    one_hot_max_size=8,
    max_cat_threshold=32,
    cat_features=["city"],
    categorical_combinations=[["city", "headline"]],
    simple_ctr=["Mean", "Frequency"],
    per_feature_ctr={"city": ["Mean"]},
    text_features=["headline"],
    embedding_features=["embedding"],
)
model.fit(frame, label)

  • one_hot_max_size keeps low-cardinality categoricals as explicit indicator columns
  • max_cat_threshold buckets higher-cardinality levels down to a capped native categorical domain before the conditional tree learner sees them
  • per_feature_ctr lets specific base features or categorical combinations opt into CTR generation without changing the underlying conditional split logic
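The hand-off between one-hot expansion and capped bucketing can be sketched with plain pandas; encode_categorical below is a hypothetical helper, and the native pipeline's exact bucketing rule may differ:

```python
import pandas as pd

def encode_categorical(col, one_hot_max_size, max_cat_threshold):
    if col.nunique() <= one_hot_max_size:
        # Low cardinality: explicit indicator columns.
        return pd.get_dummies(col, prefix=col.name)
    # High cardinality: keep the most frequent levels, bucket the rest.
    keep = col.value_counts().index[: max_cat_threshold - 1]
    return col.where(col.isin(keep), other="__rare__").to_frame()

city = pd.Series(["berlin", "paris", "berlin", "rome"], name="city")
print(encode_categorical(city, one_hot_max_size=8, max_cat_threshold=32).columns.tolist())

many = pd.Series([f"u{i}" for i in range(100)] + ["u0"] * 50, name="user")
bucketed = encode_categorical(many, one_hot_max_size=8, max_cat_threshold=4)
print(bucketed["user"].nunique())  # capped at 4 distinct levels
```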

Model Persistence, Warm Start, And Cross-Validation

import ctboost

booster.save_model("regression-model.json")
restored = ctboost.load_model("regression-model.json")
restored_predictions = restored.predict(pool)

continued = ctboost.train(
    pool,
    {"objective": "RMSE", "learning_rate": 0.2, "max_depth": 2, "alpha": 1.0, "lambda_l2": 1.0},
    num_boost_round=10,
    init_model=restored,
)

cv_result = ctboost.cv(
    pool,
    {
        "objective": "RMSE",
        "learning_rate": 0.2,
        "max_depth": 2,
        "alpha": 1.0,
        "lambda_l2": 1.0,
    },
    num_boost_round=25,
    nfold=3,
)

The scikit-learn compatible estimators also expose:

  • save_model(...)
  • load_model(...)
  • staged_predict(...)
  • staged_predict_proba(...) for classifiers
  • predict_leaf_index(...)
  • predict_contrib(...)
  • evals_result_
  • best_score_
  • sample_weight on fit(...)
  • class_weight, scale_pos_weight, eval_metric, nan_mode, nan_mode_by_feature, and warm_start
  • max_bins, max_bin_by_feature, border_selection_method, and feature_borders
  • bagging_temperature, feature_weights, first_feature_use_penalties, random_strength, grow_policy, min_samples_split, and max_leaf_weight

Public Python API

The main entry points are:

  • ctboost.Pool
  • ctboost.FeaturePipeline
  • ctboost.prepare_pool
  • ctboost.train
  • ctboost.cv
  • ctboost.Booster
  • ctboost.CTBoostClassifier
  • ctboost.CTBoostRanker
  • ctboost.CTBoostRegressor
  • ctboost.CBoostClassifier
  • ctboost.CBoostRanker
  • ctboost.CBoostRegressor
  • ctboost.build_info
  • ctboost.load_model

Build and Test

Run the test suite:

pytest tests

The latest local release-candidate validation, on April 13, 2026, ran:

python -m pytest -q

Build an sdist:

python -m build --sdist

Configure and build the native extension directly with CMake:

python -m pip install pybind11 numpy pandas scikit-learn pytest
cmake -S . -B build -DCTBOOST_ENABLE_CUDA=OFF -Dpybind11_DIR="$(python -m pybind11 --cmakedir)"
cmake --build build --config Release --parallel

Wheel builds are configured through cibuildwheel for:

  • Windows amd64
  • Linux x86_64 and aarch64 using the current manylinux baseline
  • macOS x86_64
  • CPython 3.8, 3.9, 3.10, 3.11, 3.12, 3.13, and 3.14

GitHub Actions workflows:

  • .github/workflows/cmake.yml: configures, builds, installs, and tests CPU builds on Ubuntu, Windows, and macOS for pushes and pull requests
  • .github/workflows/publish.yml: builds release wheels and the sdist, runs wheel smoke tests on built artifacts, publishes CPU wheels to PyPI, and attaches both CPU and Linux GPU wheels to tagged GitHub releases

The standard PyPI release wheel workflow builds CPU-only wheels by setting:

cmake.define.CTBOOST_ENABLE_CUDA=OFF

The Linux GPU release-wheel matrix enables CUDA separately with:

cmake.define.CTBOOST_ENABLE_CUDA=ON
cmake.define.CTBOOST_REQUIRE_CUDA=ON
cmake.define.CMAKE_CUDA_COMPILER=/usr/local/cuda-12.0/bin/nvcc
cmake.define.CUDAToolkit_ROOT=/usr/local/cuda-12.0
cmake.define.CMAKE_CUDA_ARCHITECTURES=60;70;75;80;86;89
wheel.build-tag=1gpu

Project Layout

ctboost/      Python API layer
include/      public C++ headers
src/core/     core boosting, objectives, trees, statistics
src/bindings/ pybind11 extension bindings
cuda/         optional CUDA backend
tests/        Python test suite

License

Apache 2.0. See LICENSE.



Download files

Download the file for your platform.

Source Distribution

  • ctboost-0.1.25.tar.gz (294.8 kB, Source)

Built Distributions

  • ctboost-0.1.25-cp314-cp314-win_amd64.whl (461.6 kB, CPython 3.14, Windows x86-64)
  • ctboost-0.1.25-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (645.1 kB, CPython 3.14, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.25-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (580.7 kB, CPython 3.14, manylinux glibc 2.27+/2.28+ ARM64)
  • ctboost-0.1.25-cp314-cp314-macosx_10_15_x86_64.whl (492.7 kB, CPython 3.14, macOS 10.15+ x86-64)
  • ctboost-0.1.25-cp313-cp313-win_amd64.whl (449.1 kB, CPython 3.13, Windows x86-64)
  • ctboost-0.1.25-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (644.7 kB, CPython 3.13, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.25-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (579.2 kB, CPython 3.13, manylinux glibc 2.27+/2.28+ ARM64)
  • ctboost-0.1.25-cp313-cp313-macosx_10_15_x86_64.whl (492.4 kB, CPython 3.13, macOS 10.15+ x86-64)
  • ctboost-0.1.25-cp312-cp312-win_amd64.whl (449.2 kB, CPython 3.12, Windows x86-64)
  • ctboost-0.1.25-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (644.6 kB, CPython 3.12, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.25-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (579.4 kB, CPython 3.12, manylinux glibc 2.27+/2.28+ ARM64)
  • ctboost-0.1.25-cp312-cp312-macosx_10_15_x86_64.whl (492.3 kB, CPython 3.12, macOS 10.15+ x86-64)
  • ctboost-0.1.25-cp311-cp311-win_amd64.whl (448.8 kB, CPython 3.11, Windows x86-64)
  • ctboost-0.1.25-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (641.2 kB, CPython 3.11, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.25-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (576.7 kB, CPython 3.11, manylinux glibc 2.27+/2.28+ ARM64)
  • ctboost-0.1.25-cp311-cp311-macosx_10_15_x86_64.whl (490.5 kB, CPython 3.11, macOS 10.15+ x86-64)
  • ctboost-0.1.25-cp310-cp310-win_amd64.whl (447.7 kB, CPython 3.10, Windows x86-64)
  • ctboost-0.1.25-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (638.8 kB, CPython 3.10, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.25-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (574.3 kB, CPython 3.10, manylinux glibc 2.27+/2.28+ ARM64)
  • ctboost-0.1.25-cp310-cp310-macosx_10_15_x86_64.whl (489.1 kB, CPython 3.10, macOS 10.15+ x86-64)
  • ctboost-0.1.25-cp39-cp39-win_amd64.whl (453.3 kB, CPython 3.9, Windows x86-64)
  • ctboost-0.1.25-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (637.8 kB, CPython 3.9, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.25-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (572.6 kB, CPython 3.9, manylinux glibc 2.27+/2.28+ ARM64)
  • ctboost-0.1.25-cp38-cp38-win_amd64.whl (447.5 kB, CPython 3.8, Windows x86-64)
  • ctboost-0.1.25-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (636.9 kB, CPython 3.8, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.25-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (572.4 kB, CPython 3.8, manylinux glibc 2.27+/2.28+ ARM64)

File details

Details for the file ctboost-0.1.25.tar.gz.

File metadata

  • Download URL: ctboost-0.1.25.tar.gz
  • Size: 294.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

  • SHA256: 26331bc6f20e2a2b0d63ea3a2d68036766947211609aa59a061cd80f5f698c63
  • MD5: d521a55e07be395c0634d0103740177b
  • BLAKE2b-256: 9cea4db1cab91c13bca1f89bf18163db56e8abb3dd4201d3b3d5d1c6b43ab205

Provenance

Attestation bundles for ctboost-0.1.25.tar.gz were published by publish.yml on captnmarkus/ctboost.

File details

Details for the file ctboost-0.1.25-cp314-cp314-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.25-cp314-cp314-win_amd64.whl
  • Size: 461.6 kB
  • Tags: CPython 3.14, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

  • SHA256: fc62b86257b27f8bbe0769990c81c4b824ba77ebd774fa1b760c125b8600c89e
  • MD5: 61c0789809e207308b273151cc6e875d
  • BLAKE2b-256: 50face56ce5ad083da13c194cb9f6a84d5389e0727b3e17845b18894458151d6

Provenance: attestation bundles published by publish.yml on captnmarkus/ctboost.

File details

Details for the file ctboost-0.1.25-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File hashes

  • SHA256: 5e76f0f89faf872f351bfea37b20a5c91d92fb109d20d6e58b6c69f7262df831
  • MD5: d681c7f05f5ee8a4c3cb9abd5ddb8714
  • BLAKE2b-256: c73deee3a94a6c7fbbdf37a426c8f37d94a16391c922fd3cade537c6f6bb71f7

Provenance: attestation bundles published by publish.yml on captnmarkus/ctboost.

File details

Details for the file ctboost-0.1.25-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File hashes

  • SHA256: e499b2c1b8500e9c8aabece44cc9f232d1d8f6a8011cfe0c7eb8768da911b8b0
  • MD5: 7a644c6249fa58ce056387845cc6d50a
  • BLAKE2b-256: 5d1e3ebf11fa4a12f934aa2c99ebab4a9c669be57b3a4608b2a5e763da083ba9

Provenance: attestation bundles published by publish.yml on captnmarkus/ctboost.

File details

Details for the file ctboost-0.1.25-cp314-cp314-macosx_10_15_x86_64.whl.

File hashes

  • SHA256: fecabb3a4039cf26eab7caaf6bbd3af00bc8f5557fa49f73c8cb54a247944f57
  • MD5: 02303b531a9fae2ea02444a98f0984c9
  • BLAKE2b-256: 14e3cc11239306facce4fb6828b0d01e01a97dff22793c30fbf99336f5f90e4a

Provenance: attestation bundles published by publish.yml on captnmarkus/ctboost.

File details

Details for the file ctboost-0.1.25-cp313-cp313-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.25-cp313-cp313-win_amd64.whl
  • Upload date:
  • Size: 449.1 kB
  • Tags: CPython 3.13, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.25-cp313-cp313-win_amd64.whl
Algorithm Hash digest
SHA256 803b615032b6cd4d30f7c19a070b9f04f3e17d8aa10a1fdcf1b2345c0e9c6d67
MD5 c622e8dee7bdcf36120a01fdece27266
BLAKE2b-256 52859df579df170f6ac7030264abedbc0b384580260939f6db8069d6fc164cad

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp313-cp313-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 ec02f2f658ba67e55ab17700e179a0b9b9109aa1f85b79fa64630486cee0299a
MD5 b86768bcb3ea82ca185863e06d76ed55
BLAKE2b-256 06528ea0d84b93040546f8d5a2f118b1414a9659f2278f250b444aa9fff46239

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 deee2c6ad7191be6abf91de262c1ae87016b3fe918db3887f536534deea76cd5
MD5 21e003435ca8b2adfaf705353ee3ce6b
BLAKE2b-256 7bbb6d726abeb7ddaf61af78656ff0733b208776ab389f38afa5d1e6ae8364a3

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp313-cp313-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp313-cp313-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 6c57a432910b9dce0360d02b6d22b5996253f5efc138285892896e95dff04088
MD5 e718baf9124552109ee598e122b8e90c
BLAKE2b-256 ee04e00c1b9a7586ec032b10ebaa62ce67247798cba7b4dd0555adf12e66d1e0

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp313-cp313-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp312-cp312-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.25-cp312-cp312-win_amd64.whl
  • Upload date:
  • Size: 449.2 kB
  • Tags: CPython 3.12, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.25-cp312-cp312-win_amd64.whl
Algorithm Hash digest
SHA256 101dd92b872281ebde5e473f38af10477687dc94053ed01111e31046422d05cc
MD5 8468a3ef98d43b61a811312408fb197a
BLAKE2b-256 f5a10cf5b8e8d20024091929fd7298fe3d37765e34296bab52de6bfc7d7031d0

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp312-cp312-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 7b0c1aad68c07fb12ec613b3978828ff9d71cb64e2749de4cbd88181fac778e5
MD5 c9b1165b2353f3f974197fb92c7ab8b4
BLAKE2b-256 e7a548efa35ddf5cc525b0d1cfe875932f099b6b6852a348185a8b014853c3fa

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 4f8de1881b50891ef53f9a0f0e254cdf74adc823c2c1be4c942c6f211d93c412
MD5 8f58ac57cc418385fef42705fdc0edc6
BLAKE2b-256 b2b11384547c3cc624f05caa9388505a60352578860f8ce709c5995fff430c02

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp312-cp312-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp312-cp312-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 870ff8edf7fd802f850e69bab343c3256d39da4b128ba73659406f6e11294e74
MD5 870851143bea21abce3d7a8d623bd6fe
BLAKE2b-256 d5945d57318b0a54e6c0e343ae536e1bc7d874541aea296936659705b652aa0b

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp312-cp312-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp311-cp311-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.25-cp311-cp311-win_amd64.whl
  • Upload date:
  • Size: 448.8 kB
  • Tags: CPython 3.11, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.25-cp311-cp311-win_amd64.whl
Algorithm Hash digest
SHA256 8248779f7169a9feade1f51529ba1e39e8589138028cf260f7e2f84483d5c30c
MD5 e520b886d63de4fa30feded653813938
BLAKE2b-256 670edd3e52c1842163c43e6988bd6db066ef538b0b3721ebf804eaf915568845

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp311-cp311-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 4b320389efe7e89eb49cbacb39b43ce98e77e0c491e2ea848f31f2e672744b66
MD5 e6b89c60ef699653b270499714c1d5c5
BLAKE2b-256 3ba891db9fa7b6449ba5d86003f96fb4cf39a008dc3ed6081db0f1ac367fa06d

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 4dd861a5d376455213f3fbcfd39e02c48a1d6faa1aa0ddb9ad0b5fcded9364b2
MD5 d5c3b637f882310393219757192e09f9
BLAKE2b-256 d80362f12480741352dd535f3e90e3aa76844df18505b6e615608ade3fb08b79

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp311-cp311-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp311-cp311-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 183089fccb0bb50b07a3df1dfca733ceb41d143486ca9cd21481967c0ec2e6a9
MD5 e1dd8221a318b2e597032031521ff7be
BLAKE2b-256 6739e976ac654f342ff7941d79dd804eaaccbf248e0589666a88dab759fe894c

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp311-cp311-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp310-cp310-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.25-cp310-cp310-win_amd64.whl
  • Upload date:
  • Size: 447.7 kB
  • Tags: CPython 3.10, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.25-cp310-cp310-win_amd64.whl
Algorithm Hash digest
SHA256 5f4a358a0ed495e6547dc8fcf2b9263d26a7cb080e16666b11e1868a422430c4
MD5 cf6fde9ebea8f17ef87ac83c9ccaea37
BLAKE2b-256 78a034fc1200bd9ab7c22fc04c16a2f1aea7142f8dc089d901afc21a93c8f6ab

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp310-cp310-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 d8a613c270cf3f23245f5871754131eccb11aeede81ce4732ea02fd6bed930a6
MD5 d7f4dcae36260fbf9b9d5c6131416659
BLAKE2b-256 d212743a35a819b7965122bc921cfe0b9426e2594d1abaadc4bfd1d105a34d34

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 4c37f5485014d188281d61b777b114287702cbc793819a0b206333b54c78e690
MD5 b57203562f4cd4281033738d69e653a5
BLAKE2b-256 a9f47c4c5a635b67524d559ecb678806d7e9cd7c51000d01463c683d5f922d03

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp310-cp310-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp310-cp310-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 19f96d7f56e211bfbe55158a1190a1b32cf4f29ec3bbb634aeca3cf6cd3fe25e
MD5 afaa38023cca7e68a5522dc6e093dac8
BLAKE2b-256 8e0af2e62e241ddba572500f7dd3906aadbb317c3ebd02a14e7247fa82f84c9d

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp310-cp310-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp39-cp39-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.25-cp39-cp39-win_amd64.whl
  • Upload date:
  • Size: 453.3 kB
  • Tags: CPython 3.9, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.25-cp39-cp39-win_amd64.whl
Algorithm Hash digest
SHA256 47e5621e8298dfea4095dca05d363bdaea9fb6f42e190b7efd47d1dfcd33b37f
MD5 475519939add420697363085daf98d57
BLAKE2b-256 becb0f52f3c3cfa6007c106bbd69646072e1de57af4639353a4317ed48f85ac7

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp39-cp39-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 a21b8bafa55b1169b7330d1358a37796818960dd7149740a17c4cedb990a5a81
MD5 22e679622e65d943d9b070f4d416627f
BLAKE2b-256 aee8f802f5cabddac455f9b4792f02b3ff24a46002729ce8a6b14c97f211df41

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 1d9825289a9b111c38e241d1d86c5e72ba69b3e27b4236af9ccd57d4974de551
MD5 7c60b40436d556617a2e605fb4f32475
BLAKE2b-256 aedf05ab6d6216c41c4274691aaa8574d3a1319e9cd0c6a49abf4eaf65c9cbf9

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp38-cp38-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.25-cp38-cp38-win_amd64.whl
  • Upload date:
  • Size: 447.5 kB
  • Tags: CPython 3.8, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.25-cp38-cp38-win_amd64.whl
Algorithm Hash digest
SHA256 83f01940ad84c8e3a927018380808594106c3484e67dae4f74ddcdfa48ca192d
MD5 77844c79c54118ef79c7dce2fc8b04c4
BLAKE2b-256 444b8e1ee7ee940eb50c6b4dd1268bb0ea9cebacf51776437a9d9bfacea2d7a2

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp38-cp38-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 944fd3ae6a9c1ca3ae7a6239a0b2a0ba337bb91ef7b91d36cca8955963801e34
MD5 847c1cf51562c07ab47e02a393265ef4
BLAKE2b-256 87613894355cd73380538e83e462985a3e2a558d0b3f992368d15aeec9b35e0f

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.25-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.25-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 c30f83cf76d1b8165c33e0d747fc63667e5a75dac9ec770756f583364b1f2b55
MD5 efcf457ba91c02249c78b0753be707ee
BLAKE2b-256 2101f52a4861c8316d5267c5588eeb95ada2ed79bc22d6783138ceb30ea72ac2

Provenance

The following attestation bundles were made for ctboost-0.1.25-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost
