A GPU-accelerated gradient boosting library using Conditional Inference Trees.

CTBoost

CTBoost is a gradient boosting library built around Conditional Inference Trees, with a native C++17 core, Python bindings via pybind11, optional CUDA support for source builds, and an optional scikit-learn style API.

The current codebase supports end-to-end training and prediction for regression, classification, grouped ranking, and survival objectives. It also covers pandas and SciPy sparse ingestion without dense expansion, row weights and class-imbalance controls, explicit missing-value handling, configurable validation metrics, stable JSON model persistence, staged prediction, warm-start continuation, a native C++ feature pipeline for categorical/text/embedding transforms with thin Python wrappers, and a built-in cross-validation helper.

Current Status

  • Language mix: Python + C++17, with optional CUDA
  • Python support: 3.8 through 3.14
  • Packaging: scikit-build-core
  • CI/CD: GitHub Actions for CMake validation and cibuildwheel release builds
  • Repository version: 0.1.26
  • Status: actively evolving native + Python package

What Works Today

  • Native gradient boosting backend exposed as ctboost._core
  • Pool abstraction for dense tabular data, SciPy sparse input, categorical feature indices, and optional group_id
  • Native pandas DataFrame and Series support
  • Automatic categorical detection for pandas category and object columns
  • Regression training with ctboost.train(...), including raw array/DataFrame inputs plus optional preprocessing and external-memory staging
  • scikit-learn compatible CTBoostClassifier, CTBoostRegressor, and CTBoostRanker when scikit-learn is installed
  • Binary and multiclass classification
  • Grouped ranking with PairLogit and NDCG
  • Row weights through Pool(..., weight=...) and sample_weight on sklearn estimators
  • Class imbalance controls through class_weight, class_weights, auto_class_weights="balanced", and scale_pos_weight
  • Explicit missing-value handling through nan_mode
  • Quantization controls through max_bins, max_bin_by_feature, border_selection_method, feature_borders, and nan_mode_by_feature
  • Row subsampling through subsample plus bootstrap_type="No"|"Bernoulli"|"Poisson"
  • Bayesian bagging through bootstrap_type="Bayesian" plus bagging_temperature
  • boosting_type="RandomForest" on top of the existing conditional-inference tree learner
  • boosting_type="DART" with dropout-style tree normalization on top of the existing conditional-inference tree learner
  • Monotonic constraints through monotone_constraints
  • Path-level interaction constraints through interaction_constraints
  • Additional generic regularization and tree-growth controls through feature_weights, first_feature_use_penalties, random_strength, grow_policy, min_samples_split, and max_leaf_weight
  • GPU tree growth now also supports monotonic constraints, interaction constraints, feature_weights, first_feature_use_penalties, random_strength, and grow_policy="LeafWise" without replacing the conditional-inference split gate
  • Survival objectives: Cox, SurvivalExponential
  • Survival evaluation through CIndex
  • Early stopping with eval_set, eval_names, early_stopping_rounds, early_stopping_metric, and early_stopping_name
  • Single- and multi-watchlist evaluation through one or many eval_set entries
  • Single- and multi-metric evaluation through string or sequence eval_metric values
  • Per-iteration callback hooks through callbacks, plus built-in ctboost.log_evaluation(...) and ctboost.checkpoint_callback(...)
  • Validation loss/metric history and evals_result_
  • Per-iteration prediction through staged prediction and num_iteration
  • Stable JSON and pickle model persistence for low-level boosters and scikit-learn style estimators
  • Cross-validation with ctboost.cv(...) when scikit-learn is installed
  • Regression objectives: RMSE, MAE, Huber, Quantile, Poisson, Tweedie
  • Generic eval metrics including RMSE, MAE, Poisson, Tweedie, Accuracy, BalancedAccuracy, Precision, Recall, F1, AUC, NDCG, MAP, MRR, and CIndex
  • Native ctboost.FeaturePipeline logic in _core.NativeFeaturePipeline, with low-level and sklearn integration for ordered CTRs, frequency-style CTRs, categorical crosses, low-cardinality one-hot expansion, rare-category bucketing, text hashing, and embedding-stat expansion
  • Generic categorical controls around the existing conditional tree learner: one_hot_max_size / max_cat_to_onehot, max_cat_threshold, simple_ctr, combinations_ctr, and per_feature_ctr
  • ctboost.prepare_pool(...) for low-level raw-data preparation, optional feature-pipeline fitting, and disk-backed external-memory pool staging
  • Native CPU out-of-core fit through ctboost.train(..., external_memory=True), which now spills quantized feature-bin columns to disk instead of keeping the full histogram matrix resident in RAM
  • Multi-host distributed training through distributed_world_size, distributed_rank, distributed_root, and distributed_run_id, with a native per-node histogram reduction path and a TCP collective backend available through distributed_root="tcp://host:port"
  • Distributed eval_set, multi-watchlist or multi-metric evaluation, callbacks, early_stopping_rounds, init_model, grouped ranking shards, and sklearn-estimator wrappers on the TCP collective backend
  • Distributed GPU training when CUDA is available and distributed_root uses the TCP collective backend
  • Distributed raw-data feature-pipeline fitting across ranks for native categorical, text, and embedding preprocessing
  • Feature importance reporting
  • Leaf-index introspection and path-based prediction contributions
  • Continued training through init_model and estimator warm_start
  • Build metadata reporting through ctboost.build_info()
  • CPU builds on standard CI runners
  • Optional CUDA compilation when building from source with a suitable toolkit
  • GPU source builds now keep fit-scoped histogram data resident on device, support shared-memory histogram accumulation, and expose GPU raw-score prediction for regression, binary classification, and multiclass models
  • Histogram building now writes directly into final-width compact storage when the fitted schema permits <=256 bins, avoiding the old transient uint16 -> uint8 duplication spike
  • Fitted models now store quantization metadata once per booster instead of duplicating the same schema in every tree
  • Low-level boosters can export reusable fitted borders through Booster.get_borders() and expose the full shared quantization schema through Booster.get_quantization_schema()
  • GPU fit now drops the host training histogram bin matrix immediately after the device histogram workspace has been created and warm-start predictions have been seeded
  • GPU tree building now uses histogram subtraction in the device path as well, so only one child histogram is built explicitly after each split
  • GPU node search now keeps best-feature selection on device and returns a compact winner instead of copying the full per-feature search buffer back to host each node
  • Training can emit native histogram/tree timing via verbose=True or CTBOOST_PROFILE=1
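Most of the evaluation metrics listed above are standard. As a reference point for the ranking side, NDCG for a single query group can be computed like this (an illustrative pure-Python sketch of the standard formula, not ctboost's internal implementation):

```python
import math

def ndcg(relevances, k=None):
    """NDCG for one query group, given relevance labels in ranked order."""
    k = len(relevances) if k is None else k

    def dcg(rels):
        # Gain (2^rel - 1) discounted by log2 of the 1-based position + 1.
        return sum((2 ** r - 1) / math.log2(i + 2) for i, r in enumerate(rels[:k]))

    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

print(ndcg([3, 2, 0, 1]))  # ranked list scored against its ideal ordering
print(ndcg([3, 2, 1, 0]))  # already ideally ordered -> 1.0
```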

Current Limitations

  • Ordered CTRs, frequency-style CTRs, categorical crosses, low-cardinality one-hot expansion, rare-category bucketing, text hashing, and embedding expansion run through a native C++ pipeline, but pandas extraction, raw-data routing, and Pool orchestration remain thin Python glue
  • There is now a native sparse training path plus disk-backed quantized-bin staging through ctboost.train(..., external_memory=True) on both CPU and GPU, and distributed training can also use a standalone TCP collective coordinator through distributed_root="tcp://host:port"
  • The legacy filesystem-based distributed path still exists for basic CPU shard training, but distributed eval_set support and distributed GPU execution require the TCP backend
  • Distributed grouped/ranking training requires each group_id to live entirely on one worker shard; cross-rank query groups are rejected
  • Dedicated GPU wheel automation targets Linux x86_64 CPython 3.10 through 3.14 release assets for Kaggle-style environments
  • CUDA wheel builds in CI depend on container-side toolkit provisioning
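The cross-rank query-group restriction can be validated before launching a distributed run. A hypothetical pre-flight helper (not part of the ctboost API) that checks each group_id lives on exactly one worker shard:

```python
def check_group_sharding(shards):
    """shards: one sequence of group_ids per rank. Raises if any group spans ranks."""
    seen = {}  # group_id -> rank that owns it
    for rank, group_ids in enumerate(shards):
        for gid in set(group_ids):
            if gid in seen and seen[gid] != rank:
                raise ValueError(
                    f"group_id {gid!r} appears on ranks {seen[gid]} and {rank}; "
                    "each query group must live entirely on one worker shard"
                )
            seen[gid] = rank

check_group_sharding([[1, 1, 2], [3, 3, 4]])  # fine: groups are rank-disjoint
```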

Resolved Fold-Memory Hotspots

The older v0.1.15 GPU fit-memory bottleneck list is now closed in the current tree:

  • Quantization metadata is stored once per fitted booster and shared by all trees instead of being duplicated per tree
  • GPU fit releases the host training histogram bin matrix immediately after device workspace creation and warm-start seeding
  • GPU tree growth uses histogram subtraction, so only one child histogram is built explicitly after a split
  • GPU split search keeps best-feature selection on device and copies back only the winning feature summary

That means the old per-node GPU bin-materialization issue is no longer the main resident-memory problem in the current codebase. The remaining generic backlog is now in broader distributed runtime ergonomics and additional export or deployment tooling.
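The histogram-subtraction trick referenced above is generic: a parent node's histogram is the elementwise sum of its two children's histograms, so only one child needs to be accumulated and the sibling falls out by subtraction. A minimal pure-Python sketch of the idea (illustrative, not the device code):

```python
def build_histogram(bin_indices, gradients, n_bins):
    """Accumulate per-bin gradient sums for one feature."""
    hist = [0.0] * n_bins
    for b, g in zip(bin_indices, gradients):
        hist[b] += g
    return hist

bins = [0, 1, 1, 2, 0, 2]
grads = [0.5, -0.2, 0.3, 1.0, -0.1, 0.4]
goes_left = [True, True, False, False, True, False]

parent = build_histogram(bins, grads, 3)
left = build_histogram(
    [b for b, m in zip(bins, goes_left) if m],
    [g for g, m in zip(grads, goes_left) if m],
    3,
)
# The right child's histogram is recovered without a second pass over its rows.
right = [p - l for p, l in zip(parent, left)]
```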

Benchmark Snapshot

The heavy ordered-target-encoding playground-series-s6e4 replay was last measured on April 12, 2026 with the v0.1.11 source tree. The one-fold Kaggle source-build replay completed successfully with:

  • build 55.41s
  • fold preprocess 57.17s
  • fold fit 2107.10s
  • fold predict 5.89s
  • fold total 2170.17s
  • validation score 0.973213

Since that replay, the source tree has removed additional fit-memory overhead by sharing quantization schema per model, building compact train bins without a second host copy, releasing host train-bin storage after GPU upload, and adding GPU histogram subtraction plus device-side best-feature reduction.

Installation

For local development or source builds:

pip install .

Install development dependencies:

pip install -e .[dev]

Install the optional scikit-learn wrappers and ctboost.cv(...) support:

pip install -e .[sklearn]

Wheels vs Source Builds

pip install ctboost works without a compiler only when PyPI has a prebuilt wheel for your exact Python/OS tag. If no matching wheel exists, pip falls back to the source distribution and has to compile the native extension locally.

The release workflow is configured to publish CPU wheels for current CPython releases on Windows and Linux, plus macOS x86_64 CPU wheels for CPython 3.10 through 3.14, so standard pip install ctboost usage does not depend on a local compiler.

Each tagged GitHub release also attaches the CPU wheels, the source distribution, and dedicated Linux x86_64 CUDA wheels for CPython 3.10 through 3.14. The GPU wheel filenames carry a 1gpu build tag so the release can publish CPU and GPU artifacts for the same Python and platform tags without filename collisions.

The GPU release job installs the CUDA compiler plus the CUDA runtime development package, exports the toolkit paths into the build environment, and sets CTBOOST_REQUIRE_CUDA=ON so the wheel build fails instead of silently degrading to a CPU-only artifact. The release smoke test also checks that ctboost.build_info()["cuda_enabled"] is True before the GPU wheel is uploaded.

Kaggle GPU Install

pip install ctboost still resolves to the CPU wheel on PyPI. On Kaggle, install the matching GPU release wheel from GitHub instead:

import json
import subprocess
import sys
import urllib.request

tag = "v0.1.26"
py_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
api_url = f"https://api.github.com/repos/captnmarkus/ctboost/releases/tags/{tag}"

with urllib.request.urlopen(api_url) as response:
    release = json.load(response)

asset = next(
    (
        item
        for item in release["assets"]
        if item["name"].endswith(".whl") and f"-1gpu-{py_tag}-{py_tag}-" in item["name"]
    ),
    None,
)
if asset is None:
    raise RuntimeError(f"No matching {py_tag} GPU wheel found in release {tag}")

subprocess.check_call(
    [sys.executable, "-m", "pip", "install", "-U", asset["browser_download_url"]]
)

After installation, confirm the wheel really contains CUDA support:

import ctboost

info = ctboost.build_info()
if not info["cuda_enabled"]:
    raise RuntimeError(f"Expected a CUDA-enabled CTBoost wheel, got: {info}")
print(info)

CPU-Only Source Build

To force a CPU-only native build:

CMAKE_ARGS="-DCTBOOST_ENABLE_CUDA=OFF" pip install .

On PowerShell:

$env:CMAKE_ARGS="-DCTBOOST_ENABLE_CUDA=OFF"
pip install .

Windows source builds require a working C++ toolchain. In practice that means Visual Studio Build Tools 2022 or a compatible MSVC environment, plus CMake. ninja is recommended, but it does not replace the compiler itself.

CUDA Source Build

CTBoost can compile a CUDA backend when the CUDA toolkit and compiler are available. CUDA is enabled by default in CMake, but the build automatically falls back to CPU-only when no toolkit is detected.

pip install .

You can inspect the compiled package after installation:

import ctboost
print(ctboost.build_info())

Quick Start

scikit-learn Style Classification

import pandas as pd
from sklearn.datasets import make_classification

from ctboost import CTBoostClassifier

X, y = make_classification(
    n_samples=256,
    n_features=8,
    n_informative=5,
    n_redundant=0,
    random_state=13,
)
X = pd.DataFrame(X.astype("float32"), columns=[f"f{i}" for i in range(X.shape[1])])
X["segment"] = pd.Categorical(["a" if i % 2 == 0 else "b" for i in range(len(X))])
y = y.astype("float32")

model = CTBoostClassifier(
    iterations=256,
    learning_rate=0.1,
    max_depth=3,
    alpha=1.0,
    lambda_l2=1.0,
    task_type="CPU",
)

model.fit(
    X.iloc[:200],
    y[:200],
    eval_set=[(X.iloc[200:], y[200:])],
    early_stopping_rounds=20,
)
proba = model.predict_proba(X)
pred = model.predict(X)
importance = model.feature_importances_
best_iteration = model.best_iteration_

Low-Level Training API

import numpy as np

import ctboost

X = np.array([[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]], dtype=np.float32)
y = np.array([0.0, 1.0, 0.5], dtype=np.float32)

pool = ctboost.Pool(X, y)
booster = ctboost.train(
    pool,
    {
        "objective": "Huber",
        "learning_rate": 0.2,
        "max_depth": 2,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "max_bins": 64,
        "huber_delta": 1.5,
        "eval_metric": "MAE",
        "nan_mode": "Min",
        "task_type": "CPU",
    },
    num_boost_round=10,
)

predictions = booster.predict(pool)
loss_history = booster.loss_history
eval_loss_history = booster.eval_loss_history
exported_borders = booster.get_borders()

Per-feature quantization controls are available on the same low-level API:

booster = ctboost.train(
    pool,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "max_bins": 128,
        "max_bin_by_feature": {0: 16, 1: 8},
        "border_selection_method": "Uniform",
        "feature_borders": {1: [-0.5, 0.0, 0.5]},
        "nan_mode_by_feature": {0: "Max"},
    },
    num_boost_round=32,
)

feature_borders lets selected numeric features reuse explicit cut values, and max_bin_by_feature overrides the global max_bins budget per column. border_selection_method currently supports Quantile and Uniform, and Booster.get_borders() returns an importable border bundle keyed by fitted feature index.
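Conceptually, the two border_selection_method modes differ only in where cut points are placed: Uniform spaces them evenly over the value range, while Quantile follows the data distribution. A schematic pure-Python sketch of that distinction (not the native quantizer):

```python
def uniform_borders(values, max_bins):
    """Equally spaced cuts between the min and max value."""
    lo, hi = min(values), max(values)
    step = (hi - lo) / max_bins
    return [lo + step * i for i in range(1, max_bins)]

def quantile_borders(values, max_bins):
    """Cuts at evenly spaced ranks of the sorted values."""
    ordered = sorted(values)
    n = len(ordered)
    borders = []
    for i in range(1, max_bins):
        idx = min(max(round(i * n / max_bins), 1), n - 1)
        borders.append((ordered[idx - 1] + ordered[idx]) / 2)
    return sorted(set(borders))

data = [0.1, 0.2, 0.25, 0.3, 5.0, 9.5, 9.7, 9.9]
# Uniform cuts ignore the two clusters in `data`; quantile cuts track them.
```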

The same low-level API also exposes generic regularization and growth controls around the existing conditional tree learner:

booster = ctboost.train(
    pool,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 4,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "bootstrap_type": "Bayesian",
        "bagging_temperature": 1.0,
        "feature_weights": {0: 2.0, 3: 0.5},
        "first_feature_use_penalties": {2: 1.5},
        "random_strength": 0.2,
        "grow_policy": "LeafWise",
        "max_leaves": 16,
        "min_samples_split": 8,
        "max_leaf_weight": 2.0,
    },
    num_boost_round=64,
)

feature_weights rescales feature preference without replacing the conditional test, and first_feature_use_penalties discourages the first use of selected features at the model level. random_strength adds seeded noise to break near-ties in split gain after the conditional gate has already accepted a candidate. grow_policy="LeafWise" currently means a best-child-first heuristic under the existing max_leaves budget rather than a separate split criterion.
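The random_strength behavior described above amounts to perturbing candidate gains with seeded noise before picking a winner, so near-ties can resolve differently across seeds while remaining reproducible. A schematic sketch of those assumed semantics (not the native split code):

```python
import random

def pick_split(candidate_gains, random_strength, seed=0):
    """Return the index of the best candidate after seeded gain perturbation."""
    rng = random.Random(seed)
    noisy = [g + rng.gauss(0.0, random_strength) for g in candidate_gains]
    return max(range(len(noisy)), key=noisy.__getitem__)

gains = [0.500, 0.499, 0.200]
# With random_strength=0 the top candidate always wins; with noise, the
# near-tie between the first two candidates can flip, deterministically per seed.
print(pick_split(gains, random_strength=0.0))  # -> 0
```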

The same low-level API can now prepare raw categorical/text/embedding inputs directly:

import numpy as np
import ctboost

X = np.empty((4, 4), dtype=object)
X[:, 0] = ["berlin", "paris", "berlin", "rome"]
X[:, 1] = [1.0, 2.0, 1.5, 3.0]
X[:, 2] = ["red fox", "blue fox", "red hare", "green fox"]
X[:, 3] = [
    np.array([0.1, 0.4, 0.2], dtype=np.float32),
    np.array([0.7, 0.1, 0.3], dtype=np.float32),
    np.array([0.2, 0.5, 0.6], dtype=np.float32),
    np.array([0.9, 0.2, 0.4], dtype=np.float32),
]
y = np.array([0.5, 1.2, 0.7, 1.6], dtype=np.float32)

booster = ctboost.train(
    X,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "ordered_ctr": True,
        "one_hot_max_size": 4,
        "max_cat_threshold": 16,
        "cat_features": [0],
        "simple_ctr": ["Mean", "Frequency"],
        "per_feature_ctr": {0: ["Mean"]},
        "text_features": [2],
        "embedding_features": [3],
    },
    label=y,
    num_boost_round=32,
)

raw_predictions = booster.predict(X)

For disk-backed pool staging on large folds:

pool = ctboost.prepare_pool(
    X_numeric,
    y,
    external_memory=True,
    external_memory_dir="ctboost-cache",
)

booster = ctboost.train(
    X_numeric,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "external_memory": True,
        "external_memory_dir": "ctboost-cache",
    },
    label=y,
    num_boost_round=64,
)

Working With Categorical Features

Categorical columns can still be marked manually through the Pool API:

import numpy as np
import ctboost

X = np.array([[0.0], [1.0], [2.0], [3.0]], dtype=np.float32)
y = np.array([1.0, 0.0, 1.0, 0.0], dtype=np.float32)

pool = ctboost.Pool(X, y, cat_features=[0])

For pandas inputs, categorical/object columns are detected automatically:

import pandas as pd
import ctboost

frame = pd.DataFrame(
    {
        "value": [1.0, 2.0, 3.0, 4.0],
        "city": pd.Categorical(["berlin", "paris", "berlin", "rome"]),
        "segment": ["retail", "enterprise", "retail", "enterprise"],
    }
)
label = pd.Series([0.0, 1.0, 0.0, 1.0], dtype="float32")

pool = ctboost.Pool(frame, label)
assert pool.cat_features == [1, 2]

For estimator-side ordered CTRs, categorical crosses, one-hot expansion, rare-category bucketing, text hashing, and embedding expansion, use the Python feature pipeline parameters:

import numpy as np
import pandas as pd

from ctboost import CTBoostRegressor

frame = pd.DataFrame(
    {
        "city": ["berlin", "paris", "berlin", "rome"],
        "headline": ["red fox", "blue fox", "red hare", "green fox"],
        "embedding": [
            np.array([0.1, 0.4, 0.2], dtype=np.float32),
            np.array([0.7, 0.1, 0.3], dtype=np.float32),
            np.array([0.2, 0.5, 0.6], dtype=np.float32),
            np.array([0.9, 0.2, 0.4], dtype=np.float32),
        ],
        "value": [1.0, 2.0, 1.5, 3.0],
    }
)
label = np.array([0.5, 1.2, 0.7, 1.6], dtype=np.float32)

model = CTBoostRegressor(
    iterations=32,
    learning_rate=0.1,
    max_depth=3,
    ordered_ctr=True,
    one_hot_max_size=8,
    max_cat_threshold=32,
    cat_features=["city"],
    categorical_combinations=[["city", "headline"]],
    simple_ctr=["Mean", "Frequency"],
    per_feature_ctr={"city": ["Mean"]},
    text_features=["headline"],
    embedding_features=["embedding"],
)
model.fit(frame, label)

one_hot_max_size keeps low-cardinality categoricals as explicit indicator columns, and max_cat_threshold buckets higher-cardinality levels down to a capped native categorical domain before the conditional tree learner sees them. per_feature_ctr lets specific base features or categorical combinations opt into CTR generation without changing the underlying conditional split logic.
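The ordered-CTR idea behind simple_ctr=["Mean"] can be illustrated schematically: each row is encoded from the target statistics of earlier rows only, smoothed by a prior, so a row's encoding never leaks its own label. This is a sketch of the general CatBoost-style scheme, not ctboost's exact formula:

```python
def ordered_mean_ctr(categories, targets, prior=0.5, prior_weight=1.0):
    """Encode each row from the running target mean of earlier rows of its category."""
    sums, counts, encoded = {}, {}, []
    for cat, target in zip(categories, targets):
        s = sums.get(cat, 0.0)
        c = counts.get(cat, 0)
        # Smoothed mean over rows seen so far; the current target is excluded.
        encoded.append((s + prior * prior_weight) / (c + prior_weight))
        sums[cat] = s + target
        counts[cat] = c + 1
    return encoded

print(ordered_mean_ctr(["a", "a", "b", "a"], [1.0, 0.0, 1.0, 1.0]))
# -> [0.5, 0.75, 0.5, 0.5]
```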

Model Persistence, Warm Start, And Cross-Validation

import ctboost

booster.save_model("regression-model.json")
restored = ctboost.load_model("regression-model.json")
restored_predictions = restored.predict(pool)

continued = ctboost.train(
    pool,
    {"objective": "RMSE", "learning_rate": 0.2, "max_depth": 2, "alpha": 1.0, "lambda_l2": 1.0},
    num_boost_round=10,
    init_model=restored,
)

cv_result = ctboost.cv(
    pool,
    {
        "objective": "RMSE",
        "learning_rate": 0.2,
        "max_depth": 2,
        "alpha": 1.0,
        "lambda_l2": 1.0,
    },
    num_boost_round=25,
    nfold=3,
)

The scikit-learn compatible estimators also expose:

  • save_model(...)
  • load_model(...)
  • staged_predict(...)
  • staged_predict_proba(...) for classifiers
  • predict_leaf_index(...)
  • predict_contrib(...)
  • evals_result_
  • best_score_
  • sample_weight on fit(...)
  • class_weight, scale_pos_weight, eval_metric, nan_mode, nan_mode_by_feature, and warm_start
  • max_bins, max_bin_by_feature, border_selection_method, and feature_borders
  • bagging_temperature, feature_weights, first_feature_use_penalties, random_strength, grow_policy, min_samples_split, and max_leaf_weight
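Staged prediction, conceptually, yields the model output after each boosting iteration: the running sum of learning-rate-scaled per-tree contributions on top of the base score. A schematic sketch of that accumulation for a single row (illustrative, not the estimator internals):

```python
def staged_outputs(tree_outputs, learning_rate, base_score=0.0):
    """Yield the cumulative raw prediction after each boosting iteration.

    tree_outputs: the fitted trees' raw outputs for one row, in iteration order.
    """
    score = base_score
    for out in tree_outputs:
        score += learning_rate * out
        yield score

stages = list(staged_outputs([2.0, 1.0, -0.5], learning_rate=0.1))
print(stages)
```

The last staged value equals the full model's prediction, which is why truncated prediction via num_iteration and staged_predict(...) agree at matching iteration counts.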

Public Python API

The main entry points are:

  • ctboost.Pool
  • ctboost.FeaturePipeline
  • ctboost.prepare_pool
  • ctboost.train
  • ctboost.cv
  • ctboost.Booster
  • ctboost.CTBoostClassifier
  • ctboost.CTBoostRanker
  • ctboost.CTBoostRegressor
  • ctboost.CBoostClassifier
  • ctboost.CBoostRanker
  • ctboost.CBoostRegressor
  • ctboost.build_info
  • ctboost.load_model

Build and Test

Run the test suite:

pytest tests

The latest local release-candidate validation on April 13, 2026 was run with:

python -m pytest -q

Build an sdist:

python -m build --sdist

Configure and build the native extension directly with CMake:

python -m pip install pybind11 numpy pandas scikit-learn pytest
cmake -S . -B build -DCTBOOST_ENABLE_CUDA=OFF -Dpybind11_DIR="$(python -m pybind11 --cmakedir)"
cmake --build build --config Release --parallel

Wheel builds are configured through cibuildwheel for:

  • Windows amd64
  • Linux x86_64 and aarch64 using the current manylinux baseline
  • macOS x86_64
  • CPython 3.8, 3.9, 3.10, 3.11, 3.12, 3.13, and 3.14

GitHub Actions workflows:

  • .github/workflows/cmake.yml: configures, builds, installs, and tests CPU builds on Ubuntu, Windows, and macOS for pushes and pull requests
  • .github/workflows/publish.yml: builds release wheels and the sdist, runs wheel smoke tests on built artifacts, publishes CPU wheels to PyPI, and attaches both CPU and Linux GPU wheels to tagged GitHub releases

The standard PyPI release wheel workflow builds CPU-only wheels by setting:

cmake.define.CTBOOST_ENABLE_CUDA=OFF

The Linux GPU release-wheel matrix enables CUDA separately with:

cmake.define.CTBOOST_ENABLE_CUDA=ON
cmake.define.CTBOOST_REQUIRE_CUDA=ON
cmake.define.CMAKE_CUDA_COMPILER=/usr/local/cuda-12.0/bin/nvcc
cmake.define.CUDAToolkit_ROOT=/usr/local/cuda-12.0
cmake.define.CMAKE_CUDA_ARCHITECTURES=60;70;75;80;86;89
wheel.build-tag=1gpu

Project Layout

ctboost/      Python API layer
include/      public C++ headers
src/core/     core boosting, objectives, trees, statistics
src/bindings/ pybind11 extension bindings
cuda/         optional CUDA backend
tests/        Python test suite

License

Apache 2.0. See LICENSE.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ctboost-0.1.26.tar.gz (302.8 kB view details)

Uploaded Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

ctboost-0.1.26-cp314-cp314-win_amd64.whl (472.8 kB view details)

Uploaded CPython 3.14Windows x86-64

ctboost-0.1.26-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (657.3 kB view details)

Uploaded CPython 3.14manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.26-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (591.5 kB view details)

Uploaded CPython 3.14manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

ctboost-0.1.26-cp314-cp314-macosx_10_15_x86_64.whl (505.7 kB view details)

Uploaded CPython 3.14macOS 10.15+ x86-64

ctboost-0.1.26-cp313-cp313-win_amd64.whl (459.9 kB view details)

Uploaded CPython 3.13Windows x86-64

ctboost-0.1.26-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (656.8 kB view details)

Uploaded CPython 3.13manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.26-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (590.2 kB view details)

Uploaded CPython 3.13manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

ctboost-0.1.26-cp313-cp313-macosx_10_15_x86_64.whl (505.4 kB view details)

Uploaded CPython 3.13macOS 10.15+ x86-64

ctboost-0.1.26-cp312-cp312-win_amd64.whl (459.9 kB view details)

Uploaded CPython 3.12Windows x86-64

ctboost-0.1.26-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (657.2 kB view details)

Uploaded CPython 3.12manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.26-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (590.6 kB view details)

Uploaded CPython 3.12manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

ctboost-0.1.26-cp312-cp312-macosx_10_15_x86_64.whl (505.3 kB view details)

Uploaded CPython 3.12macOS 10.15+ x86-64

ctboost-0.1.26-cp311-cp311-win_amd64.whl (459.7 kB view details)

Uploaded CPython 3.11Windows x86-64

ctboost-0.1.26-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (653.6 kB view details)

Uploaded CPython 3.11manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.26-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (588.3 kB view details)

Uploaded CPython 3.11manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

ctboost-0.1.26-cp311-cp311-macosx_10_15_x86_64.whl (503.3 kB view details)

Uploaded CPython 3.11macOS 10.15+ x86-64

ctboost-0.1.26-cp310-cp310-win_amd64.whl (458.7 kB view details)

Uploaded CPython 3.10Windows x86-64

ctboost-0.1.26-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (651.5 kB view details)

Uploaded CPython 3.10manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.26-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (586.1 kB view details)

Uploaded CPython 3.10manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

ctboost-0.1.26-cp310-cp310-macosx_10_15_x86_64.whl (501.6 kB view details)

Uploaded CPython 3.10macOS 10.15+ x86-64

ctboost-0.1.26-cp39-cp39-win_amd64.whl (464.7 kB view details)

Uploaded CPython 3.9Windows x86-64

ctboost-0.1.26-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (650.2 kB view details)

Uploaded CPython 3.9manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.26-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (584.8 kB view details)

Uploaded CPython 3.9manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

ctboost-0.1.26-cp38-cp38-win_amd64.whl (458.2 kB view details)

Uploaded CPython 3.8Windows x86-64

ctboost-0.1.26-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (649.5 kB view details)

Uploaded CPython 3.8manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.26-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (584.4 kB view details)

Uploaded CPython 3.8manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

File details

Details for the file ctboost-0.1.26.tar.gz.

File metadata

  • Download URL: ctboost-0.1.26.tar.gz
  • Upload date:
  • Size: 302.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.26.tar.gz
Algorithm Hash digest
SHA256 e99497c1ca38dc26e98e48a3c720b4bfead43ca808059c7cc67e71ad87cef220
MD5 0fe9da9724fdfdb12790e6b65437e025
BLAKE2b-256 141896902d2743ea3a22bf48312a71061e3fe9be9a4313e8b4a634c98ffc7032

See more details on using hashes here.

Provenance

The following attestation bundles were made for ctboost-0.1.26.tar.gz:

Publisher: publish.yml on captnmarkus/ctboost

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ctboost-0.1.26-cp314-cp314-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.26-cp314-cp314-win_amd64.whl
  • Upload date:
  • Size: 472.8 kB
  • Tags: CPython 3.14, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.26-cp314-cp314-win_amd64.whl
Algorithm Hash digest
SHA256 e6307426c6a7af2f3e45645f947bc5df27aaebcee50cd8958bf704fa4ac37c7b
MD5 73978e6dc01c7fa921113bb34e2e3d44
BLAKE2b-256 21da9dd62cc2a4d93cd19c6cd38eca3e3bb40b5d54e44c361a00f9aa4a540571

See more details on using hashes here.

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp314-cp314-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ctboost-0.1.26-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 b56fe10733eaab832602959fb8c98d982771fcfad68ceb33e71c4b7756585462
MD5 999cda9d0d69f8d5d0244286fbbd0e1b
BLAKE2b-256 503710814525225de5d7e2f60e10a627d1bd48f8010128b3cce5531327758021

See more details on using hashes here.

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ctboost-0.1.26-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 7f279e6a91061bf2e7ea73552118512dc6ab74d77ae2d873324248f312addf3e
MD5 bc434987622ba3f58951d6eff6094b91
BLAKE2b-256 240d183e9104fa1c122819f228b3fdade2cd6466d1711cb91173432b50e00f76

See more details on using hashes here.

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ctboost-0.1.26-cp314-cp314-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp314-cp314-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 0490779d3dd4684bf9eeabe3491f4e792f90d5f8ed793c0e7bdb64a90392e0d5
MD5 d7105a958def31804a6afc4aea9e1ea5
BLAKE2b-256 55653a3a19e21350f9f3bea78b3b4769ed85e4a48d4c6335e653cd7d0163c7f5

See more details on using hashes here.

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp314-cp314-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ctboost-0.1.26-cp313-cp313-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.26-cp313-cp313-win_amd64.whl
  • Upload date:
  • Size: 459.9 kB
  • Tags: CPython 3.13, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.26-cp313-cp313-win_amd64.whl
Algorithm Hash digest
SHA256 196fbd1226e71bf03b1e5835ba6d7226d29ab0b6dbdad8b749d1c3d3af451671
MD5 98da2c4dee1fedb96e50af70ecddcc7f
BLAKE2b-256 0a3db1e40776caed91944bb0578f5ebf30ec55f1225c6d8cc83db03c275dee5b

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp313-cp313-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 43215941363f2296b6485854d509c9fa89a2d4092b3523651b21fcabf0bb9a41
MD5 7e866a75c1b63309fa284048527a81bd
BLAKE2b-256 07906fbfd0f19785a32c5ff9fb0ddcffb7e329bad176cbb7ef5c3e6955ffeff2

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 fb5ae6e8711a2a1fd01e931ad18933fdbdc781d6223d254158f98dd3221b2b33
MD5 5417e328b8e2eb8ffb79597a9c0de5c8
BLAKE2b-256 9f015eef9e95214dca5c22b88aeea8cbbcce9ef887387008f42b8bc091790986

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp313-cp313-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp313-cp313-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 b44d7b18b18ee40166a93f6fc06f17f6a6f661a20102773489972025228817e4
MD5 d9ec43686ef3fd7d56ae5743ed7f2e09
BLAKE2b-256 fd9b81cba84946b4495e996a73a016b3de1987523fb3bb0d404a330438199bee

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp313-cp313-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp312-cp312-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.26-cp312-cp312-win_amd64.whl
  • Upload date:
  • Size: 459.9 kB
  • Tags: CPython 3.12, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.26-cp312-cp312-win_amd64.whl
Algorithm Hash digest
SHA256 8cca3ef3fac3a43e00bdf671d8c041d8e282d39497dc54ab84038407ffe7a33d
MD5 f3b1423c576beddc5e17a0745a4c1b12
BLAKE2b-256 e3401dbb763f21df1fd65ff9d92ca1d7ff489a7343391792a4ca1f4372c0b782

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp312-cp312-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 4857b595bd18d5a863c56c82470b4fe4030575ac928c36d706b3a420a354e4d7
MD5 72c8d23a74dba6d8c07dd8f73935d7cd
BLAKE2b-256 c5a3cdd6b2c7e4cda9aa2e52c500f0d44ab376e0f9b1e7901299c843ec16605e

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 3054cac8e3fce037d7a6114e5483b731d6c5cfbc09faa8da39dc9eb3572f34eb
MD5 11ec9fcfcf8517d549f2813f31a968b8
BLAKE2b-256 c87d4f16f843e53b458000857391ff7cfbbde6ba2f88eb20c91d6c7af66e375f

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp312-cp312-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp312-cp312-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 7ec4f8553e73dcd211cedb35c7480bcb240f2f50ddfba7f71e1a9446627b1e12
MD5 01f71c70db69cc9568b7e4ffabd1de35
BLAKE2b-256 c28142754a004ea6ac48b70aaaadbb17ed1213742642b4b8544959e60aa2a76e

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp312-cp312-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp311-cp311-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.26-cp311-cp311-win_amd64.whl
  • Upload date:
  • Size: 459.7 kB
  • Tags: CPython 3.11, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.26-cp311-cp311-win_amd64.whl
Algorithm Hash digest
SHA256 955b78f6f467e45dfe6ca0971b5983a570e6f558bf3fee10f727ad456e64b68c
MD5 f1f6806d64be4ee66fe668be185f8ab1
BLAKE2b-256 0bf61a21c9c752d4814e4c89357f0287f7f2b6d7921d11c77f427dfc565aa67a

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp311-cp311-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 48195f8dc6696b42fb04086b73153a41011090cdfc21a257fd832ec635212c22
MD5 7b5985212da115c27ca8a8b80495311e
BLAKE2b-256 fc7f9d31def801c146a4ee3c9b08ae0953af321c3e99632b9bdc6b2461828acf

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 94952bda2dcb5c354130ff34d54a176d3efba896cbeb6ab1b060ac1302e86d12
MD5 dabc093c4aeb192818b12f6bb245396c
BLAKE2b-256 5ba72f818e3ef1c1a514ee7a812f7a9e43489beb0ac364502f84c1a38266b22d

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp311-cp311-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp311-cp311-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 c59960fe6160867400944bda00adfa0534ff3cdcfab8a80690b35290d28d5740
MD5 bd947753dae1759c22d6113b1d3e9aa1
BLAKE2b-256 f75f68120f59ba23ef7e01dde6154a65a241addea7a600e0b77130a9dfbb7b78

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp311-cp311-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp310-cp310-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.26-cp310-cp310-win_amd64.whl
  • Upload date:
  • Size: 458.7 kB
  • Tags: CPython 3.10, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.26-cp310-cp310-win_amd64.whl
Algorithm Hash digest
SHA256 2c65b78ad28fde6d66151eb2b712f2b7b86b56f899511dcdd2d9efb44f3e94a3
MD5 d795f1501a4a591c1c8058b990bb79e8
BLAKE2b-256 dae80454688d754654c4300e97aa24701fa9cbcef1b6fe00da2b0da3cefcc839

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp310-cp310-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 d5018e0b07977d1e2cefadbe693bc0373c8b17a906412ed40a8b93074ac60c75
MD5 44a87295849f1e599b117745c6cc6d63
BLAKE2b-256 253294862c7c467946a784ce109aa454a58df60877673c469a463deb01b0c849

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 c9159cdec16a95846a35905c3cd26e6430a2e91da7de69d141176048024195fd
MD5 046d447e968170683df8d29b8a1e5e5e
BLAKE2b-256 d3fb69a71f66400595282de8850e267136c0ff19d6708532b9c666a241b81e43

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp310-cp310-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp310-cp310-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 8fa588778b823ba5dd260fc3780ab8ca509be6bfbce3cf4f7abc58d0cc2cab2a
MD5 ae89bc32a41e3bf94a4ef324e29f6564
BLAKE2b-256 1f5f823e4de2bde8c4e260e65da1cf0f222915e3830611464c4bd74db5be3f63

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp310-cp310-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp39-cp39-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.26-cp39-cp39-win_amd64.whl
  • Upload date:
  • Size: 464.7 kB
  • Tags: CPython 3.9, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.26-cp39-cp39-win_amd64.whl
Algorithm Hash digest
SHA256 5ff2c2ea6f50b3d4f8ace51e5ae4aceae32a3091a4682712177eca837c438d85
MD5 faf9c5a48ad2b2ea9d4211d4050391c8
BLAKE2b-256 91a3d466f1bc3e742b8a5290c88621d2dca13f4f3e3f0f3014035662b7935b06

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp39-cp39-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 66c9440c65b5a8d310be421dc21244ee94a249f49ce3b259953e9b7fa221630d
MD5 44f79540b011d5386b5173be45e20cbb
BLAKE2b-256 7001f303473d5d755e51196d7ff6f3d9a6365845f6d7fc8b5c1dff8eb652fe5b

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 1dc4c58b43b46f977e3928031dfc6737135600bcde46957538326dc279e067e4
MD5 f885312daad5efb1f7eda7db1c5725d4
BLAKE2b-256 4db46c629a058c4b5995ddd35fee6ecde0cde2a99366f4d7fb420ec59af9296a

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp38-cp38-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.26-cp38-cp38-win_amd64.whl
  • Upload date:
  • Size: 458.2 kB
  • Tags: CPython 3.8, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.26-cp38-cp38-win_amd64.whl
Algorithm Hash digest
SHA256 fcc9d2898182bdc85a7ad344f471dba7b9f052f5842fa9dc1d48c4f51a95a874
MD5 5d4bd8cbab9d6118d4ef349366b70e5d
BLAKE2b-256 56bdc52e134487a1f95ed66b3c8dc9c7273ac6ca19485a10b8dc3f53d1d97b63

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp38-cp38-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 4325806e8d0a11c04702281c47214e0275cf53a70784ad072582ab5db75c1023
MD5 74355d845200ff631d938da155acaaa2
BLAKE2b-256 ebba937d3750893d0a307624ec78b5142368a72d664cb6db4be44e56dd24d99c

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost

File details

Details for the file ctboost-0.1.26-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.26-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 8147c164fdefb4f612730da58a8f8869d852d7800fba87e30a5c9a2cf355eb46
MD5 5b27b8910092d53953182617e494ddb9
BLAKE2b-256 099f4da42792baad40b4bd9772f24dcdd0a4753082645b8ca1bf9a6d7aa97955

Provenance

The following attestation bundles were made for ctboost-0.1.26-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost
