A GPU-accelerated gradient boosting library using Conditional Inference Trees.

Project description

CTBoost

CTBoost is a gradient boosting library built around Conditional Inference Trees, with a native C++17 core, Python bindings via pybind11, optional CUDA support for source builds, and an optional scikit-learn style API.

The current codebase supports end-to-end training and prediction for regression, classification, grouped ranking, and survival. It ingests pandas and SciPy sparse data without dense expansion and accepts richer ranking metadata (group_weight, subgroup_id, explicit pairs, and pairs_weight), baseline raw-score inputs, row weights, and class imbalance controls. It also provides explicit missing-value handling, configurable validation metrics, stable JSON model persistence, standalone Python export for prepared numeric models, staged prediction, warm-start continuation, a native C++ feature pipeline for categorical/text/embedding transforms with thin Python wrappers, reusable prepared training-data bundles, and a built-in cross-validation helper.

Current Status

  • Language mix: Python + C++17, with optional CUDA
  • Python support: 3.8 through 3.14
  • Packaging: scikit-build-core
  • CI/CD: GitHub Actions for CMake validation and cibuildwheel release builds
  • Repository version: 0.1.39
  • Status: actively evolving native + Python package

What Works Today

  • Native gradient boosting backend exposed as ctboost._core
  • Pool abstraction for dense tabular data, SciPy sparse input, categorical feature indices, optional ranking/query metadata (group_id, group_weight, subgroup_id), explicit ranking pairs plus pairs_weight, and optional baseline raw scores
  • Native pandas DataFrame and Series support
  • Automatic categorical detection for pandas category and object columns
  • Regression training with ctboost.train(...), including raw array/DataFrame inputs plus optional preprocessing and external-memory staging
  • scikit-learn compatible CTBoostClassifier, CTBoostRegressor, and CTBoostRanker when scikit-learn is installed
  • Binary and multiclass classification
  • Grouped ranking with PairLogit, NDCG, MAP, MRR, explicit ranking pairs, subgroup-aware auto-pair generation, and query-level weights
  • Row weights through Pool(..., weight=...) and sample_weight on sklearn estimators
  • Class imbalance controls through class_weight, class_weights, auto_class_weights="balanced", and scale_pos_weight
  • Explicit missing-value handling through nan_mode
  • Quantization controls through max_bins, max_bin_by_feature, border_selection_method, feature_borders, and nan_mode_by_feature
  • Row subsampling through subsample plus bootstrap_type="No"|"Bernoulli"|"Poisson"
  • Bayesian bagging through bootstrap_type="Bayesian" plus bagging_temperature
  • boosting_type="RandomForest" on top of the existing conditional-inference tree learner
  • boosting_type="DART" with dropout-style tree normalization on top of the existing conditional-inference tree learner
  • Monotonic constraints through monotone_constraints
  • Path-level interaction constraints through interaction_constraints
  • Additional generic regularization and tree-growth controls through feature_weights, first_feature_use_penalties, random_strength, grow_policy, min_samples_split, and max_leaf_weight
  • GPU tree growth now also supports monotonic constraints, interaction constraints, feature_weights, first_feature_use_penalties, random_strength, and grow_policy="LeafWise" without replacing the conditional-inference split gate
  • Survival objectives: Cox, SurvivalExponential
  • Survival evaluation through CIndex
  • Early stopping with eval_set, eval_names, early_stopping_rounds, early_stopping_metric, and early_stopping_name
  • Single- and multi-watchlist evaluation through one or many eval_set entries
  • Single- and multi-metric evaluation through string or sequence eval_metric values
  • Per-iteration callback hooks through callbacks, plus built-in ctboost.log_evaluation(...) and ctboost.checkpoint_callback(...)
  • Validation loss/metric history and evals_result_
  • Per-iteration prediction through staged prediction and num_iteration
  • Stable JSON and pickle model persistence for low-level boosters and scikit-learn style estimators
  • Cross-validation with ctboost.cv(...) when scikit-learn is installed
  • Regression objectives: RMSE, MAE, Huber, Quantile, Poisson, Tweedie
  • Generic eval metrics including RMSE, MAE, Poisson, Tweedie, Accuracy, BalancedAccuracy, Precision, Recall, F1, AUC, NDCG, MAP, MRR, and CIndex
  • Native ctboost.FeaturePipeline logic in _core.NativeFeaturePipeline, with low-level and sklearn integration for ordered CTRs, frequency-style CTRs, categorical crosses, low-cardinality one-hot expansion, rare-category bucketing, text hashing, and embedding-stat expansion
  • Generic categorical controls around the existing conditional tree learner: one_hot_max_size / max_cat_to_onehot, max_cat_threshold, simple_ctr, combinations_ctr, and per_feature_ctr
  • ctboost.prepare_pool(...) for low-level raw-data preparation, optional feature-pipeline fitting, and disk-backed external-memory pool staging
  • ctboost.prepare_training_data(...) plus PreparedTrainingData for one-time raw train/eval preparation that can be reused across repeated fits
  • Native CPU out-of-core fit through ctboost.train(..., external_memory=True), which now spills quantized feature-bin columns to disk instead of keeping the full histogram matrix resident in RAM
  • Multi-host distributed training through distributed_world_size, distributed_rank, distributed_root, and distributed_run_id, with a native per-node histogram reduction path and a TCP collective backend available through distributed_root="tcp://host:port"
  • Distributed eval_set, multi-watchlist or multi-metric evaluation, callbacks, early_stopping_rounds, init_model, grouped ranking shards, and sklearn-estimator wrappers on the TCP collective backend
  • Filesystem-backed distributed runs now also fall back to a rank-0 coordinator path for advanced eval, callback, ranking, and GPU compatibility flows when TCP is not configured
  • Distributed GPU training when CUDA is available and distributed_root uses the TCP collective backend
  • Distributed raw-data feature-pipeline fitting across ranks for native categorical, text, and embedding preprocessing
  • Feature importance reporting
  • Leaf-index introspection and path-based prediction contributions
  • Continued training through init_model and estimator warm_start
  • Standalone pure-Python deployment export through Booster.export_model(..., export_format="python") and matching sklearn-estimator wrappers for numeric or already-prepared features
  • Build metadata reporting through ctboost.build_info()
  • CPU builds on standard CI runners
  • Optional CUDA compilation when building from source with a suitable toolkit
  • GPU source builds now keep fit-scoped histogram data resident on device, support shared-memory histogram accumulation, and expose GPU raw-score prediction for regression, binary classification, and multiclass models
  • Histogram building now writes directly into final-width compact storage when the fitted schema permits <=256 bins, avoiding the old transient uint16 -> uint8 duplication spike
  • Fitted models now store quantization metadata once per booster instead of duplicating the same schema in every tree
  • Low-level boosters can export reusable fitted borders through Booster.get_borders() and expose the full shared quantization schema through Booster.get_quantization_schema()
  • GPU fit now drops the host training histogram bin matrix immediately after the device histogram workspace has been created and warm-start predictions have been seeded
  • GPU tree building now uses histogram subtraction in the device path as well, so only one child histogram is built explicitly after each split
  • GPU node search now keeps best-feature selection on device and returns a compact winner instead of copying the full per-feature search buffer back to host each node
  • Training can emit native histogram/tree timing via verbose=True or CTBOOST_PROFILE=1

Current Limitations

  • Ordered CTRs, frequency-style CTRs, categorical crosses, low-cardinality one-hot expansion, rare-category bucketing, text hashing, and embedding expansion now run through a native C++ pipeline, while pandas extraction, raw-data routing, and Pool orchestration remain thin Python glue. ctboost.prepare_training_data(...) reduces that repeated Python work when you need to fit multiple times on the same raw train/eval split
  • There is now a native sparse training path plus disk-backed quantized-bin staging through ctboost.train(..., external_memory=True) on both CPU and GPU, and distributed training can also use a standalone TCP collective coordinator through distributed_root="tcp://host:port"
  • The legacy filesystem-based distributed path still exists for the native shard-reduction path; advanced eval, callback, ranking, and GPU compatibility workflows now fall back to a rank-0 coordinator path, while the TCP backend remains the true multi-rank path for those features
  • Distributed grouped/ranking training requires each group_id to live entirely on one worker shard; cross-rank query groups are rejected
  • Distributed multi-host training does not yet accept group_weight, subgroup_id, or explicit pairs / pairs_weight metadata on rank-local pools
  • Dedicated GPU wheel automation now targets Linux x86_64 and Windows amd64 CPython 3.10 through 3.14 release assets
  • CUDA wheel builds in CI depend on container-side toolkit provisioning

Resolved Fold-Memory Hotspots

The older v0.1.15 GPU fit-memory bottleneck list is now closed in the current tree:

  • Quantization metadata is stored once per fitted booster and shared by all trees instead of being duplicated per tree
  • GPU fit releases the host training histogram bin matrix immediately after device workspace creation and warm-start seeding
  • GPU tree growth uses histogram subtraction, so only one child histogram is built explicitly after a split
  • GPU split search keeps best-feature selection on device and copies back only the winning feature summary

That means the old per-node GPU bin-materialization issue is no longer the main resident-memory problem in the current codebase. The remaining generic backlog is now in broader distributed runtime ergonomics and additional export or deployment tooling.

Benchmark Snapshot

The heavy ordered-target-encoding playground-series-s6e4 replay was last measured on April 12, 2026 with the v0.1.11 source tree. The one-fold Kaggle source-build replay completed successfully with:

  • build 55.41s
  • fold preprocess 57.17s
  • fold fit 2107.10s
  • fold predict 5.89s
  • fold total 2170.17s
  • validation score 0.973213

Since that replay, the source tree has removed additional fit-memory overhead by sharing quantization schema per model, building compact train bins without a second host copy, releasing host train-bin storage after GPU upload, and adding GPU histogram subtraction plus device-side best-feature reduction.

Installation

For local development or source builds:

pip install .

Install development dependencies:

pip install -e .[dev]

Install the optional scikit-learn wrappers and ctboost.cv(...) support:

pip install -e .[sklearn]

Wheels vs Source Builds

pip install ctboost works without a compiler only when PyPI has a prebuilt wheel for your exact Python/OS tag. If no matching wheel exists, pip falls back to the source distribution and has to compile the native extension locally.

The release workflow is configured to publish CPU wheels for current CPython releases on Windows and Linux, plus macOS x86_64 CPU wheels for CPython 3.10 through 3.14, so standard pip install ctboost usage does not depend on a local compiler.

Each tagged GitHub release also attaches the CPU wheels, the source distribution, and dedicated Linux x86_64 plus Windows amd64 CUDA wheels for CPython 3.10 through 3.14. The GPU wheel filenames carry a 1gpu build tag so the release can publish CPU and GPU artifacts for the same Python and platform tags without filename collisions.

The GPU release jobs install the CUDA toolkit in CI, export the toolkit paths into the build environment, and set CTBOOST_REQUIRE_CUDA=ON so the wheel build fails instead of silently degrading to a CPU-only artifact. The release smoke test also checks that ctboost.build_info()["cuda_enabled"] is True before the GPU wheel is uploaded.

Kaggle GPU Install

pip install ctboost still resolves to the CPU wheel on PyPI. On Kaggle, install the matching GPU release wheel from GitHub instead:

import json
import subprocess
import sys
import urllib.request

tag = "v0.1.39"
py_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
api_url = f"https://api.github.com/repos/captnmarkus/ctboost/releases/tags/{tag}"

with urllib.request.urlopen(api_url) as response:
    release = json.load(response)

asset = next(
    (
        item
        for item in release["assets"]
        if item["name"].endswith(".whl") and f"-1gpu-{py_tag}-{py_tag}-" in item["name"]
    ),
    None,
)
if asset is None:
    raise RuntimeError(f"No GPU wheel asset for {py_tag} in release {tag}")

subprocess.check_call(
    [sys.executable, "-m", "pip", "install", "-U", asset["browser_download_url"]]
)

After installation, confirm the wheel really contains CUDA support:

import ctboost

info = ctboost.build_info()
if not info["cuda_enabled"]:
    raise RuntimeError(f"Expected a CUDA-enabled CTBoost wheel, got: {info}")
print(info)

CPU-Only Source Build

To force a CPU-only native build:

CMAKE_ARGS="-DCTBOOST_ENABLE_CUDA=OFF" pip install .

On PowerShell:

$env:CMAKE_ARGS="-DCTBOOST_ENABLE_CUDA=OFF"
pip install .

Windows source builds require a working C++ toolchain. In practice that means Visual Studio Build Tools 2022 or a compatible MSVC environment, plus CMake. ninja is recommended, but it does not replace the compiler itself.

CUDA Source Build

CTBoost can compile a CUDA backend when the CUDA toolkit and compiler are available. CUDA is enabled by default in CMake, but the build automatically falls back to CPU-only when no toolkit is detected.

pip install .

You can inspect the compiled package after installation:

import ctboost
print(ctboost.build_info())

Quick Start

scikit-learn Style Classification

import pandas as pd
from sklearn.datasets import make_classification

from ctboost import CTBoostClassifier

X, y = make_classification(
    n_samples=256,
    n_features=8,
    n_informative=5,
    n_redundant=0,
    random_state=13,
)
X = X.astype("float32")
X = pd.DataFrame(X, columns=[f"f{i}" for i in range(X.shape[1])])
X["segment"] = pd.Categorical(["a" if i % 2 == 0 else "b" for i in range(len(X))])
y = y.astype("float32")

model = CTBoostClassifier(
    iterations=256,
    learning_rate=0.1,
    max_depth=3,
    alpha=1.0,
    lambda_l2=1.0,
    task_type="CPU",
)

model.fit(
    X.iloc[:200],
    y[:200],
    eval_set=[(X.iloc[200:], y[200:])],
    early_stopping_rounds=20,
)
proba = model.predict_proba(X)
pred = model.predict(X)
importance = model.feature_importances_
best_iteration = model.best_iteration_

Low-Level Training API

import numpy as np

import ctboost

X = np.array([[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]], dtype=np.float32)
y = np.array([0.0, 1.0, 0.5], dtype=np.float32)

pool = ctboost.Pool(X, y)
booster = ctboost.train(
    pool,
    {
        "objective": "Huber",
        "learning_rate": 0.2,
        "max_depth": 2,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "max_bins": 64,
        "huber_delta": 1.5,
        "eval_metric": "MAE",
        "nan_mode": "Min",
        "task_type": "CPU",
    },
    num_boost_round=10,
)

predictions = booster.predict(pool)
loss_history = booster.loss_history
eval_loss_history = booster.eval_loss_history
exported_borders = booster.get_borders()

Per-feature quantization controls are available on the same low-level API:

booster = ctboost.train(
    pool,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "max_bins": 128,
        "max_bin_by_feature": {0: 16, 1: 8},
        "border_selection_method": "Uniform",
        "feature_borders": {1: [-0.5, 0.0, 0.5]},
        "nan_mode_by_feature": {0: "Max"},
    },
    num_boost_round=32,
)

feature_borders lets selected numeric features reuse explicit cut values, and max_bin_by_feature overrides the global max_bins budget per column. border_selection_method currently supports Quantile and Uniform. Booster.get_borders() returns an importable border bundle keyed by fitted feature index.

The same low-level API also exposes generic regularization and growth controls around the existing conditional tree learner:

booster = ctboost.train(
    pool,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 4,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "bootstrap_type": "Bayesian",
        "bagging_temperature": 1.0,
        "feature_weights": {0: 2.0, 3: 0.5},
        "first_feature_use_penalties": {2: 1.5},
        "random_strength": 0.2,
        "grow_policy": "LeafWise",
        "max_leaves": 16,
        "min_samples_split": 8,
        "max_leaf_weight": 2.0,
    },
    num_boost_round=64,
)

feature_weights rescales feature preference without replacing the conditional test, and first_feature_use_penalties discourages the first use of selected features at the model level. random_strength adds seeded noise to break near-ties in split gain after the conditional gate has already accepted a candidate. grow_policy="LeafWise" currently means a best-child-first heuristic under the existing max_leaves budget rather than a separate split criterion.

The same low-level API can now prepare raw categorical/text/embedding inputs directly:

import numpy as np
import ctboost

X = np.empty((4, 4), dtype=object)
X[:, 0] = ["berlin", "paris", "berlin", "rome"]
X[:, 1] = [1.0, 2.0, 1.5, 3.0]
X[:, 2] = ["red fox", "blue fox", "red hare", "green fox"]
X[:, 3] = [
    np.array([0.1, 0.4, 0.2], dtype=np.float32),
    np.array([0.7, 0.1, 0.3], dtype=np.float32),
    np.array([0.2, 0.5, 0.6], dtype=np.float32),
    np.array([0.9, 0.2, 0.4], dtype=np.float32),
]
y = np.array([0.5, 1.2, 0.7, 1.6], dtype=np.float32)

booster = ctboost.train(
    X,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "ordered_ctr": True,
        "one_hot_max_size": 4,
        "max_cat_threshold": 16,
        "cat_features": [0],
        "simple_ctr": ["Mean", "Frequency"],
        "per_feature_ctr": {0: ["Mean"]},
        "text_features": [2],
        "embedding_features": [3],
    },
    label=y,
    num_boost_round=32,
)

raw_predictions = booster.predict(X)

If you want to reuse the raw-data preparation work across repeated fits on the same split, prepare it once and then train against the prepared bundle:

prepared = ctboost.prepare_training_data(
    X_train,
    {
        "objective": "RMSE",
        "ordered_ctr": True,
        "cat_features": [0],
        "text_features": [2],
    },
    label=y_train,
    eval_set=[(X_valid, y_valid)],
    eval_names=["holdout"],
)

booster = ctboost.train(
    prepared,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "ordered_ctr": True,
        "cat_features": [0],
        "text_features": [2],
    },
    num_boost_round=64,
    early_stopping_rounds=10,
)

For disk-backed pool staging on large folds:

pool = ctboost.prepare_pool(
    X_numeric,
    y,
    external_memory=True,
    external_memory_dir="ctboost-cache",
)

booster = ctboost.train(
    X_numeric,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "external_memory": True,
        "external_memory_dir": "ctboost-cache",
    },
    label=y,
    num_boost_round=64,
)

Ranking pools can also carry richer query metadata without changing the underlying conditional tree learner:

group_id = np.array([0, 0, 0, 1, 1, 1], dtype=np.int64)
subgroup_id = np.array([0, 0, 1, 0, 1, 2], dtype=np.int64)
group_weight = np.array([2.0, 2.0, 2.0, 1.0, 1.0, 1.0], dtype=np.float32)
pairs = np.array([[0, 2], [3, 5]], dtype=np.int64)
pairs_weight = np.array([1.5, 0.5], dtype=np.float32)
baseline = np.zeros(group_id.shape[0], dtype=np.float32)

rank_pool = ctboost.Pool(
    X_rank,
    y_rank,
    group_id=group_id,
    group_weight=group_weight,
    subgroup_id=subgroup_id,
    pairs=pairs,
    pairs_weight=pairs_weight,
    baseline=baseline,
)

Working With Categorical Features

Categorical columns can still be marked manually through the Pool API:

import numpy as np
import ctboost

X = np.array([[0.0], [1.0], [2.0], [3.0]], dtype=np.float32)
y = np.array([1.0, 0.0, 1.0, 0.0], dtype=np.float32)

pool = ctboost.Pool(X, y, cat_features=[0])

For pandas inputs, categorical/object columns are detected automatically:

import pandas as pd
import ctboost

frame = pd.DataFrame(
    {
        "value": [1.0, 2.0, 3.0, 4.0],
        "city": pd.Categorical(["berlin", "paris", "berlin", "rome"]),
        "segment": ["retail", "enterprise", "retail", "enterprise"],
    }
)
label = pd.Series([0.0, 1.0, 0.0, 1.0], dtype="float32")

pool = ctboost.Pool(frame, label)
assert pool.cat_features == [1, 2]

For estimator-side ordered CTRs, categorical crosses, one-hot expansion, rare-category bucketing, text hashing, and embedding expansion, use the Python feature pipeline parameters:

import numpy as np
import pandas as pd

from ctboost import CTBoostRegressor

frame = pd.DataFrame(
    {
        "city": ["berlin", "paris", "berlin", "rome"],
        "headline": ["red fox", "blue fox", "red hare", "green fox"],
        "embedding": [
            np.array([0.1, 0.4, 0.2], dtype=np.float32),
            np.array([0.7, 0.1, 0.3], dtype=np.float32),
            np.array([0.2, 0.5, 0.6], dtype=np.float32),
            np.array([0.9, 0.2, 0.4], dtype=np.float32),
        ],
        "value": [1.0, 2.0, 1.5, 3.0],
    }
)
label = np.array([0.5, 1.2, 0.7, 1.6], dtype=np.float32)

model = CTBoostRegressor(
    iterations=32,
    learning_rate=0.1,
    max_depth=3,
    ordered_ctr=True,
    one_hot_max_size=8,
    max_cat_threshold=32,
    cat_features=["city"],
    categorical_combinations=[["city", "headline"]],
    simple_ctr=["Mean", "Frequency"],
    per_feature_ctr={"city": ["Mean"]},
    text_features=["headline"],
    embedding_features=["embedding"],
)
model.fit(frame, label)

one_hot_max_size keeps low-cardinality categoricals as explicit indicator columns, while max_cat_threshold buckets higher-cardinality levels down to a capped native categorical domain before the conditional tree learner sees them. per_feature_ctr lets specific base features or categorical combinations opt into CTR generation without changing the underlying conditional split logic.

Model Persistence, Warm Start, And Cross-Validation

import ctboost

booster.save_model("regression-model.json")
restored = ctboost.load_model("regression-model.json")
restored_predictions = restored.predict(pool)
booster.export_model("standalone_predictor.py", export_format="python")

continued = ctboost.train(
    pool,
    {"objective": "RMSE", "learning_rate": 0.2, "max_depth": 2, "alpha": 1.0, "lambda_l2": 1.0},
    num_boost_round=10,
    init_model=restored,
)

cv_result = ctboost.cv(
    pool,
    {
        "objective": "RMSE",
        "learning_rate": 0.2,
        "max_depth": 2,
        "alpha": 1.0,
        "lambda_l2": 1.0,
    },
    num_boost_round=25,
    nfold=3,
)

The scikit-learn compatible estimators also expose:

  • save_model(...)
  • export_model(..., export_format="python") for standalone numeric or already-prepared deployment scoring
  • load_model(...)
  • staged_predict(...)
  • staged_predict_proba(...) for classifiers
  • predict_leaf_index(...)
  • predict_contrib(...)
  • evals_result_
  • best_score_
  • sample_weight and baseline on fit(...)
  • group_weight, subgroup_id, pairs, and pairs_weight on CTBoostRanker.fit(...)
  • class_weight, scale_pos_weight, eval_metric, nan_mode, nan_mode_by_feature, and warm_start
  • max_bins, max_bin_by_feature, border_selection_method, and feature_borders
  • bagging_temperature, feature_weights, first_feature_use_penalties, random_strength, grow_policy, min_samples_split, and max_leaf_weight

Public Python API

The main entry points are:

  • ctboost.Pool
  • ctboost.FeaturePipeline
  • ctboost.PreparedTrainingData
  • ctboost.prepare_pool
  • ctboost.prepare_training_data
  • ctboost.train
  • ctboost.cv
  • ctboost.Booster
  • ctboost.CTBoostClassifier
  • ctboost.CTBoostRanker
  • ctboost.CTBoostRegressor
  • ctboost.CBoostClassifier
  • ctboost.CBoostRanker
  • ctboost.CBoostRegressor
  • ctboost.build_info
  • ctboost.load_model

Build and Test

Run the test suite:

pytest tests

The latest local release-candidate validation on April 18, 2026 was:

python -m pytest tests/test_data_and_loss.py tests/test_ranking.py tests/test_explainability_and_warm_start.py -q
python -m pytest tests/test_sklearn.py tests/test_metrics_and_objectives.py tests/test_persistence_and_cv.py -q --basetemp=.pytest-tmp

Build an sdist:

python -m build --sdist

Configure and build the native extension directly with CMake:

python -m pip install pybind11 numpy pandas scikit-learn pytest
cmake -S . -B build -DCTBOOST_ENABLE_CUDA=OFF -Dpybind11_DIR="$(python -m pybind11 --cmakedir)"
cmake --build build --config Release --parallel

Wheel builds are configured through cibuildwheel for:

  • Windows amd64
  • Linux x86_64 and aarch64 using the current manylinux baseline
  • macOS x86_64
  • CPython 3.8, 3.9, 3.10, 3.11, 3.12, 3.13, and 3.14

GitHub Actions workflows:

  • .github/workflows/cmake.yml: configures, builds, installs, and tests CPU builds on Ubuntu, Windows, and macOS for pushes and pull requests
  • .github/workflows/publish.yml: builds release wheels and the sdist, runs wheel smoke tests on built artifacts, publishes CPU wheels to PyPI, and attaches both CPU and Linux/Windows GPU wheels to tagged GitHub releases

The standard PyPI release wheel workflow builds CPU-only wheels by setting:

cmake.define.CTBOOST_ENABLE_CUDA=OFF

The GPU release-wheel matrices enable CUDA separately with:

cmake.define.CTBOOST_ENABLE_CUDA=ON
cmake.define.CTBOOST_REQUIRE_CUDA=ON
cmake.define.CMAKE_CUDA_COMPILER=/usr/local/cuda-12.0/bin/nvcc
cmake.define.CUDAToolkit_ROOT=/usr/local/cuda-12.0
cmake.define.CMAKE_CUDA_ARCHITECTURES=60;70;75;80;86;89
wheel.build-tag=1gpu

Project Layout

ctboost/      Python API layer
include/      public C++ headers
src/core/     core boosting, objectives, trees, statistics
src/bindings/ pybind11 extension bindings
cuda/         optional CUDA backend
tests/        Python test suite

License

Apache 2.0. See LICENSE.


Download files

Download the file for your platform.

Source Distribution

  • ctboost-0.1.39.tar.gz (318.1 kB, Source)

Built Distributions


  • ctboost-0.1.39-cp314-cp314-win_amd64.whl (491.1 kB, CPython 3.14, Windows x86-64)
  • ctboost-0.1.39-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (679.1 kB, CPython 3.14, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.39-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (609.8 kB, CPython 3.14, manylinux glibc 2.27+/2.28+ ARM64)
  • ctboost-0.1.39-cp314-cp314-macosx_10_15_x86_64.whl (527.4 kB, CPython 3.14, macOS 10.15+ x86-64)
  • ctboost-0.1.39-cp313-cp313-win_amd64.whl (478.1 kB, CPython 3.13, Windows x86-64)
  • ctboost-0.1.39-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (678.7 kB, CPython 3.13, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.39-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (608.9 kB, CPython 3.13, manylinux glibc 2.27+/2.28+ ARM64)
  • ctboost-0.1.39-cp313-cp313-macosx_10_15_x86_64.whl (527.1 kB, CPython 3.13, macOS 10.15+ x86-64)
  • ctboost-0.1.39-cp312-cp312-win_amd64.whl (478.1 kB, CPython 3.12, Windows x86-64)
  • ctboost-0.1.39-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (679.1 kB, CPython 3.12, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.39-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (609.2 kB, CPython 3.12, manylinux glibc 2.27+/2.28+ ARM64)
  • ctboost-0.1.39-cp312-cp312-macosx_10_15_x86_64.whl (527.1 kB, CPython 3.12, macOS 10.15+ x86-64)
  • ctboost-0.1.39-cp311-cp311-win_amd64.whl (477.6 kB, CPython 3.11, Windows x86-64)
  • ctboost-0.1.39-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (676.2 kB, CPython 3.11, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.39-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (608.1 kB, CPython 3.11, manylinux glibc 2.27+/2.28+ ARM64)
  • ctboost-0.1.39-cp311-cp311-macosx_10_15_x86_64.whl (524.5 kB, CPython 3.11, macOS 10.15+ x86-64)
  • ctboost-0.1.39-cp310-cp310-win_amd64.whl (476.7 kB, CPython 3.10, Windows x86-64)
  • ctboost-0.1.39-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (674.0 kB, CPython 3.10, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.39-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (606.5 kB, CPython 3.10, manylinux glibc 2.27+/2.28+ ARM64)
  • ctboost-0.1.39-cp310-cp310-macosx_10_15_x86_64.whl (522.8 kB, CPython 3.10, macOS 10.15+ x86-64)
  • ctboost-0.1.39-cp39-cp39-win_amd64.whl (482.7 kB, CPython 3.9, Windows x86-64)
  • ctboost-0.1.39-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (672.5 kB, CPython 3.9, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.39-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (605.2 kB, CPython 3.9, manylinux glibc 2.27+/2.28+ ARM64)
  • ctboost-0.1.39-cp38-cp38-win_amd64.whl (476.6 kB, CPython 3.8, Windows x86-64)
  • ctboost-0.1.39-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (672.4 kB, CPython 3.8, manylinux glibc 2.27+/2.28+ x86-64)
  • ctboost-0.1.39-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (604.8 kB, CPython 3.8, manylinux glibc 2.27+/2.28+ ARM64)

File details

Details for the file ctboost-0.1.39.tar.gz.

File metadata

  • Download URL: ctboost-0.1.39.tar.gz
  • Size: 318.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.39.tar.gz

  • SHA256: 051e39b06470adf5f90c98ff040419a59e81d86576289f090683e844c7f9b7ff
  • MD5: e2261823c03b9addb1827e86a35c054e
  • BLAKE2b-256: e60a525385cd286fbe342066eb4f4e0175d845217cea193eec30cab9553f7702


Provenance

The following attestation bundles were made for ctboost-0.1.39.tar.gz:

Publisher: publish.yml on captnmarkus/ctboost

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ctboost-0.1.39-cp314-cp314-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.39-cp314-cp314-win_amd64.whl
  • Upload date:
  • Size: 491.1 kB
  • Tags: CPython 3.14, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.39-cp314-cp314-win_amd64.whl
Algorithm Hash digest
SHA256 4a9ca527ed5a3cf64306620a9c66180b80a52bb82002c2c7bbe5918dd06f2224
MD5 0423a2da38f4363b7432cc732a91c0c5
BLAKE2b-256 78292d7868179ea6b0bce7ce1994830e06230a79140f2ac4704aeace87cd47f6

File details

Details for the file ctboost-0.1.39-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 ca2866ce3e617ef449fcb6bb78c2ba63b3ef3f3e50489ab634a03ee3eed51542
MD5 8f8858be67465084c17d23cf5b73a2cd
BLAKE2b-256 1cfde851f15dfed667e8818dd3059d40d487529371ff22a0cd6a73925649e77c

File details

Details for the file ctboost-0.1.39-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 a82ff50af8d6913f19c7e5c1897a37622292801729e4d596607173147e2ce8a8
MD5 94b628c1be483c04bb70c7c8e3d5f909
BLAKE2b-256 f51a058f4eba534b6bb557a5d293c0a3640386e742df6d36673b5d274c754175

File details

Details for the file ctboost-0.1.39-cp314-cp314-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp314-cp314-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 c2189e3fb262303e5955930ad5477f0abb6ebf84bc126c32f2e5b7cf6396206c
MD5 f3fdde7a0cfcc4ffcd523ba611673d73
BLAKE2b-256 00d18630af1c683b65d4c88e03ab8953b43f1c2a8695014ff78a087e49f1c4af

File details

Details for the file ctboost-0.1.39-cp313-cp313-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.39-cp313-cp313-win_amd64.whl
  • Upload date:
  • Size: 478.1 kB
  • Tags: CPython 3.13, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.39-cp313-cp313-win_amd64.whl
Algorithm Hash digest
SHA256 2ea90d6d3c5acf6598b5bea702a69484c44c2093785f32d931593f74b54e4830
MD5 15b324d60efef1049b5245ed344fdb2d
BLAKE2b-256 3d3b839b96dd5ca8eb551d580da3c439b72ef771db8cd0bf11cd2b2071b943aa

File details

Details for the file ctboost-0.1.39-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 40c5787458c3faabba8d92b3a6f9426ddfd0d628549705899615571d893cf962
MD5 9cb3bf7648f080e6e480ce83a3115ba2
BLAKE2b-256 d82573c00204c046d0823154103f2556a9b0d24fc53df8024d82959741d742c7

File details

Details for the file ctboost-0.1.39-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 2f651a53925085f0d4c03540b7626f19c2a03de716cf731fd4e483e1efd4d900
MD5 701c1fa5cda0512d9833eb4398473e7e
BLAKE2b-256 ddcee0bec289517d346babc2f42ddb08d9c09ebe513f6067631543d845e81006

File details

Details for the file ctboost-0.1.39-cp313-cp313-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp313-cp313-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 a735213ec38e6842f79616a1df2ff8a66292336d408bc80eaa00538412669dca
MD5 f6c7bb7d90ea2a917566e5d12f8b5fbb
BLAKE2b-256 83bae54468fadf4b35eb8e72cafa8f6223d0167152596452b006194fe5f9fe64

File details

Details for the file ctboost-0.1.39-cp312-cp312-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.39-cp312-cp312-win_amd64.whl
  • Upload date:
  • Size: 478.1 kB
  • Tags: CPython 3.12, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.39-cp312-cp312-win_amd64.whl
Algorithm Hash digest
SHA256 b906b8e2115c1c4c33e58fbc9998d86cb13166515efe2a55831428a083c585d6
MD5 839c65db67727b3368b94c10bf688586
BLAKE2b-256 01d8ddbd70ce1fc3703984f78342a90ac04b363b76a6a9af5d5c925f4baefbd7

File details

Details for the file ctboost-0.1.39-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 29b19e580b848b7784f1ffa5874aa09179c635638f8b608fcd6f4c6478f5619f
MD5 60f945791705156d099c6c722dfe9371
BLAKE2b-256 1ca91842cf0a45542a4cee3539c88ed77399576520432cb94139f8fb57944341

File details

Details for the file ctboost-0.1.39-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 eab371521b5fe0b538d4e31ed41f76bb1b1dae4ead16c71304e0f44e4c0d39f3
MD5 ad7e047bafc2996d9acd63aa7da216e6
BLAKE2b-256 3aea7899eefc89a9a057c26a24c54e1e480193a769d1d56eebaaf95c100b97e5

File details

Details for the file ctboost-0.1.39-cp312-cp312-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp312-cp312-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 1763ffa0e5158cf96075e2fab6aab58c50d7fcefb163efd21fa41a288f95ece0
MD5 bbccf588d7c07f96754bea869ddbff86
BLAKE2b-256 ac1317b36dca9bfa630f6da0337dd081b1abd370c898b3f8f69946e229976eab

File details

Details for the file ctboost-0.1.39-cp311-cp311-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.39-cp311-cp311-win_amd64.whl
  • Upload date:
  • Size: 477.6 kB
  • Tags: CPython 3.11, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.39-cp311-cp311-win_amd64.whl
Algorithm Hash digest
SHA256 3d670665808842514150013aa944751f490ece0bac9a2a228bdbb2014ebdc4dd
MD5 484c9e1ab925f60efefa9dd2b4a32a5f
BLAKE2b-256 9d4766c779d2cfc24c846ac8b229105461832ab18db4d598c3a391581791c80e

File details

Details for the file ctboost-0.1.39-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 2a6736e3ddd555fd1ca60bf22f23ea1fd4908b9011ffa2f9d4945d0a220a0b08
MD5 64d7365607a4119f1ecaca4a1550f93f
BLAKE2b-256 7f3ace00e6880a3d7fe004ec76f8321582be7adf353c0aaeb0f6e7a9937f4cb9

File details

Details for the file ctboost-0.1.39-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 e302d37c94eb02588d90271d1ce8b4898db1918d022f51b2680afa7c7b6094e5
MD5 b1563fbb4e51b8a66b5f6e80443bdd67
BLAKE2b-256 0aa923cec9c8a43174b47985d1acc4ea7fc4b3e37645ce180c8c951a46fe99a5

File details

Details for the file ctboost-0.1.39-cp311-cp311-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp311-cp311-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 575ab0ff89ef81144bdd4c33b2125f2eb88e757b6c45b58096cc3d79b484271a
MD5 3fae7eb3117427f974814bb35a451cd8
BLAKE2b-256 5bffd47e6e32443db9c6ff342dacffaa0cbe10c86b40d5a564e33d969fba0479

File details

Details for the file ctboost-0.1.39-cp310-cp310-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.39-cp310-cp310-win_amd64.whl
  • Upload date:
  • Size: 476.7 kB
  • Tags: CPython 3.10, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.39-cp310-cp310-win_amd64.whl
Algorithm Hash digest
SHA256 6f66c4dc23cf5ded85546cd63291b64cfd214f4017a4b6f81ef7da147842858a
MD5 bcc45ee42b4f18b086a0e2cc49d716f1
BLAKE2b-256 caa45988481711cde270cdeefcb01e29976e79ba4f2c44867d9bce129ecc299a

File details

Details for the file ctboost-0.1.39-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 c7eacf4a2e07233aaf23c68d9afe3cbd9c66fb327145b4355db18546658896a5
MD5 02c7a1781ed465b84e629e070eb4bfd0
BLAKE2b-256 c6099ba8b49afe34792e7d8ec9193f7feb3326c36dd2a1a472201b11d0eb11e9

File details

Details for the file ctboost-0.1.39-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 fb2e0cd57ce4a45b25245ae476fefb0ea67dda7a36b1dca122647d2bf184a5f4
MD5 a41c57c891802711cdfba6c1a1954cc7
BLAKE2b-256 be56b212a8514367ee0b115284ff77ec9be942668cd0530a7d8e70e4eb93a205

File details

Details for the file ctboost-0.1.39-cp310-cp310-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp310-cp310-macosx_10_15_x86_64.whl
Algorithm Hash digest
SHA256 bc9facbe8dd956582943871afa1f84b94067b0021041d488037cd04f44e02366
MD5 4d8e9f6d01565229ec15ac0b0a72d8e3
BLAKE2b-256 1a79b14b6cda39b211a0ad6dd230d71ad15748f08406917ca88fb138d3208a9e

File details

Details for the file ctboost-0.1.39-cp39-cp39-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.39-cp39-cp39-win_amd64.whl
  • Upload date:
  • Size: 482.7 kB
  • Tags: CPython 3.9, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.39-cp39-cp39-win_amd64.whl
Algorithm Hash digest
SHA256 2e684dc47431666c34bb74933241a218a9ada343d9ddc7993ed10a10a70a65f9
MD5 062399208797d710f5857b96ed96d853
BLAKE2b-256 67cfe3350a06f9a9706049213602414ed27c3dafb98f1f5adb266e56febe2ef2

File details

Details for the file ctboost-0.1.39-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 b0e24d5cf8b8991af22b0bf4c7d28c657b8658faf2bf501d7395cd9dddec7c29
MD5 fd0cfbcb01f5a68005894eb5f7138f17
BLAKE2b-256 30571fcf0ffedba2625cde6043c4129ae3de03ec82f6136e227cf6bc7aaf5a49

File details

Details for the file ctboost-0.1.39-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 514a78ed2f2112756fd3d73fcffd7019ab8fb3ff2ab07ba13008231ee2bc8555
MD5 182d0ada497bd51069643738f966dcbd
BLAKE2b-256 05cd9f409cb578f142695259b45f3b2f6065b8faf51a960257b19d8526bedcb5

File details

Details for the file ctboost-0.1.39-cp38-cp38-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.39-cp38-cp38-win_amd64.whl
  • Upload date:
  • Size: 476.6 kB
  • Tags: CPython 3.8, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.39-cp38-cp38-win_amd64.whl
Algorithm Hash digest
SHA256 3c67078f8c8a7ac00db49e5ed97c3d4b8ffd4f518e777a0f597ad1502d802fb1
MD5 b481c5332cf82f1f1c32a1c4d13ec9b1
BLAKE2b-256 1c11f61ca3efa9f980446d7c5d87a1a008bb563b132d508e314fb00e12e66b8e

File details

Details for the file ctboost-0.1.39-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 849aad36ac91d446a568ec3902f279b30a789b187f8736387ba0991299954608
MD5 5b9b4d7c14c36ee162e231f376016981
BLAKE2b-256 68ffbeb63ce69b587ae53dc22c9fb2c96d917df91a541c2899736f7c3c0bbd2e

File details

Details for the file ctboost-0.1.39-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.39-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 be42c38feead30570292426d1a082d0ae09827d276dd847e19a20b782e6377df
MD5 ceed4b144b95061a070a55ef49ac68ec
BLAKE2b-256 f9b654ffa861e1a2c5dd69b4a60a52e3dd106f137f859a62e8111ade76abdc3d
