
A GPU-accelerated gradient boosting library using Conditional Inference Trees.


CTBoost

CTBoost is a gradient boosting library built around Conditional Inference Trees, with a native C++17 core, Python bindings via pybind11, optional CUDA support for source builds, and an optional scikit-learn style API.

The current codebase supports end-to-end training and prediction for regression, classification, grouped ranking, and survival. It also provides pandas and SciPy sparse ingestion without dense expansion, row weights and class imbalance controls, explicit missing-value handling, configurable validation metrics, stable JSON model persistence, standalone Python export for prepared numeric models, staged prediction, warm-start continuation, a native C++ feature pipeline for categorical/text/embedding transforms with thin Python wrappers, reusable prepared training-data bundles, and a built-in cross-validation helper.

Current Status

  • Language mix: Python + C++17, with optional CUDA
  • Python support: 3.8 through 3.14
  • Packaging: scikit-build-core
  • CI/CD: GitHub Actions for CMake validation and cibuildwheel release builds
  • Repository version: 0.1.38
  • Status: actively evolving native + Python package

What Works Today

  • Native gradient boosting backend exposed as ctboost._core
  • Pool abstraction for dense tabular data, SciPy sparse input, categorical feature indices, and optional group_id
  • Native pandas DataFrame and Series support
  • Automatic categorical detection for pandas category and object columns
  • Regression training with ctboost.train(...), including raw array/DataFrame inputs plus optional preprocessing and external-memory staging
  • scikit-learn compatible CTBoostClassifier, CTBoostRegressor, and CTBoostRanker when scikit-learn is installed
  • Binary and multiclass classification
  • Grouped ranking with PairLogit and NDCG
  • Row weights through Pool(..., weight=...) and sample_weight on sklearn estimators
  • Class imbalance controls through class_weight, class_weights, auto_class_weights="balanced", and scale_pos_weight
  • Explicit missing-value handling through nan_mode
  • Quantization controls through max_bins, max_bin_by_feature, border_selection_method, feature_borders, and nan_mode_by_feature
  • Row subsampling through subsample plus bootstrap_type="No"|"Bernoulli"|"Poisson"
  • Bayesian bagging through bootstrap_type="Bayesian" plus bagging_temperature
  • boosting_type="RandomForest" on top of the existing conditional-inference tree learner
  • boosting_type="DART" with dropout-style tree normalization on top of the existing conditional-inference tree learner
  • Monotonic constraints through monotone_constraints
  • Path-level interaction constraints through interaction_constraints
  • Additional generic regularization and tree-growth controls through feature_weights, first_feature_use_penalties, random_strength, grow_policy, min_samples_split, and max_leaf_weight
  • GPU tree growth now also supports monotonic constraints, interaction constraints, feature_weights, first_feature_use_penalties, random_strength, and grow_policy="LeafWise" without replacing the conditional-inference split gate
  • Survival objectives: Cox, SurvivalExponential
  • Survival evaluation through CIndex
  • Early stopping with eval_set, eval_names, early_stopping_rounds, early_stopping_metric, and early_stopping_name
  • Single- and multi-watchlist evaluation through one or many eval_set entries
  • Single- and multi-metric evaluation through string or sequence eval_metric values
  • Per-iteration callback hooks through callbacks, plus built-in ctboost.log_evaluation(...) and ctboost.checkpoint_callback(...)
  • Validation loss/metric history and evals_result_
  • Per-iteration prediction through staged prediction and num_iteration
  • Stable JSON and pickle model persistence for low-level boosters and scikit-learn style estimators
  • Cross-validation with ctboost.cv(...) when scikit-learn is installed
  • Regression objectives: RMSE, MAE, Huber, Quantile, Poisson, Tweedie
  • Generic eval metrics including RMSE, MAE, Poisson, Tweedie, Accuracy, BalancedAccuracy, Precision, Recall, F1, AUC, NDCG, MAP, MRR, and CIndex
  • Native ctboost.FeaturePipeline logic in _core.NativeFeaturePipeline, with low-level and sklearn integration for ordered CTRs, frequency-style CTRs, categorical crosses, low-cardinality one-hot expansion, rare-category bucketing, text hashing, and embedding-stat expansion
  • Generic categorical controls around the existing conditional tree learner: one_hot_max_size / max_cat_to_onehot, max_cat_threshold, simple_ctr, combinations_ctr, and per_feature_ctr
  • ctboost.prepare_pool(...) for low-level raw-data preparation, optional feature-pipeline fitting, and disk-backed external-memory pool staging
  • ctboost.prepare_training_data(...) plus PreparedTrainingData for one-time raw train/eval preparation that can be reused across repeated fits
  • Native CPU out-of-core fit through ctboost.train(..., external_memory=True), which now spills quantized feature-bin columns to disk instead of keeping the full histogram matrix resident in RAM
  • Multi-host distributed training through distributed_world_size, distributed_rank, distributed_root, and distributed_run_id, with a native per-node histogram reduction path and a TCP collective backend available through distributed_root="tcp://host:port"
  • Distributed eval_set, multi-watchlist or multi-metric evaluation, callbacks, early_stopping_rounds, init_model, grouped ranking shards, and sklearn-estimator wrappers on the TCP collective backend
  • Filesystem-backed distributed runs now also fall back to a rank-0 coordinator path for advanced eval, callback, ranking, and GPU compatibility flows when TCP is not configured
  • Distributed GPU training when CUDA is available and distributed_root uses the TCP collective backend
  • Distributed raw-data feature-pipeline fitting across ranks for native categorical, text, and embedding preprocessing
  • Feature importance reporting
  • Leaf-index introspection and path-based prediction contributions
  • Continued training through init_model and estimator warm_start
  • Standalone pure-Python deployment export through Booster.export_model(..., export_format="python") and matching sklearn-estimator wrappers for numeric or already-prepared features
  • Build metadata reporting through ctboost.build_info()
  • CPU builds on standard CI runners
  • Optional CUDA compilation when building from source with a suitable toolkit
  • GPU source builds now keep fit-scoped histogram data resident on device, support shared-memory histogram accumulation, and expose GPU raw-score prediction for regression, binary classification, and multiclass models
  • Histogram building now writes directly into final-width compact storage when the fitted schema permits <=256 bins, avoiding the old transient uint16 -> uint8 duplication spike
  • Fitted models now store quantization metadata once per booster instead of duplicating the same schema in every tree
  • Low-level boosters can export reusable fitted borders through Booster.get_borders() and expose the full shared quantization schema through Booster.get_quantization_schema()
  • GPU fit now drops the host training histogram bin matrix immediately after the device histogram workspace has been created and warm-start predictions have been seeded
  • GPU tree building now uses histogram subtraction in the device path as well, so only one child histogram is built explicitly after each split
  • GPU node search now keeps best-feature selection on device and returns a compact winner instead of copying the full per-feature search buffer back to host each node
  • Training can emit native histogram/tree timing via verbose=True or CTBOOST_PROFILE=1
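Several of the evaluation metrics listed above, such as NDCG for grouped ranking, follow their standard textbook definitions. As a library-independent illustration (not CTBoost's internal implementation), NDCG@k can be sketched as:

```python
import math

def dcg_at_k(relevances, k):
    """Discounted cumulative gain over the top-k results (0-based ranks)."""
    return sum(
        (2 ** rel - 1) / math.log2(rank + 2)
        for rank, rel in enumerate(relevances[:k])
    )

def ndcg_at_k(relevances, k):
    """NDCG@k: DCG of the predicted ordering divided by DCG of the ideal ordering."""
    ideal_dcg = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal_dcg if ideal_dcg > 0 else 0.0

# A perfectly ordered group scores 1.0; a reversed ordering scores less.
print(ndcg_at_k([3, 2, 1, 0], k=4))  # 1.0
print(ndcg_at_k([0, 1, 2, 3], k=4))
```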

Current Limitations

  • Ordered CTRs, frequency-style CTRs, categorical crosses, low-cardinality one-hot expansion, rare-category bucketing, text hashing, and embedding expansion now run through a native C++ pipeline, while pandas extraction, raw-data routing, and Pool orchestration remain thin Python glue. ctboost.prepare_training_data(...) reduces that repeated Python work when you need to fit multiple times on the same raw train/eval split
  • There is now a native sparse training path plus disk-backed quantized-bin staging through ctboost.train(..., external_memory=True) on both CPU and GPU, and distributed training can also use a standalone TCP collective coordinator through distributed_root="tcp://host:port"
  • The legacy filesystem-based distributed path still exists for the native shard-reduction path; advanced eval, callback, ranking, and GPU compatibility workflows now fall back to a rank-0 coordinator path, while the TCP backend remains the true multi-rank path for those features
  • Distributed grouped/ranking training requires each group_id to live entirely on one worker shard; cross-rank query groups are rejected
  • Dedicated GPU wheel automation now targets Linux x86_64 and Windows amd64 CPython 3.10 through 3.14 release assets
  • CUDA wheel builds in CI depend on container-side toolkit provisioning
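The grouped-ranking sharding constraint above can be validated before a distributed run starts. The helper below is a hypothetical sketch, not a CTBoost API: it checks that every group_id lives entirely on one worker shard, mirroring the rule that cross-rank query groups are rejected.

```python
def check_group_shards(group_ids_by_rank):
    """Raise if any query group spans more than one worker shard.

    group_ids_by_rank: mapping of worker rank -> iterable of group_id
    values held by that worker's shard. Returns a group_id -> rank map
    when the sharding is valid.
    """
    owner = {}
    for rank, group_ids in group_ids_by_rank.items():
        for gid in set(group_ids):
            if gid in owner and owner[gid] != rank:
                raise ValueError(
                    f"group_id {gid!r} appears on ranks {owner[gid]} and {rank}"
                )
            owner[gid] = rank
    return owner

# Valid sharding: every query group lives on exactly one rank.
print(check_group_shards({0: ["q1", "q1", "q2"], 1: ["q3"]}))
```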

Resolved Fold-Memory Hotspots

The older v0.1.15 GPU fit-memory bottleneck list is now closed in the current tree:

  • Quantization metadata is stored once per fitted booster and shared by all trees instead of being duplicated per tree
  • GPU fit releases the host training histogram bin matrix immediately after device workspace creation and warm-start seeding
  • GPU tree growth uses histogram subtraction, so only one child histogram is built explicitly after a split
  • GPU split search keeps best-feature selection on device and copies back only the winning feature summary

That means the old per-node GPU bin-materialization issue is no longer the main resident-memory problem in the current codebase. The remaining generic backlog is now in broader distributed runtime ergonomics and additional export or deployment tooling.
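The histogram-subtraction trick mentioned above is the standard one from histogram-based boosting: a parent node's per-bin statistics equal the sum of its two children's, so after a split only one child histogram needs to be built explicitly and the sibling falls out by subtraction. A minimal CPU sketch with plain lists (illustrative, not the device kernel):

```python
def build_histogram(bin_indices, gradients, n_bins):
    """Accumulate per-bin gradient sums for one feature."""
    hist = [0.0] * n_bins
    for b, g in zip(bin_indices, gradients):
        hist[b] += g
    return hist

def sibling_by_subtraction(parent_hist, child_hist):
    """Derive the other child's histogram as parent minus the built child."""
    return [p - c for p, c in zip(parent_hist, child_hist)]

bins = [0, 1, 1, 2, 0, 2]
grads = [0.5, -1.0, 0.25, 2.0, -0.5, 1.0]
left_rows = [0, 2, 4]  # rows routed to the left child by some split

parent = build_histogram(bins, grads, 3)
left = build_histogram([bins[i] for i in left_rows],
                       [grads[i] for i in left_rows], 3)
right = sibling_by_subtraction(parent, left)  # no second pass over the data
print(right)  # [0.0, -1.0, 3.0]
```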

Benchmark Snapshot

The heavy ordered-target-encoding playground-series-s6e4 replay was last measured on April 12, 2026 with the v0.1.11 source tree. The one-fold Kaggle source-build replay completed successfully with:

  • build 55.41s
  • fold preprocess 57.17s
  • fold fit 2107.10s
  • fold predict 5.89s
  • fold total 2170.17s
  • validation score 0.973213

Since that replay, the source tree has removed additional fit-memory overhead by sharing quantization schema per model, building compact train bins without a second host copy, releasing host train-bin storage after GPU upload, and adding GPU histogram subtraction plus device-side best-feature reduction.

Installation

For local development or source builds:

pip install .

Install development dependencies:

pip install -e ".[dev]"

Install the optional scikit-learn wrappers and ctboost.cv(...) support:

pip install -e ".[sklearn]"

Wheels vs Source Builds

pip install ctboost works without a compiler only when PyPI has a prebuilt wheel for your exact Python/OS tag. If no matching wheel exists, pip falls back to the source distribution and has to compile the native extension locally.

The release workflow is configured to publish CPU wheels for current CPython releases on Windows and Linux, plus macOS x86_64 CPU wheels for CPython 3.10 through 3.14, so standard pip install ctboost usage does not depend on a local compiler.

Each tagged GitHub release also attaches the CPU wheels, the source distribution, and dedicated Linux x86_64 plus Windows amd64 CUDA wheels for CPython 3.10 through 3.14. The GPU wheel filenames carry a 1gpu build tag so the release can publish CPU and GPU artifacts for the same Python and platform tags without filename collisions.
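The 1gpu build tag works because PEP 427 wheel filenames carry an optional build-tag segment between the version and the Python tag. A simplified parser (assuming no extra dashes inside the individual segments) shows how CPU and GPU wheels stay distinguishable while sharing every other tag:

```python
def parse_wheel_name(filename):
    """Split a wheel filename per PEP 427:
    {dist}-{version}[-{build}]-{python}-{abi}-{platform}.whl
    """
    parts = filename[: -len(".whl")].split("-")
    if len(parts) == 6:
        dist, version, build, python, abi, platform = parts
    else:
        dist, version, python, abi, platform = parts
        build = None
    return {"dist": dist, "version": version, "build": build,
            "python": python, "abi": abi, "platform": platform}

cpu = parse_wheel_name("ctboost-0.1.38-cp312-cp312-win_amd64.whl")
gpu = parse_wheel_name("ctboost-0.1.38-1gpu-cp312-cp312-win_amd64.whl")
print(cpu["build"], gpu["build"])  # None 1gpu
```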

The GPU release jobs install the CUDA toolkit in CI, export the toolkit paths into the build environment, and set CTBOOST_REQUIRE_CUDA=ON so the wheel build fails instead of silently degrading to a CPU-only artifact. The release smoke test also checks that ctboost.build_info()["cuda_enabled"] is True before the GPU wheel is uploaded.

Kaggle GPU Install

pip install ctboost still resolves to the CPU wheel on PyPI. On Kaggle, install the matching GPU release wheel from GitHub instead:

import json
import subprocess
import sys
import urllib.request

tag = "v0.1.38"
py_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
api_url = f"https://api.github.com/repos/captnmarkus/ctboost/releases/tags/{tag}"

with urllib.request.urlopen(api_url) as response:
    release = json.load(response)

asset = next(
    (
        item
        for item in release["assets"]
        if item["name"].endswith(".whl") and f"-1gpu-{py_tag}-{py_tag}-" in item["name"]
    ),
    None,
)
if asset is None:
    raise RuntimeError(f"No GPU wheel for {py_tag} in release {tag}")

subprocess.check_call(
    [sys.executable, "-m", "pip", "install", "-U", asset["browser_download_url"]]
)

After installation, confirm the wheel really contains CUDA support:

import ctboost

info = ctboost.build_info()
if not info["cuda_enabled"]:
    raise RuntimeError(f"Expected a CUDA-enabled CTBoost wheel, got: {info}")
print(info)

CPU-Only Source Build

To force a CPU-only native build:

CMAKE_ARGS="-DCTBOOST_ENABLE_CUDA=OFF" pip install .

On PowerShell:

$env:CMAKE_ARGS="-DCTBOOST_ENABLE_CUDA=OFF"
pip install .

Windows source builds require a working C++ toolchain. In practice that means Visual Studio Build Tools 2022 or a compatible MSVC environment, plus CMake. ninja is recommended, but it does not replace the compiler itself.

CUDA Source Build

CTBoost can compile a CUDA backend when the CUDA toolkit and compiler are available. CUDA is enabled by default in CMake, but the build automatically falls back to CPU-only when no toolkit is detected.

pip install .

You can inspect the compiled package after installation:

import ctboost
print(ctboost.build_info())

Quick Start

scikit-learn Style Classification

import pandas as pd
from sklearn.datasets import make_classification

from ctboost import CTBoostClassifier

X, y = make_classification(
    n_samples=256,
    n_features=8,
    n_informative=5,
    n_redundant=0,
    random_state=13,
)
X = pd.DataFrame(X.astype("float32"), columns=[f"f{i}" for i in range(X.shape[1])])
X["segment"] = pd.Categorical(["a" if i % 2 == 0 else "b" for i in range(len(X))])
y = y.astype("float32")

model = CTBoostClassifier(
    iterations=256,
    learning_rate=0.1,
    max_depth=3,
    alpha=1.0,
    lambda_l2=1.0,
    task_type="CPU",
)

model.fit(
    X.iloc[:200],
    y[:200],
    eval_set=[(X.iloc[200:], y[200:])],
    early_stopping_rounds=20,
)
proba = model.predict_proba(X)
pred = model.predict(X)
importance = model.feature_importances_
best_iteration = model.best_iteration_

Low-Level Training API

import numpy as np

import ctboost

X = np.array([[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]], dtype=np.float32)
y = np.array([0.0, 1.0, 0.5], dtype=np.float32)

pool = ctboost.Pool(X, y)
booster = ctboost.train(
    pool,
    {
        "objective": "Huber",
        "learning_rate": 0.2,
        "max_depth": 2,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "max_bins": 64,
        "huber_delta": 1.5,
        "eval_metric": "MAE",
        "nan_mode": "Min",
        "task_type": "CPU",
    },
    num_boost_round=10,
)

predictions = booster.predict(pool)
loss_history = booster.loss_history
eval_loss_history = booster.eval_loss_history
exported_borders = booster.get_borders()

Per-feature quantization controls are available on the same low-level API:

booster = ctboost.train(
    pool,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "max_bins": 128,
        "max_bin_by_feature": {0: 16, 1: 8},
        "border_selection_method": "Uniform",
        "feature_borders": {1: [-0.5, 0.0, 0.5]},
        "nan_mode_by_feature": {0: "Max"},
    },
    num_boost_round=32,
)

feature_borders lets selected numeric features reuse explicit cut values; max_bin_by_feature overrides the global max_bins budget per column; border_selection_method currently supports Quantile and Uniform; and Booster.get_borders() returns an importable border bundle keyed by fitted feature index.
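The difference between the two border_selection_method modes is easy to see on skewed data. The sketch below is a generic illustration of uniform versus quantile cut-point selection, not CTBoost's exact binning code:

```python
def uniform_borders(values, max_bins):
    """Evenly spaced cut points between the observed min and max."""
    lo, hi = min(values), max(values)
    step = (hi - lo) / max_bins
    return [lo + step * i for i in range(1, max_bins)]

def quantile_borders(values, max_bins):
    """Cut points at evenly spaced ranks of the sorted sample."""
    ordered = sorted(values)
    n = len(ordered)
    borders = []
    for i in range(1, max_bins):
        b = ordered[min(n - 1, (i * n) // max_bins)]
        if not borders or b > borders[-1]:  # keep borders strictly increasing
            borders.append(b)
    return borders

# Two tight clusters: uniform borders land in the empty gap,
# quantile borders adapt to where the data actually sits.
values = [0.0, 0.1, 0.2, 0.3, 10.0, 10.1, 10.2, 10.3]
print(uniform_borders(values, 4))
print(quantile_borders(values, 4))
```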

The same low-level API also exposes generic regularization and growth controls around the existing conditional tree learner:

booster = ctboost.train(
    pool,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 4,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "bootstrap_type": "Bayesian",
        "bagging_temperature": 1.0,
        "feature_weights": {0: 2.0, 3: 0.5},
        "first_feature_use_penalties": {2: 1.5},
        "random_strength": 0.2,
        "grow_policy": "LeafWise",
        "max_leaves": 16,
        "min_samples_split": 8,
        "max_leaf_weight": 2.0,
    },
    num_boost_round=64,
)

feature_weights rescales feature preference without replacing the conditional test. first_feature_use_penalties discourages the first use of selected features at the model level. random_strength adds seeded noise to break near-ties in split gain after the conditional gate has already accepted a candidate. grow_policy="LeafWise" currently means a best-child-first heuristic under the existing max_leaves budget rather than a separate split criterion.
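To make the random_strength behavior concrete, here is a hypothetical tie-breaking sketch (the function and its signature are illustrative, not part of the CTBoost API): seeded Gaussian noise, scaled by random_strength, is added to the gains of candidates that already passed the conditional gate, so near-ties can resolve differently across seeds while strength 0.0 stays deterministic.

```python
import random

def pick_split(candidates, random_strength, seed=0):
    """Pick the winning split by gain plus seeded noise.

    candidates: list of (feature_index, gain) pairs that already
    passed the conditional-inference split gate.
    """
    rng = random.Random(seed)
    noisy = [
        (gain + random_strength * rng.gauss(0.0, 1.0), feature)
        for feature, gain in candidates
    ]
    return max(noisy)[1]  # feature index of the highest noisy gain

candidates = [(0, 1.000), (1, 0.999), (2, 0.5)]
print(pick_split(candidates, random_strength=0.0))  # 0 (deterministic winner)
print(pick_split(candidates, random_strength=0.1, seed=7))
```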

The same low-level API can now prepare raw categorical/text/embedding inputs directly:

import numpy as np
import ctboost

X = np.empty((4, 4), dtype=object)
X[:, 0] = ["berlin", "paris", "berlin", "rome"]
X[:, 1] = [1.0, 2.0, 1.5, 3.0]
X[:, 2] = ["red fox", "blue fox", "red hare", "green fox"]
X[:, 3] = [
    np.array([0.1, 0.4, 0.2], dtype=np.float32),
    np.array([0.7, 0.1, 0.3], dtype=np.float32),
    np.array([0.2, 0.5, 0.6], dtype=np.float32),
    np.array([0.9, 0.2, 0.4], dtype=np.float32),
]
y = np.array([0.5, 1.2, 0.7, 1.6], dtype=np.float32)

booster = ctboost.train(
    X,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "ordered_ctr": True,
        "one_hot_max_size": 4,
        "max_cat_threshold": 16,
        "cat_features": [0],
        "simple_ctr": ["Mean", "Frequency"],
        "per_feature_ctr": {0: ["Mean"]},
        "text_features": [2],
        "embedding_features": [3],
    },
    label=y,
    num_boost_round=32,
)

raw_predictions = booster.predict(X)

If you want to reuse the raw-data preparation work across repeated fits on the same split, prepare it once and then train against the prepared bundle:

prepared = ctboost.prepare_training_data(
    X_train,
    {
        "objective": "RMSE",
        "ordered_ctr": True,
        "cat_features": [0],
        "text_features": [2],
    },
    label=y_train,
    eval_set=[(X_valid, y_valid)],
    eval_names=["holdout"],
)

booster = ctboost.train(
    prepared,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "ordered_ctr": True,
        "cat_features": [0],
        "text_features": [2],
    },
    num_boost_round=64,
    early_stopping_rounds=10,
)

For disk-backed pool staging on large folds:

pool = ctboost.prepare_pool(
    X_numeric,
    y,
    external_memory=True,
    external_memory_dir="ctboost-cache",
)

booster = ctboost.train(
    X_numeric,
    {
        "objective": "RMSE",
        "learning_rate": 0.1,
        "max_depth": 3,
        "alpha": 1.0,
        "lambda_l2": 1.0,
        "external_memory": True,
        "external_memory_dir": "ctboost-cache",
    },
    label=y,
    num_boost_round=64,
)

Working With Categorical Features

Categorical columns can still be marked manually through the Pool API:

import numpy as np
import ctboost

X = np.array([[0.0], [1.0], [2.0], [3.0]], dtype=np.float32)
y = np.array([1.0, 0.0, 1.0, 0.0], dtype=np.float32)

pool = ctboost.Pool(X, y, cat_features=[0])

For pandas inputs, categorical/object columns are detected automatically:

import pandas as pd
import ctboost

frame = pd.DataFrame(
    {
        "value": [1.0, 2.0, 3.0, 4.0],
        "city": pd.Categorical(["berlin", "paris", "berlin", "rome"]),
        "segment": ["retail", "enterprise", "retail", "enterprise"],
    }
)
label = pd.Series([0.0, 1.0, 0.0, 1.0], dtype="float32")

pool = ctboost.Pool(frame, label)
assert pool.cat_features == [1, 2]

For estimator-side ordered CTRs, categorical crosses, one-hot expansion, rare-category bucketing, text hashing, and embedding expansion, use the Python feature pipeline parameters:

import numpy as np
import pandas as pd

from ctboost import CTBoostRegressor

frame = pd.DataFrame(
    {
        "city": ["berlin", "paris", "berlin", "rome"],
        "headline": ["red fox", "blue fox", "red hare", "green fox"],
        "embedding": [
            np.array([0.1, 0.4, 0.2], dtype=np.float32),
            np.array([0.7, 0.1, 0.3], dtype=np.float32),
            np.array([0.2, 0.5, 0.6], dtype=np.float32),
            np.array([0.9, 0.2, 0.4], dtype=np.float32),
        ],
        "value": [1.0, 2.0, 1.5, 3.0],
    }
)
label = np.array([0.5, 1.2, 0.7, 1.6], dtype=np.float32)

model = CTBoostRegressor(
    iterations=32,
    learning_rate=0.1,
    max_depth=3,
    ordered_ctr=True,
    one_hot_max_size=8,
    max_cat_threshold=32,
    cat_features=["city"],
    categorical_combinations=[["city", "headline"]],
    simple_ctr=["Mean", "Frequency"],
    per_feature_ctr={"city": ["Mean"]},
    text_features=["headline"],
    embedding_features=["embedding"],
)
model.fit(frame, label)

one_hot_max_size keeps low-cardinality categoricals as explicit indicator columns. max_cat_threshold buckets higher-cardinality levels down to a capped native categorical domain before the conditional tree learner sees them. per_feature_ctr lets specific base features or categorical combinations opt into CTR generation without changing the underlying conditional split logic.
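ordered_ctr follows the ordered target-statistic idea: each row's encoding uses only the labels of rows that precede it in the processing order, so a row's own label never leaks into its feature. A minimal sketch of that scheme with a smoothing prior (illustrative; not CTBoost's exact formula or permutation handling):

```python
def ordered_mean_ctr(categories, labels, prior=0.5, prior_weight=1.0):
    """Ordered mean-target encoding: row i only sees rows before i."""
    sums, counts = {}, {}
    encoded = []
    for cat, label in zip(categories, labels):
        s = sums.get(cat, 0.0)
        c = counts.get(cat, 0)
        # Encode using running statistics, smoothed toward the prior.
        encoded.append((s + prior * prior_weight) / (c + prior_weight))
        # Only afterwards fold this row's label into the statistics.
        sums[cat] = s + label
        counts[cat] = c + 1
    return encoded

cats = ["berlin", "paris", "berlin", "berlin"]
labels = [1.0, 0.0, 0.0, 1.0]
print(ordered_mean_ctr(cats, labels))  # [0.5, 0.5, 0.75, 0.5]
```

The first occurrence of each category falls back to the prior, and later occurrences blend in progressively more observed label mass.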

Model Persistence, Warm Start, And Cross-Validation

import ctboost

booster.save_model("regression-model.json")
restored = ctboost.load_model("regression-model.json")
restored_predictions = restored.predict(pool)
booster.export_model("standalone_predictor.py", export_format="python")

continued = ctboost.train(
    pool,
    {"objective": "RMSE", "learning_rate": 0.2, "max_depth": 2, "alpha": 1.0, "lambda_l2": 1.0},
    num_boost_round=10,
    init_model=restored,
)

cv_result = ctboost.cv(
    pool,
    {
        "objective": "RMSE",
        "learning_rate": 0.2,
        "max_depth": 2,
        "alpha": 1.0,
        "lambda_l2": 1.0,
    },
    num_boost_round=25,
    nfold=3,
)

The scikit-learn compatible estimators also expose:

  • save_model(...)
  • export_model(..., export_format="python") for standalone numeric or already-prepared deployment scoring
  • load_model(...)
  • staged_predict(...)
  • staged_predict_proba(...) for classifiers
  • predict_leaf_index(...)
  • predict_contrib(...)
  • evals_result_
  • best_score_
  • sample_weight on fit(...)
  • class_weight, scale_pos_weight, eval_metric, nan_mode, nan_mode_by_feature, and warm_start
  • max_bins, max_bin_by_feature, border_selection_method, and feature_borders
  • bagging_temperature, feature_weights, first_feature_use_penalties, random_strength, grow_policy, min_samples_split, and max_leaf_weight

Public Python API

The main entry points are:

  • ctboost.Pool
  • ctboost.FeaturePipeline
  • ctboost.PreparedTrainingData
  • ctboost.prepare_pool
  • ctboost.prepare_training_data
  • ctboost.train
  • ctboost.cv
  • ctboost.Booster
  • ctboost.CTBoostClassifier
  • ctboost.CTBoostRanker
  • ctboost.CTBoostRegressor
  • ctboost.CBoostClassifier
  • ctboost.CBoostRanker
  • ctboost.CBoostRegressor
  • ctboost.build_info
  • ctboost.load_model

Build and Test

Run the test suite:

pytest tests

The latest local release-candidate validation, on April 13, 2026, ran the full suite with:

python -m pytest -q

Build an sdist:

python -m build --sdist

Configure and build the native extension directly with CMake:

python -m pip install pybind11 numpy pandas scikit-learn pytest
cmake -S . -B build -DCTBOOST_ENABLE_CUDA=OFF -Dpybind11_DIR="$(python -m pybind11 --cmakedir)"
cmake --build build --config Release --parallel

Wheel builds are configured through cibuildwheel for:

  • Windows amd64
  • Linux x86_64 and aarch64 using the current manylinux baseline
  • macOS x86_64
  • CPython 3.8, 3.9, 3.10, 3.11, 3.12, 3.13, and 3.14

GitHub Actions workflows:

  • .github/workflows/cmake.yml: configures, builds, installs, and tests CPU builds on Ubuntu, Windows, and macOS for pushes and pull requests
  • .github/workflows/publish.yml: builds release wheels and the sdist, runs wheel smoke tests on built artifacts, publishes CPU wheels to PyPI, and attaches both CPU and Linux/Windows GPU wheels to tagged GitHub releases

The standard PyPI release wheel workflow builds CPU-only wheels by setting:

cmake.define.CTBOOST_ENABLE_CUDA=OFF

The GPU release-wheel matrices enable CUDA separately with:

cmake.define.CTBOOST_ENABLE_CUDA=ON
cmake.define.CTBOOST_REQUIRE_CUDA=ON
cmake.define.CMAKE_CUDA_COMPILER=/usr/local/cuda-12.0/bin/nvcc
cmake.define.CUDAToolkit_ROOT=/usr/local/cuda-12.0
cmake.define.CMAKE_CUDA_ARCHITECTURES=60;70;75;80;86;89
wheel.build-tag=1gpu

Project Layout

ctboost/      Python API layer
include/      public C++ headers
src/core/     core boosting, objectives, trees, statistics
src/bindings/ pybind11 extension bindings
cuda/         optional CUDA backend
tests/        Python test suite

License

Apache 2.0. See LICENSE.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ctboost-0.1.38.tar.gz (309.3 kB view details)

Uploaded Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

ctboost-0.1.38-cp314-cp314-win_amd64.whl (477.6 kB view details)

Uploaded CPython 3.14Windows x86-64

ctboost-0.1.38-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (662.1 kB view details)

Uploaded CPython 3.14manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.38-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (596.4 kB view details)

Uploaded CPython 3.14manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

ctboost-0.1.38-cp314-cp314-macosx_10_15_x86_64.whl (510.5 kB view details)

Uploaded CPython 3.14macOS 10.15+ x86-64

ctboost-0.1.38-cp313-cp313-win_amd64.whl (464.8 kB view details)

Uploaded CPython 3.13Windows x86-64

ctboost-0.1.38-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (661.6 kB view details)

Uploaded CPython 3.13manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.38-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (595.1 kB view details)

Uploaded CPython 3.13manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

ctboost-0.1.38-cp313-cp313-macosx_10_15_x86_64.whl (510.2 kB view details)

Uploaded CPython 3.13macOS 10.15+ x86-64

ctboost-0.1.38-cp312-cp312-win_amd64.whl (464.8 kB view details)

Uploaded CPython 3.12Windows x86-64

ctboost-0.1.38-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (662.0 kB view details)

Uploaded CPython 3.12manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.38-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (595.5 kB view details)

Uploaded CPython 3.12manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

ctboost-0.1.38-cp312-cp312-macosx_10_15_x86_64.whl (510.2 kB view details)

Uploaded CPython 3.12macOS 10.15+ x86-64

ctboost-0.1.38-cp311-cp311-win_amd64.whl (464.6 kB view details)

Uploaded CPython 3.11Windows x86-64

ctboost-0.1.38-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (658.5 kB view details)

Uploaded CPython 3.11manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.38-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (593.2 kB view details)

Uploaded CPython 3.11manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

ctboost-0.1.38-cp311-cp311-macosx_10_15_x86_64.whl (508.2 kB view details)

Uploaded CPython 3.11macOS 10.15+ x86-64

ctboost-0.1.38-cp310-cp310-win_amd64.whl (463.6 kB view details)

Uploaded CPython 3.10Windows x86-64

ctboost-0.1.38-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (656.4 kB view details)

Uploaded CPython 3.10manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.38-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (590.9 kB view details)

Uploaded CPython 3.10manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

ctboost-0.1.38-cp310-cp310-macosx_10_15_x86_64.whl (506.4 kB view details)

Uploaded CPython 3.10macOS 10.15+ x86-64

ctboost-0.1.38-cp39-cp39-win_amd64.whl (469.6 kB view details)

Uploaded CPython 3.9Windows x86-64

ctboost-0.1.38-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (655.1 kB view details)

Uploaded CPython 3.9manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.38-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (589.7 kB view details)

Uploaded CPython 3.9manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

ctboost-0.1.38-cp38-cp38-win_amd64.whl (463.1 kB view details)

Uploaded CPython 3.8Windows x86-64

ctboost-0.1.38-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (654.3 kB view details)

Uploaded CPython 3.8manylinux: glibc 2.27+ x86-64manylinux: glibc 2.28+ x86-64

ctboost-0.1.38-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (589.2 kB view details)

Uploaded CPython 3.8manylinux: glibc 2.27+ ARM64manylinux: glibc 2.28+ ARM64

File details

Details for the file ctboost-0.1.38.tar.gz.

File metadata

  • Download URL: ctboost-0.1.38.tar.gz
  • Upload date:
  • Size: 309.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.38.tar.gz
Algorithm Hash digest
SHA256 356c21f8c17ddeb2a7f2837b68659fdb6c583e5b534e83dd92ebbff57fb78cc8
MD5 fa545a9e6f726f57851568db8112ded7
BLAKE2b-256 7380796b33473be152a91ba9d8c15e9e9c6784246907406447e20e1a5c49c9d4

See more details on using hashes here.

Provenance

The following attestation bundles were made for ctboost-0.1.38.tar.gz:

Publisher: publish.yml on captnmarkus/ctboost

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ctboost-0.1.38-cp314-cp314-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.38-cp314-cp314-win_amd64.whl
  • Upload date:
  • Size: 477.6 kB
  • Tags: CPython 3.14, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.38-cp314-cp314-win_amd64.whl
Algorithm Hash digest
SHA256 a50c43be00100d9dace81d4101b8741ec29b034c83c769e746daaba61cee94af
MD5 3fb9414a6f77a359724d47c02e31f2e0
BLAKE2b-256 cafabd1cbab5152a5932ff5d539b8cd92256f38db96232b4824a5344e0457a37

See more details on using hashes here.

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp314-cp314-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ctboost-0.1.38-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 955069b8e00eb13b35092f09d6d72d934cdf1ae4210d201aaa9ad46e678e609c
MD5 570368850473a17a2d51658a2c5d1bed
BLAKE2b-256 da12ea781a0caacfb4bcae5855b7ca771fa3e016328375aaf261d3acba8a1523

See more details on using hashes here.

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl

  • SHA256: 339aa9d328e490705971f1d3938ff59ea4a343984d282e7f3c2ad0c227e91960
  • MD5: 3ee2c827dd26cf2a314c04ad72fa5c6f
  • BLAKE2b-256: a71fe762d3e4eeb340d0df8525279d5633ecfa14af60450a51e0742750a15a3a

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp314-cp314-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp314-cp314-macosx_10_15_x86_64.whl

  • SHA256: 98af60ddf2fb6a28803ae06dc60f112d4789f712e414f4ecd62391715415930e
  • MD5: 8652d8f74bdcf118a07b1a892d6bbec2
  • BLAKE2b-256: e0230608dc914ab0936f57e17fefecfaee0fd8939e65dd63bf1b87b8fe526952

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp314-cp314-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp313-cp313-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.38-cp313-cp313-win_amd64.whl
  • Upload date:
  • Size: 464.8 kB
  • Tags: CPython 3.13, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.38-cp313-cp313-win_amd64.whl

  • SHA256: 1ae51633f6f8c1577685462b332f2da1b17258679b332652022496292095061e
  • MD5: be156cfd9509cb2e4eebddc03212b5ba
  • BLAKE2b-256: b70bac06f15e10ccdc49085e91246cbbe8450a23a55954bf81e2cce677cd02e1

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp313-cp313-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl

  • SHA256: b628dc397020f14ecef7dfdb08f2a8d9980067777d467fa31537f42d4e3c59ac
  • MD5: fde3b951e86ffde73693e1ffb276a508
  • BLAKE2b-256: aef3136181f1bed67d4da82aecfd17a9e2a817700ae2abe7f6a649cecc1dd7e5

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl

  • SHA256: 7297940291b6275bfc57a23ae28cd64eadf180cf06b12a094dc43b8c46455e58
  • MD5: 239c4a1f730b402ba1094f9439734233
  • BLAKE2b-256: 7be4e3d6ec9a1cda695c55a2d6c740f0ed88b6f4c8b12c488d525132caf19ba5

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp313-cp313-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp313-cp313-macosx_10_15_x86_64.whl

  • SHA256: 4b35880e4b21d12a2731eb2a6dae378d4f5e44b364244178d621d985810df6ce
  • MD5: fc89995c00fadf247463cf1945795cdb
  • BLAKE2b-256: 229045e2966f6019f56c456425d95b3fee8238b97990198e4252ba5c7a84c31d

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp313-cp313-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp312-cp312-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.38-cp312-cp312-win_amd64.whl
  • Upload date:
  • Size: 464.8 kB
  • Tags: CPython 3.12, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.38-cp312-cp312-win_amd64.whl

  • SHA256: a49bee339b6cd620165c3ea0dd341481a91b213ce1ef729d7a8a9c4746e076ac
  • MD5: 59ebaf23e5058f0478532a592cc4d596
  • BLAKE2b-256: 4c72fbf4887a5bae7f2c994d0155981a1bd67ff964eb902739bfaa79b052ae97

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp312-cp312-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl

  • SHA256: 192ae7e94b61f96990539e17d58d18dc74348b52d8db692a7e3474b086935218
  • MD5: abaf6c6a7ff0561df4b7886f536fb3fc
  • BLAKE2b-256: 73f101ce082d8ff7af84eafc7a4bed40368e8da394f02adfe42b43a654f9aebb

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl

  • SHA256: 30b1b5a8195757438d6e661937edf2eaf2785195c036da3955f4f4de79de19fc
  • MD5: 0eb0ed2e9b29df20ae2c1b15517dc08c
  • BLAKE2b-256: 21e925b103ac2cae5c7d249793e5de96ea23582530683b0c6fb909567c266137

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp312-cp312-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp312-cp312-macosx_10_15_x86_64.whl

  • SHA256: d5385034ac5a7387823ed0d5a75714ba0de6c478d45b96f9e2517346ec10d71e
  • MD5: 6b4ec30d38c65d24c30ebf2cae7e9b47
  • BLAKE2b-256: 8a3704cf2a365cd03685c9aa11cfc273a2d2ac35ad2e716ae8a218438cc50083

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp312-cp312-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp311-cp311-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.38-cp311-cp311-win_amd64.whl
  • Upload date:
  • Size: 464.6 kB
  • Tags: CPython 3.11, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.38-cp311-cp311-win_amd64.whl

  • SHA256: 1246c9a2896c4987ffb5da172492dc64be46b8a3d0e1d9602a5ca50f182387d9
  • MD5: 4b2e8248442c3dbe395c4bd0e1ce0820
  • BLAKE2b-256: e4484a6172a9c4362c8881ee8d3ca620845f6a6680a3fdebe7763164e845b9ac

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp311-cp311-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl

  • SHA256: a235a615bc39f8b71e2e4792e81744107f1f5c216c126d4847585d4c593568ff
  • MD5: 7c21d91139e8f740d606dccc4f406136
  • BLAKE2b-256: 428cad634940cd9aa249959053195cb2ab1f63b07f3e47b60cdaadbcac3924ab

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl

  • SHA256: 309f12ffe16f8329cb2886d4153d3ae18e5aa63e5fcf78255cbd795652e46621
  • MD5: 6213a81c26f401cab64d4b406536e2d7
  • BLAKE2b-256: 897953bc9900a99d74a185eb6ab715fa8a9b6d9776d5e64034a006d31a87654e

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp311-cp311-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp311-cp311-macosx_10_15_x86_64.whl

  • SHA256: e96d6f95b37226479e87d9e193ebf09629880ab45411a7c2797559b7091c08c5
  • MD5: c901d6a8bd61e13fd58589a3bf75533f
  • BLAKE2b-256: 31a02baff821093b7d29736812c010a51c69ef6657176e794152a565c1efd57f

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp311-cp311-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp310-cp310-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.38-cp310-cp310-win_amd64.whl
  • Upload date:
  • Size: 463.6 kB
  • Tags: CPython 3.10, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.38-cp310-cp310-win_amd64.whl

  • SHA256: 6b5fa0885fb32449de39ca2859afae0f10e23f230fd3d2fe11ff72dd7208c1a4
  • MD5: 896f623cf2529092ebcc8e6f6fd3729f
  • BLAKE2b-256: 5d5d47f7854fe7d0fd7da752e7709f8b1194d33b6cd4f12981602bab1f2efa50

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp310-cp310-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl

  • SHA256: 957c086102ced7fe39a9f3111e8c5a47f823ae5f0910d459b29dc2983295bec0
  • MD5: d0b43d8dd2f19badbccc66f621e90846
  • BLAKE2b-256: 7e4d575c29f2e81cbdfd018e96425818754f01bf24c84a3b7694ae57e1237ec0

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl

  • SHA256: 3874b2d3993e93d9463f0e9ce3b3be1ef2958dd0ad46b8568c978e8000afaac4
  • MD5: 5574c1b43a5067e2a62bbda0d0cac49b
  • BLAKE2b-256: 04ac4bb37677aa73cf9b0b8e1937d86dd4b5b020ab5be8b967cf3eef800e66c2

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp310-cp310-macosx_10_15_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp310-cp310-macosx_10_15_x86_64.whl

  • SHA256: b55578247c536e17571754717b355c8bc6aa17c57e28c6a4b383f986ad4879f9
  • MD5: bb25fa9470df64e40d130420827b530a
  • BLAKE2b-256: b778fa9202a6824b8f0823b7f758869dca534063d3bd6309f2bb0711f4f2ac1b

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp310-cp310-macosx_10_15_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp39-cp39-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.38-cp39-cp39-win_amd64.whl
  • Upload date:
  • Size: 469.6 kB
  • Tags: CPython 3.9, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.38-cp39-cp39-win_amd64.whl

  • SHA256: 1558d794589e3c12b43cea646aabec848a456147119b5e360689ae8a84662d6f
  • MD5: 07e59a7c15a24ebfee3bb122f0c4d06e
  • BLAKE2b-256: 3405fb3afbcad86d8aeff099083ab50927be36688938e64403463ed19f203662

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp39-cp39-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl

  • SHA256: 6f351a931194fcdc11894a9874c7f29697b10b0b4455241d97ac5bbcb54ea967
  • MD5: fe7708bf94adcfdeb0549602fd966127
  • BLAKE2b-256: 66f17c64a4bb1deae06cfbb93983741544deed6120a2b8bbcd1cc7a107076fd4

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl

  • SHA256: 66e8ebcc2983e612b03803f98c9be07c90b3548e9e82ef520a7256b8c46ac359
  • MD5: a46c6592805448866581cade9d8b7782
  • BLAKE2b-256: 397bfee4336e3f79e4c3f61afa7dc2b8f1fbd045de4ac7c7b801cf8246e89668

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp38-cp38-win_amd64.whl.

File metadata

  • Download URL: ctboost-0.1.38-cp38-cp38-win_amd64.whl
  • Upload date:
  • Size: 463.1 kB
  • Tags: CPython 3.8, Windows x86-64
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ctboost-0.1.38-cp38-cp38-win_amd64.whl

  • SHA256: fa91fd9731c3893ac01f9543daf172d0eb97bdd3961bb4fafebf3bfe44e955fb
  • MD5: b22776e3b48d9e21587d6353255f7df8
  • BLAKE2b-256: ca4f1940ad904666fb22775d38131fbc25fe59ab673801b4ad49ce71b4ac4e10

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp38-cp38-win_amd64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl

  • SHA256: eacf82eb83645d928f428b2a29f1d29e28aeb9336acaf2d2f65b5f82083c3f46
  • MD5: 84922fe5eb009b2d6eaa5c2dd34143ee
  • BLAKE2b-256: 79e786018bb5571313f46d14fade332a22296be9cee5ef69420f0f84f777e2c7

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp38-cp38-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl:

Publisher: publish.yml on captnmarkus/ctboost


File details

Details for the file ctboost-0.1.38-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for ctboost-0.1.38-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl

  • SHA256: b8b0608ec8754b23810f580c5a7a0039bbe9b45a22ad91a0f3972fd5920e0ebc
  • MD5: 1518c143c76b354e464518e3a9864c09
  • BLAKE2b-256: 42c15cc60cf2b85afba20e5b652762359319235d08cea32f73db67d10824607d

Provenance

The following attestation bundles were made for ctboost-0.1.38-cp38-cp38-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl:

Publisher: publish.yml on captnmarkus/ctboost

