
Centimators

Centimators: essential data transformers and model estimators for ML and data science competitions

centimators is an open-source Python library built on scikit-learn, Keras, and Narwhals. It provides dataframe-agnostic (pandas/Polars), multi-framework (JAX/TensorFlow/PyTorch), sklearn-style (fit/transform/predict) transformers, meta-estimators, and machine learning models for building and sharing solutions to data science competitions such as Numerai, Kaggle, and the CrowdCent Challenge.

centimators makes heavy use of advanced scikit-learn concepts such as metadata routing. Familiarity with these concepts is recommended for optimal use of the library. You can learn more about metadata routing in the scikit-learn documentation.

Documentation is available at https://crowdcent.github.io/centimators/.

Installation

Recommended (using uv):

uv add centimators

Or, using pip:

pip install centimators

Quick Start

centimators transformers and estimators are dataframe-agnostic, powered by Narwhals. You can use the same transformer seamlessly with both pandas and Polars DataFrames. Here's an example with RankTransformer, which computes, for each date, the normalized cross-sectional rank of each feature across all tickers.

First, let's define some common data:

import pandas as pd
import polars as pl
# Create sample OHLCV data for two stocks over four trading days
data = {
    'date': ['2021-01-01', '2021-01-01', '2021-01-02', '2021-01-02', 
             '2021-01-03', '2021-01-03', '2021-01-04', '2021-01-04'],
    'ticker': ['AAPL', 'MSFT', 'AAPL', 'MSFT', 'AAPL', 'MSFT', 'AAPL', 'MSFT'],
    'open': [150.0, 280.0, 151.0, 282.0, 152.0, 283.0, 153.0, 284.0],    # Opening prices
    'high': [152.0, 282.0, 153.0, 284.0, 154.0, 285.0, 155.0, 286.0],    # Daily highs
    'low': [149.0, 278.0, 150.0, 280.0, 151.0, 281.0, 152.0, 282.0],     # Daily lows
    'close': [151.0, 281.0, 152.0, 283.0, 153.0, 284.0, 154.0, 285.0],   # Closing prices
    'volume': [1000000, 800000, 1200000, 900000, 1100000, 850000, 1050000, 820000]  # Trading volume
}

# Create both Pandas and Polars DataFrames
df_pd = pd.DataFrame(data)
df_pl = pl.DataFrame(data)

# Define the OHLCV features we want to transform
feature_cols = ['volume', 'close']

Now, let's use the transformer:

from centimators.feature_transformers import RankTransformer

transformer = RankTransformer(feature_names=feature_cols)
result_pd = transformer.fit_transform(df_pd[feature_cols], date_series=df_pd['date'])
result_pl = transformer.fit_transform(df_pl[feature_cols], date_series=df_pl['date'])

Both result_pd (from Pandas) and result_pl (from Polars) will contain the same transformed data in their native DataFrame formats. You may find significant performance gains using Polars for certain operations.
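Conceptually, a normalized per-date rank is what a plain pandas groupby-rank produces; the sketch below illustrates the idea and is not centimators' actual implementation:

```python
import pandas as pd

df = pd.DataFrame({
    "date": ["2021-01-01", "2021-01-01", "2021-01-02", "2021-01-02"],
    "close": [151.0, 281.0, 152.0, 283.0],
})

# Percentile rank within each date: rank / group size, so values lie in (0, 1]
df["close_rank"] = df.groupby("date")["close"].rank(pct=True)
ranks = df["close_rank"].tolist()  # [0.5, 1.0, 0.5, 1.0]
```

On each date the lower close (AAPL) ranks 0.5 and the higher close (MSFT) ranks 1.0, regardless of the absolute price levels.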

Advanced Pipeline

centimators transformers are designed to work seamlessly within scikit-learn Pipelines, leveraging its metadata routing capabilities. This allows you to pass data like date or ticker series through the pipeline to the specific transformers that need them, while also chaining together multiple transformers. This is useful for building more complex feature pipelines, but also allows for better cross-validation, hyperparameter tuning, and model selection. For example, if you add a Regressor at the end of the pipeline, you can imagine searching over various combinations of lags, moving average windows, and model hyperparameters during the training process.
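To make the search idea concrete with plain scikit-learn pieces (a toy PCA + Ridge pipeline, not centimators' transformers): pipeline step parameters are addressed as `<stepname>__<param>`, so feature-engineering settings and model hyperparameters can be tuned jointly in one search.

```python
from sklearn.datasets import make_regression
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline

X, y = make_regression(n_samples=60, n_features=8, random_state=0)
pipe = make_pipeline(PCA(), Ridge())

# Search feature-engineering and model hyperparameters together
grid = GridSearchCV(
    pipe,
    param_grid={"pca__n_components": [2, 4], "ridge__alpha": [0.1, 1.0]},
    cv=3,
)
grid.fit(X, y)
best = sorted(grid.best_params_)  # ['pca__n_components', 'ridge__alpha']
```

With centimators you would address its transformer parameters (lag windows, moving-average windows, and so on) the same way.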


from sklearn import set_config
from sklearn.pipeline import make_pipeline
from centimators.feature_transformers import (
    LogReturnTransformer,
    RankTransformer,
    LagTransformer,
    MovingAverageTransformer
)

# Enable metadata routing globally
set_config(enable_metadata_routing=True)

# Define individual transformers with their parameters
log_return_transformer = LogReturnTransformer().set_transform_request(
    ticker_series=True
)
ranker = RankTransformer().set_transform_request(date_series=True)
lag_windows = [0, 5, 10, 15]
lagger = LagTransformer(windows=lag_windows).set_transform_request(
    ticker_series=True
)
ma_windows = [5, 10, 20, 40]
ma_transformer = MovingAverageTransformer(
    windows=ma_windows
).set_transform_request(ticker_series=True)

# Create the pipeline
feature_pipeline = make_pipeline(
    log_return_transformer, ranker, lagger, ma_transformer
)


Explanation:

  • set_config(enable_metadata_routing=True) turns on scikit-learn's metadata routing.
  • set_transform_request(metadata_name=True) on each transformer tells the pipeline that this transformer expects metadata_name (e.g., date_series).
  • When pipeline.fit_transform(X, date_series=dates, ticker_series=tickers) is called:
    • The date_series is automatically passed to RankTransformer.
    • The ticker_series is automatically passed to LagTransformer, MovingAverageTransformer, and LogReturnTransformer.
    • The output of LogReturnTransformer is passed to RankTransformer.
    • The output of RankTransformer is passed to LagTransformer.
    • The output of LagTransformer is passed to MovingAverageTransformer.

This allows for complex data transformations where different steps require different auxiliary information, all managed cleanly by the pipeline.

# Now you can use this pipeline with your data
feature_names = ['open', 'high', 'low', 'close']
transformed_df = feature_pipeline.fit_transform(
    df_pl[feature_names],
    date_series=df_pl["date"],
    ticker_series=df_pl["ticker"],
)

We can take a closer look at a sample of the output for a single ticker and a single initial feature. This shows how the close price of a cross-sectional dataset is transformed into a log return, ranked (between 0 and 1) within each date, and smoothed over the moving-average windows within each ticker.

End-to-End Pipeline with an Estimator

The previous "Advanced Pipeline" example constructed only the feature engineering part of a workflow. Thanks to centimators' Keras-backed estimators, you can seamlessly append a model as the final step and train everything through a single fit call.

from sklearn.impute import SimpleImputer
from centimators.model_estimators import MLPRegressor


lag_windows = [0, 5, 10, 15]
ma_windows = [5, 10, 20, 40]

mlp_pipeline = make_pipeline(
    # Start with the existing feature pipeline
    feature_pipeline,
    # Replace NaNs created by lagging with a constant value
    SimpleImputer(strategy="constant", fill_value=0.5).set_output(transform="pandas"),
    # Train a neural network in-place
    MLPRegressor().set_fit_request(epochs=True),
)

feature_names = ["open", "high", "low", "close"]

# Note: this assumes your DataFrame includes a "target" column,
# which the sample OHLCV data above does not.
mlp_pipeline.fit(
    df_pl[feature_names],
    df_pl["target"],
    date_series=df_pl["date"],
    ticker_series=df_pl["ticker"],
    epochs=5,
)
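The imputation step matters because lagging shifts each series forward, leaving NaNs at the start of every ticker's history. A minimal pandas sketch of the effect (an illustration, not centimators' LagTransformer):

```python
import pandas as pd

close_rank = pd.Series([0.2, 0.4, 0.6, 0.8])

# Each lag window w shifts the series, so its first w rows become NaN
lagged = pd.concat({f"lag{w}": close_rank.shift(w) for w in [0, 1, 2]}, axis=1)

# Filling with 0.5 (the neutral mid-rank) mirrors
# SimpleImputer(strategy="constant", fill_value=0.5)
filled = lagged.fillna(0.5)
lag2 = filled["lag2"].tolist()  # [0.5, 0.5, 0.2, 0.4]
```

Filling with a neutral value keeps those early rows usable for training instead of forcing you to drop them.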


Just as before, scikit-learn's metadata routing ensures that auxiliary inputs (date_series, ticker_series, epochs) are forwarded only to the steps that explicitly requested them.
