
Hippo is a lightweight machine-learning framework that unifies declarative preprocessing pipelines, automatic hyperparameter optimisation, and local experiment tracking behind a single, friendly CLI.

Project description

hippo

Lightweight ML workflow engine: declarative preprocessing ▸ hyper-parameter optimisation ▸ experiment tracking

hippo logo

hippo brings together three everyday ML chores under one friendly roof:

What you need                             How hippo helps
Clean, consistent data                    Build declarative preprocessing pipelines with scikit-learn primitives
Good hyper-parameters without the grind   Run Optuna-powered Bayesian optimisation (TPE, CMA-ES, …)
Remember what actually worked             Log parameters & metrics to a local SQLite DB with zero setup

Why hippo?

  • ⚡ Efficient – sensible defaults, parallel Optuna trials
  • 🛡️ Reliable – minimal dependencies, 100% typed codebase
  • 🔌 Extensible – register your own models or transformations in a single line
  • 🐚 Simple CLI – hippo train config.yml is all you need

Installation

pip install hippo

Quick Start

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from hippo.data.preprocessing import build_pipeline
from hippo.models import get_model
from hippo.tuning import optimise
from optuna import Trial

# 1  Load toy data
data = load_breast_cancer(as_frame=True)
X, y = data.data, data.target

# 2  Preprocessing pipeline
pipe = build_pipeline(num_feats=X.columns.tolist(), cat_feats=[])

# 3  Optuna objective: sample hyper-parameters, score with 3-fold CV
def objective(trial: Trial):
    n_estimators = trial.suggest_int("n_estimators", 50, 400)
    max_depth = trial.suggest_int("max_depth", 3, 10)

    model = get_model("rf", n_estimators=n_estimators, max_depth=max_depth, random_state=0)
    return cross_val_score(model, pipe.fit_transform(X), y, cv=3).mean()

study = optimise(objective, n_trials=50, direction="maximize")
print("Best accuracy:", study.best_value)
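Note that the snippet above calls pipe.fit_transform(X) on the full dataset before cross-validation, so the preprocessor sees information from every fold. If build_pipeline returns a standard scikit-learn transformer, the fit can instead happen inside each fold by composing it with the model. A plain-scikit-learn sketch, where StandardScaler and RandomForestClassifier stand in for hippo's pipeline and get_model("rf"):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

data = load_breast_cancer(as_frame=True)
X, y = data.data, data.target

# The scaler is fitted on the training portion of each CV fold only,
# so no statistics leak from the held-out fold into preprocessing.
clf = make_pipeline(
    StandardScaler(),
    RandomForestClassifier(n_estimators=100, random_state=0),
)
score = cross_val_score(clf, X, y, cv=3).mean()
```

The same composition works inside the Optuna objective: build the combined pipeline per trial and pass it to cross_val_score.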

Or the same via CLI:

python -m hippo.cli train config.yml

config.yml example:

run_name: demo_rf
data:
  path: cancer.joblib         # joblib-dumped DataFrame
  target: target
  numerical: [mean radius, mean area, ...]
  categorical: []
model:
  name: rf
  params:
    n_estimators: 200
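The config's data.path expects a joblib-dumped DataFrame containing the target column named in data.target. For the breast-cancer toy data from the Quick Start, such a file can be produced like this:

```python
import joblib
from sklearn.datasets import load_breast_cancer

# .frame is the feature DataFrame with a "target" column appended,
# matching the `target: target` entry in config.yml
df = load_breast_cancer(as_frame=True).frame
joblib.dump(df, "cancer.joblib")
```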



Download files

Download the file for your platform.

Source Distribution

hippo-0.1.0b2.tar.gz (5.5 kB)

Uploaded Source

Built Distribution


hippo-0.1.0b2-py3-none-any.whl (8.7 kB)

Uploaded Python 3

File details

Details for the file hippo-0.1.0b2.tar.gz.

File metadata

  • Download URL: hippo-0.1.0b2.tar.gz
  • Size: 5.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.13.3 Darwin/24.5.0

File hashes

Hashes for hippo-0.1.0b2.tar.gz
Algorithm Hash digest
SHA256 93a8aa2944b5615fd30478f22a4bb6969168dfffbcea2ff3a414fc007d345310
MD5 1106f30526378a3a76e796bb26472c85
BLAKE2b-256 24444e44479344a8b917409d3b174f7be257ce7c3140beb812c0d7cda9cf4fff
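A downloaded archive can be verified against these digests locally with the standard library; a short sketch (the path argument is whatever name you saved the file under):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest, e.g.:
# sha256_of("hippo-0.1.0b2.tar.gz") == "93a8aa2944b5615fd30478f22a4bb6969168dfffbcea2ff3a414fc007d345310"
```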


File details

Details for the file hippo-0.1.0b2-py3-none-any.whl.

File metadata

  • Download URL: hippo-0.1.0b2-py3-none-any.whl
  • Size: 8.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.13.3 Darwin/24.5.0

File hashes

Hashes for hippo-0.1.0b2-py3-none-any.whl
Algorithm Hash digest
SHA256 376714bcd67161e0b763041081260465396b344eea9a5c6221dbce7af1070fcb
MD5 27dc8039a467c3acb5f8141fd2769746
BLAKE2b-256 18fe8cf85adadbb0aa1b8e7b65cb687c6e9d1ee0cd3eac0a3f52166e8dde338a

