
Unifying Academic Rigor and Industrial Scale for Responsible, Reproducible, and Efficient Recommendation

Project description

🚀 WarpRec



WarpRec is a flexible and efficient framework designed for building, training, and evaluating recommendation models. It supports a wide range of configurations, customizable pipelines, and powerful optimization tools to enhance model performance and usability.

WarpRec is designed for both beginners and experienced practitioners. For newcomers, it offers a simple and intuitive interface to explore and experiment with state-of-the-art recommendation models. For advanced users, WarpRec provides a modular and extensible architecture that allows rapid prototyping, complex experiment design, and fine-grained control over every step of the recommendation pipeline.

Whether you're learning how recommender systems work or conducting high-performance research and development, WarpRec offers the right tools to match your workflow.

🏗️ Architecture

WarpRec Architecture

WarpRec is built on 4 foundational pillars — Scalability, Green AI, Agentic Readiness, and Scientific Rigor — and organized into 5 modular engines that manage the end-to-end recommendation lifecycle:

  1. Reader — Ingests user-item interactions and metadata from local or cloud storage via a backend-agnostic Narwhals abstraction layer.
  2. Data Engine — Applies configurable filtering and splitting strategies to produce clean, leak-free train/validation/test sets.
  3. Recommendation Engine — Trains and optimizes models using PyTorch, with seamless scaling from single-GPU to multi-node Ray clusters.
  4. Evaluation Engine — Computes 40 GPU-accelerated metrics in a single pass with automated statistical significance testing.
  5. Writer — Serializes results, checkpoints, and carbon reports to local or cloud storage.
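To make the Data Engine's role concrete, here is a framework-independent sketch of one of the filtering strategies it provides, k-core filtering: users and items with fewer than k interactions are dropped iteratively until the dataset stabilizes. WarpRec's actual implementation is configuration-driven; this toy version only illustrates the idea.

```python
from collections import Counter

def k_core(interactions, k):
    """Iteratively drop users and items with fewer than k interactions
    until every remaining user and item has at least k."""
    data = list(interactions)
    while True:
        users = Counter(u for u, _ in data)
        items = Counter(i for _, i in data)
        kept = [(u, i) for u, i in data if users[u] >= k and items[i] >= k]
        if len(kept) == len(data):  # fixed point reached
            return kept
        data = kept

# Toy interaction log of (user, item) pairs: u3 and item c fall below 2-core.
log = [("u1", "a"), ("u1", "b"), ("u2", "a"), ("u2", "b"), ("u3", "c")]
print(k_core(log, 2))
```

Note the loop: removing a sparse item can push a previously dense user below the threshold, which is why k-core filtering must iterate to a fixed point rather than filter once.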

An Application Layer exposes trained models through a REST API (FastAPI) and an MCP server for agentic AI workflows.


✨ Key Features

  • 55 Built-in Algorithms: WarpRec ships with 55 state-of-the-art recommendation models spanning 6 paradigms — Unpersonalized, Content-Based, Collaborative Filtering (e.g., LightGCN, EASE$^R$, MultiVAE), Context-Aware (e.g., DeepFM, xDeepFM), Sequential (e.g., SASRec, BERT4Rec, GRU4Rec), and Hybrid. All models are fully configurable and extend a standardized base class, making it easy to prototype custom architectures within the same pipeline.
  • Backend-Agnostic Data Engine: Built on Narwhals, WarpRec operates over Pandas, Polars, and Spark without code changes — enabling a true "write-once, run-anywhere" workflow from laptop to distributed cluster. Data ingestion supports both local filesystems and cloud object storage (Azure Blob Storage).
  • Comprehensive Data Processing: The data module provides 13 filtering strategies (filter-by-rating, k-core, cold-start heuristics) and 6 splitting protocols (random/temporal Hold-Out, Leave-k-Out, Fixed Timestamp, k-fold Cross-Validation), for a total of 19 configurable strategies to ensure rigorous and reproducible experimental setups.
  • 40 GPU-Accelerated Metrics: The evaluation suite covers 40 metrics across 7 families — Accuracy, Rating, Coverage, Novelty, Diversity, Bias, and Fairness — including multi-objective metrics for simultaneous optimization of competing goals. All metrics are computed with full GPU acceleration for large-scale experiments.
  • Statistical Rigor: WarpRec automates hypothesis testing with paired (Student's t-test, Wilcoxon signed-rank) and independent-group (Mann-Whitney U) tests, and applies multiple comparison corrections via Bonferroni and FDR (Benjamini-Hochberg) to prevent p-hacking and ensure statistically robust conclusions.
  • Distributed Training & HPO: Seamless vertical and horizontal scaling from single-GPU to multi-node Ray clusters. Hyperparameter optimization supports Grid, Random, Bayesian, HyperOpt, Optuna, and BoHB strategies, with ASHA pruning and model-level early stopping to maximize computational efficiency.
  • Green AI & Carbon Tracking: WarpRec is the first recommendation framework with native CodeCarbon integration, automatically quantifying energy consumption and CO₂ emissions for every experiment and persisting carbon footprint reports alongside standard results.
  • Agentic AI via MCP: WarpRec natively implements a Model Context Protocol server (infer-api/mcp_server.py), exposing trained recommenders as callable tools within LLM and autonomous agent workflows — transforming the framework from a static predictor into an interactive, agent-ready component.
  • REST API & Model Serving: Trained models are instantly deployable as RESTful microservices via the built-in FastAPI server (infer-api/server.py), decoupling the modeling core from serving infrastructure with zero additional engineering effort.
  • Experiment Tracking: Native integrations with TensorBoard, Weights & Biases, and MLflow for real-time monitoring of metrics, training dynamics, and multi-run management.
  • Custom Pipelines & Callbacks: Beyond the three standard pipelines (Training, Design, Evaluation), WarpRec exposes an event-driven Callback system for injecting custom logic at any stage — enabling complex experiments without modifying framework internals.
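To ground the metric families above, here is a framework-independent, pure-Python sketch of one accuracy metric, nDCG@k with binary relevance. WarpRec computes its 40 metrics batched on GPU; this scalar version only shows what is being measured.

```python
import math

def ndcg_at_k(ranked_items, relevant_items, k):
    """nDCG@k with binary relevance: the DCG of the ranked list,
    normalized by the DCG of an ideal ranking."""
    dcg = sum(
        1.0 / math.log2(rank + 2)  # ranks are 0-based, so position 1 -> log2(2)
        for rank, item in enumerate(ranked_items[:k])
        if item in relevant_items
    )
    ideal_hits = min(len(relevant_items), k)
    idcg = sum(1.0 / math.log2(rank + 2) for rank in range(ideal_hits))
    return dcg / idcg if idcg > 0 else 0.0

# One relevant item at rank 1, one at rank 3: high but imperfect nDCG.
print(ndcg_at_k(["a", "b", "c"], {"a", "c"}, 3))
```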

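The multiple-comparison corrections mentioned in the feature list can also be sketched independently of WarpRec's API. The snippet below implements Bonferroni and Benjamini-Hochberg adjusted p-values in pure Python; it illustrates the procedures, not WarpRec's internal code.

```python
def bonferroni(p_values):
    """Bonferroni: multiply each p-value by the number of tests, cap at 1."""
    m = len(p_values)
    return [min(1.0, p * m) for p in p_values]

def benjamini_hochberg(p_values):
    """Benjamini-Hochberg adjusted p-values (step-up FDR procedure)."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    adjusted = [0.0] * m
    running_min = 1.0
    for rank in range(m - 1, -1, -1):  # walk from the largest p-value down
        i = order[rank]
        running_min = min(running_min, p_values[i] * m / (rank + 1))
        adjusted[i] = running_min
    return adjusted

pvals = [0.01, 0.04, 0.03, 0.005]
print(bonferroni(pvals))          # conservative family-wise control
print(benjamini_hochberg(pvals))  # less conservative FDR control
```

Bonferroni controls the family-wise error rate and is the stricter of the two; BH controls the false discovery rate, which is why it typically rejects more hypotheses on the same p-values.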
⚙️ Installation

WarpRec is designed for easy, reproducible installation via Conda, which manages the Python environment and all dependencies consistently. Environments are provided for both CPU and GPU setups.

📋 Prerequisites

  • Git: To clone the repository.
  • Conda: You need either Anaconda or Miniconda installed on your system.

🛠️ Setup Guide

Follow these steps to clone the project and set up the environment:

  1. Clone the repository Open your terminal and clone the WarpRec repository:

    git clone <repository_url>
    cd warprec
    
  2. Create the Conda environment Use the provided environment.gpu.yml (or environment.cpu.yml) file to create the virtual environment. This will install Python 3.12 and the necessary core dependencies.

    conda env create --file environment.gpu.yml
    
  3. Activate the environment:

    conda activate warprec
    

🚂 Usage

🏋️‍♂️ Training a model

To train a model, use the train pipeline. Here's an example:

  1. Prepare a configuration file (e.g. config/train_config.yml) with details about the model, dataset and training parameters.
  2. Start a Ray HEAD node:
    ray start --head
    
  3. Run the following command:
    python -m warprec.run -c config/train_config.yml -p train
    

This command starts the training process using the specified configuration file.
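A training configuration might look roughly like the sketch below. All key names and values here are illustrative assumptions, not WarpRec's actual schema; consult the documentation for the real configuration options.

```yaml
# Hypothetical sketch only — see the WarpRec docs for the actual schema.
dataset:
  path: data/movielens/ratings.tsv
splitting:
  strategy: temporal_holdout
  test_ratio: 0.2
models:
  SASRec:
    embedding_size: 64
    learning_rate: 0.001
evaluation:
  top_k: 10
  metrics: [nDCG, Recall, ItemCoverage]
```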

✏️ Design a model

To implement a custom model, WarpRec provides a dedicated design interface via the design pipeline. The recommended workflow is as follows:

  1. Prepare a configuration file (e.g. config/design_config.yml) with details about the custom models, dataset and training parameters.
  2. Run the following command:
    python -m warprec.run -c config/design_config.yml -p design
    

This command initializes a lightweight training pipeline, specifically intended for rapid prototyping and debugging of custom architectures within the framework.
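Custom models extend WarpRec's standardized base class; its exact interface is documented in the framework, so the self-contained toy below only illustrates the general fit/recommend shape such a model takes, using an unpersonalized popularity baseline. The class name and method signatures are hypothetical.

```python
class PopularityModel:
    """Toy 'unpersonalized' recommender whose fit/recommend shape loosely
    mirrors what a custom model must provide. (WarpRec's real base-class
    interface is defined in its documentation.)"""

    def fit(self, interactions):
        counts = {}
        for _, item in interactions:
            counts[item] = counts.get(item, 0) + 1
        # Rank items by global popularity (ties keep first-seen order).
        self.ranking = sorted(counts, key=counts.get, reverse=True)
        return self

    def recommend(self, seen_items, k=3):
        # Recommend the k most popular items the user has not yet seen.
        return [i for i in self.ranking if i not in seen_items][:k]

model = PopularityModel().fit(
    [("u1", "a"), ("u2", "a"), ("u2", "b"), ("u3", "c")]
)
print(model.recommend({"a"}, k=2))  # → ['b', 'c']
```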

🔍 Evaluate a model

To run only evaluation on a model, use the eval pipeline. Here's an example:

  1. Prepare a configuration file (e.g. config/eval_config.yml) with details about the model, dataset, and evaluation parameters.
  2. Run the following command:
    python -m warprec.run -c config/eval_config.yml -p eval
    

This command starts the evaluation process using the specified configuration file.

🧰 Makefile Commands

The project includes a Makefile to simplify common operations:

  • 🧹 Run linting:
    make lint
    
  • 🧑‍🔬 Run tests:
    make test
    

🤝 Contributing

We welcome contributions from the community! Whether you're fixing bugs, improving documentation, or proposing new features, your input is highly valued.

To get started:

  1. Fork the repository and create a new branch for your feature or fix.
  2. Follow the existing coding style and conventions.
  3. Make sure the code passes all checks by running make lint.
  4. Open a pull request with a clear description of your changes.

If you encounter any issues or have questions, feel free to open an issue in the Issues section of the repository.

📜 License

This project is licensed under the MIT License - see the LICENSE file for details.

📖 Citation

Citation details will be provided in an upcoming release. Stay tuned!

📧 Contact

For questions or suggestions, feel free to contact us.

Project details


Download files

Download the file for your platform.

Source Distribution

warprec-1.0.0.tar.gz (218.2 kB)

Uploaded Source

Built Distribution


warprec-1.0.0-py3-none-any.whl (383.8 kB)

Uploaded Python 3

File details

Details for the file warprec-1.0.0.tar.gz.

File metadata

  • Download URL: warprec-1.0.0.tar.gz
  • Upload date:
  • Size: 218.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for warprec-1.0.0.tar.gz
Algorithm Hash digest
SHA256 ac92782ff947bafc0b47f8d22a83dc6ebf8bdbb8c89c034ec2cff91c42157e4b
MD5 4ad4e3c8dd9c2d064018bfdfa18357e9
BLAKE2b-256 22765127de73fee0103ec8818a05683b4b0c490bacf8b33b5d5da08ca4cf1892


Provenance

The following attestation bundles were made for warprec-1.0.0.tar.gz:

Publisher: release.yml on sisinflab/warprec

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file warprec-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: warprec-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 383.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for warprec-1.0.0-py3-none-any.whl
Algorithm Hash digest
SHA256 c7a1cb0a6c051b72c9d312282d8df12f1805a8d773c3282ea29bf9f637f49ccd
MD5 c6ab38d2ad19ac6a9963b5a14ec9917c
BLAKE2b-256 ed33c3aa327fb212d12d8755a3e8163952f8e40fe3589a7943489b42cb0a3783


Provenance

The following attestation bundles were made for warprec-1.0.0-py3-none-any.whl:

Publisher: release.yml on sisinflab/warprec

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
