EASEy

A minimal implementation of Embarrassingly Shallow Autoencoders (EASE).

EASE is a state-of-the-art prediction model for collaborative filtering on implicit feedback.

When to use EASE (and when not to)

EASE consistently places near the top of recommender system benchmarks (see live benchmark) and outperforms many deep learning and graph-based approaches (see paper).

EASE works best when the item catalog is small, because the most computationally expensive part of training is inverting an item x item co-occurrence (Gram) matrix - roughly cubic in the number of items. The good news is that this cost is independent of the number of users or interactions.
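To make that cost concrete: the entire training step is a closed-form computation. Below is a minimal sketch in numpy/scipy - it uses scipy's single-threaded Gram matrix product for brevity rather than sparse_dot_mkl, and the fit_ease helper is a hypothetical name for illustration, not this package's API:

```python
import numpy as np
from scipy.sparse import csr_matrix

def fit_ease(X, lam=500.0):
    """Closed-form EASE training (sketch).

    X:   sparse user x item interaction matrix (implicit feedback, 0/1).
    lam: L2 regularization strength (the lambda hyperparameter).
    Returns the dense item x item weight matrix B with a zero diagonal.
    """
    G = (X.T @ X).toarray().astype(np.float64)  # item x item Gram matrix
    G[np.diag_indices_from(G)] += lam           # add lambda to the diagonal
    P = np.linalg.inv(G)                        # the cubic-cost inversion step
    B = P / (-np.diag(P))                       # B[i, j] = -P[i, j] / P[j, j]
    B[np.diag_indices_from(B)] = 0.0            # zero out self-similarity
    return B
```

The `np.linalg.inv` call on the item x item matrix is the expensive step described above; everything else scales with the number of non-zero interactions.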

EASE also doesn't use item or user features the way more complex models do - it learns from interactions only.

Given these two constraints, EASE is a great tool for:

  • Standalone prediction - Raw EASE scores are highly predictive
  • Candidate generation - Limit the item space to a set of relevant candidates per user
  • Feature engineering - EASE scores can be used in downstream models (e.g., a classification GBM)

Installation

EASEy depends on sparse_dot_mkl and numpy. sparse_dot_mkl is used for parallel computation of the Gram matrix (X^TX), because the scipy implementation is single-threaded, which quickly becomes a bottleneck.

Installing sparse_dot_mkl with conda is recommended because it ensures that MKL is linked properly. If you use conda, you likely already have MKL, since conda's default numpy builds link against it.
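A typical setup might look like the following - this assumes the package is published on PyPI under the name easey and that sparse_dot_mkl is available from your conda channel (check conda-forge if it isn't in defaults):

```shell
# MKL-linked dependency via conda (recommended)
conda install -c conda-forge sparse_dot_mkl

# the package itself
pip install easey
```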

Usage

EASEy is compatible with both pandas and polars DataFrames. Technically it's compatible with any object whose array-like values are accessible with bracket ([]) indexing, even a basic dict. The EASE class has two public methods - fit and predict - for training and inference, respectively.
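For intuition about what predict computes: inference reduces to multiplying the user x item interaction matrix by the learned item x item weights, then masking already-seen items. A minimal numpy sketch (the recommend helper and its signature are illustrative, not this package's API):

```python
import numpy as np

def recommend(X, B, k=10):
    """Rank unseen items per user from raw EASE scores X @ B (sketch).

    X: dense 0/1 user x item interaction matrix.
    B: item x item weight matrix from training.
    Returns the top-k item indices per user.
    """
    X = np.asarray(X, dtype=np.float64)
    scores = X @ B                      # raw EASE scores
    scores[X > 0] = -np.inf             # mask items the user has already seen
    return np.argsort(-scores, axis=1)[:, :k]
```

The raw scores (before masking and ranking) are what you would feed into a downstream model when using EASE for feature engineering.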

EASE has only one hyperparameter, lambda, for L2 regularization. In the original paper, values from 200 to 1,000 were found to be optimal. Lower values lead to more long-tail recommendations at the expense of possible overfitting. Higher values lead to recommending more popular items.

See movielens_example.ipynb for a simple training and inference example.
