EASEy
An implementation of Embarrassingly Shallow Autoencoders (EASE).
EASE is a state-of-the-art prediction model for collaborative filtering on implicit feedback.
When to use EASE and when not to use EASE
EASE consistently performs near the top of recommender system benchmarks (see live benchmark) and outperforms many deep-learning and graph-based approaches (see paper).
EASE is best when the number of items is small: the most computationally expensive part of training is inverting an item x item co-occurrence matrix, which scales with the cube of the item count. The good news is that this cost is independent of the number of users or interactions.
Unlike more complex models, EASE also doesn't use any item or user features - it learns from interactions alone.
Given these two constraints, EASE is a great tool for:
- Standalone prediction - Raw EASE scores are highly predictive
- Candidate generation - Limit the item space to a set of relevant candidates per user
- Feature engineering - EASE scores can be used in downstream models (e.g., a classification GBM)
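To make the item x item inverse concrete, here is a minimal numpy sketch of the closed-form solve from the EASE paper. This is an illustration of the algorithm, not EASEy's actual implementation (function name and toy data are made up):

```python
import numpy as np

def ease_weights(X, lam=500.0):
    """Closed-form EASE: invert the regularized item x item Gram matrix.

    X: (n_users, n_items) binary interaction matrix.
    lam: L2 regularization strength (lambda in the paper).
    """
    G = X.T @ X                        # item x item co-occurrence (Gram) matrix
    G[np.diag_indices_from(G)] += lam  # add lambda to the diagonal
    P = np.linalg.inv(G)               # the O(n_items^3) step - independent of n_users
    B = P / (-np.diag(P))              # B[i, j] = -P[i, j] / P[j, j]
    B[np.diag_indices_from(B)] = 0.0   # zero diagonal: an item may not predict itself
    return B

# Toy example: 4 users, 3 items.
X = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]], dtype=float)
B = ease_weights(X, lam=10.0)
scores = X @ B  # raw EASE scores for every (user, item) pair
```

Note that the expensive inverse only ever involves the (n_items, n_items) matrix, which is why the item count, not the user or interaction count, is the binding constraint.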
Installation
EASEy depends on sparse_dot_mkl and numpy. sparse_dot_mkl is used for parallel computation of the Gram matrix (X^TX), because the scipy implementation is single-threaded and quickly becomes a bottleneck.
It is recommended to install sparse_dot_mkl with conda, because this ensures that MKL is linked properly. If you use conda, you likely already have MKL installed, since conda's default numpy builds are linked against it.
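Putting the above together, a typical install might look like the following (the conda-forge channel for sparse_dot_mkl is an assumption; the PyPI package name easey is taken from the distribution files below):

```shell
# Install the MKL-linked dependency via conda so MKL is resolved correctly
conda install -c conda-forge sparse_dot_mkl
# Then install the package itself
pip install easey
```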
Usage
EASEy is compatible with both pandas and polars DataFrames - technically, with any object whose array-like values are accessible with index [] syntax, even a basic dict. The EASE class has two public methods, fit and predict, for training and inference respectively.
EASE has only one hyperparameter, lambda, for L2 regularization. In the original paper, values from 200 to 1,000 were found to be optimal. Lower values lead to more long-tail recommendations at the expense of possible overfitting. Higher values lead to recommending more popular items.
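Inference in EASE-style models is just a matrix product followed by masking already-seen items. The sketch below shows that generic ranking step in numpy; the weight matrix is made up for illustration, and this is not EASEy's actual predict implementation:

```python
import numpy as np

# Hypothetical trained item-item weight matrix for 4 items
# (in practice this comes from fitting EASE; the values here are invented).
B = np.array([[0.0, 0.6, 0.1, 0.2],
              [0.5, 0.0, 0.3, 0.1],
              [0.1, 0.4, 0.0, 0.7],
              [0.2, 0.1, 0.6, 0.0]])

# One user's implicit-feedback history: interacted with items 0 and 2.
x = np.array([1.0, 0.0, 1.0, 0.0])

scores = x @ B                   # raw EASE scores for every item
scores[x > 0] = -np.inf          # mask items the user has already seen
top_k = np.argsort(-scores)[:2]  # recommend the 2 highest-scoring unseen items
```

The raw `scores` vector is what you would feed into a downstream model for feature engineering; `top_k` is the candidate-generation use case.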
See movielens_example.ipynb for a simple training and inference example.
File details
Details for the file easey-0.2.0.tar.gz.
File metadata
- Download URL: easey-0.2.0.tar.gz
- Upload date:
- Size: 4.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 6f0a1f7ef86bc49f2acaabfa367f730cbcce821cf1f90cb6775ad87443cb0e5a |
| MD5 | e48912da7aa70edd23fef4330d3e8dcd |
| BLAKE2b-256 | dfbe4478f1690c8e613be6f5b8e93161948060391ba7728f7891b1a4b388f523 |
File details
Details for the file easey-0.2.0-py3-none-any.whl.
File metadata
- Download URL: easey-0.2.0-py3-none-any.whl
- Upload date:
- Size: 4.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 5867f72d6495e3a1e8fe8a67449c393f2f3392b8d12e6706b51b546aeb33ed07 |
| MD5 | 2159eeaf45e70fbc6b53692aa5beeb69 |
| BLAKE2b-256 | 98ca74653a435c5b8b898fc745c826c5ddd9c8b1657a22b164133674a209b431 |