
Implementations of various interpretable models


Interpretable machine-learning models (imodels) 🔍

Python package for concise, transparent, and accurate predictive modeling. All sklearn-compatible and easy to use.

docs • imodels overview • demo notebooks

imodels overview

Modern machine-learning models are increasingly complex, which often makes them difficult to interpret. This package provides a simple interface for fitting and using state-of-the-art interpretable models, all compatible with scikit-learn. With this package, black-box models (e.g. random forests) can often be replaced with simpler models (e.g. rule lists) that improve interpretability and computational efficiency, all without sacrificing predictive accuracy! Simply import a classifier or regressor and use the fit and predict methods, just as with standard scikit-learn models.

from imodels import BoostedRulesClassifier, BayesianRuleListClassifier, GreedyRuleListClassifier, SkopeRulesClassifier # see more models below
from imodels import SLIMRegressor, RuleFitRegressor

model = BoostedRulesClassifier()  # initialize a model
model.fit(X_train, y_train)   # fit model
preds = model.predict(X_test) # discrete predictions: shape is (n_test, 1)
preds_proba = model.predict_proba(X_test) # predicted probabilities: shape is (n_test, n_classes)
print(model) # print the rule-based model

-----------------------------
# the model consists of the following 3 rules
# if X1 > 5: then 80.5% risk
# else if X2 > 5: then 40% risk
# else: 10% risk
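
For a fully self-contained run, a sketch along the following lines should work (the dataset and estimator choices here are illustrative, not prescribed by the package):

# a minimal, hypothetical end-to-end example
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from imodels import RuleFitClassifier

X, y = load_breast_cancer(return_X_y=True)  # any tabular dataset works
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RuleFitClassifier()        # any imodels estimator fits this pattern
model.fit(X_train, y_train)        # standard sklearn-style fit
preds = model.predict(X_test)      # standard sklearn-style predict
print((preds == y_test).mean())    # test accuracy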

Installation

Install with pip install imodels (see the docs for help).

Supported models

| Model | Reference | Description |
| --- | --- | --- |
| Rulefit rule set | 🗂️, 🔗, 📄 | Extracts rules from decision trees, then fits a sparse linear model over them |
| Skope rule set | 🗂️, 🔗 | Extracts rules from gradient-boosted trees, deduplicates them, then forms a linear combination of them based on their OOB precision |
| Boosted rule set | 🗂️, 🔗, 📄 | Sequentially learns a set of rules with AdaBoost |
| Slipper rule set | 🗂️, 📄 | Sequentially learns a set of rules with SLIPPER |
| Bayesian rule set | 🗂️, 🔗, 📄 | Finds a concise rule set with Bayesian sampling (slow) |
| Optimal rule list | 🗂️, 🔗, 📄 | Learns a succinct rule list using global optimization for sparsity (CORELS) |
| Bayesian rule list | 🗂️, 🔗, 📄 | Learns a compact rule-list distribution with Bayesian sampling (slow) |
| Greedy rule list | 🗂️, 🔗 | Uses CART to learn a list (only a single path), rather than a full decision tree |
| OneR rule list | 🗂️, 📄 | Learns a rule list restricted to a single feature |
| Optimal rule tree | 🗂️, 🔗, 📄 | Learns a succinct tree using global optimization for sparsity (GOSDT) |
| Greedy rule tree | 🗂️, 🔗, 📄 | Greedily learns a tree using CART |
| Iterative random forest | 🗂️, 🔗, 📄 | (In progress) Repeatedly fits a random forest, giving features with high importance a higher chance of being selected |
| Sparse integer linear model | 🗂️, 📄 | Sparse linear model with integer coefficients |
| More models | ⌛ | (Coming soon!) Popular rule sets including Lightweight Rule Induction and MLRules |

Docs 🗂️ • Reference code implementation 🔗 • Research paper 📄

Also see our simple function for explaining classification errors: fit an interpretable model to explain a previous model's errors (e.g. in this notebook 📓).
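
A minimal sketch of the idea, assuming a generic black-box model (the dataset and model choices are illustrative; the package's helper wraps this pattern):

# sketch: fit an interpretable model to a black-box model's mistakes
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from imodels import GreedyRuleListClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

blackbox = RandomForestClassifier(random_state=0).fit(X_train, y_train)
errors = (blackbox.predict(X_test) != y_test).astype(int)  # 1 = misclassified

error_model = GreedyRuleListClassifier()  # rules describing where errors occur
error_model.fit(X_test, errors)
print(error_model)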
Also see our fast and effective discretizers for data preprocessing.

| Discretizer | Reference | Description |
| --- | --- | --- |
| MDLP | 🗂️, 🔗, 📄 | Discretizes using an entropy-minimization heuristic |
| Simple | 🗂️, 🔗 | Simple KBins discretization |
| Random Forest | 🗂️ | Discretizes into bins based on random-forest split popularity |
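
As a rough picture of what the Simple discretizer does, KBins-style discretization in plain scikit-learn looks like this (a sketch of the concept, not imodels' own API):

# KBins-style discretization with scikit-learn
import numpy as np
from sklearn.preprocessing import KBinsDiscretizer

X = np.array([[0.1], [0.4], [1.2], [3.5], [7.9], [9.0]])
disc = KBinsDiscretizer(n_bins=3, encode='ordinal', strategy='quantile')
X_binned = disc.fit_transform(X)  # each value mapped to a bin index 0..2
print(X_binned.ravel())           # [0. 0. 1. 1. 2. 2.]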

Each of the above models ultimately takes one of the following forms, which aim to be simultaneously simple to understand and highly predictive:

Rule set • Rule list • Rule tree • Algebraic models

Different models and algorithms vary not only in their final form but also in the choices made during modeling. In particular, many models differ in the three steps below:

  • Rule candidate generation
  • Rule selection
  • Rule pruning / combination

Ex. RuleFit and SkopeRules differ only in the way they prune rules: RuleFit uses a linear model, whereas SkopeRules heuristically deduplicates rules that share overlap.
Ex. Bayesian rule lists and greedy rule lists differ in how they select rules: Bayesian rule lists perform a global optimization over possible rule lists, while greedy rule lists pick splits sequentially to maximize a given criterion.
Ex. FPSkope and SkopeRules differ only in the way they generate candidate rules: FPSkope uses FPGrowth, whereas SkopeRules extracts rules from decision trees (as sketched below).

See the docs for individual models for further descriptions.
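
To make the candidate-generation step concrete, here is a minimal sketch of extracting root-to-leaf rules from a fitted scikit-learn decision tree (illustrative only; imodels' own utilities implement this more carefully):

# extract root-to-leaf rules from a fitted sklearn decision tree
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

def extract_rules(fitted_tree, feature_names=None):
    t = fitted_tree.tree_
    rules = []

    def recurse(node, conditions):
        if t.children_left[node] == -1:  # leaf: emit the accumulated rule
            rules.append(' and '.join(conditions) or 'True')
            return
        name = (feature_names[t.feature[node]] if feature_names
                else f'X{t.feature[node]}')
        thr = t.threshold[node]
        recurse(t.children_left[node], conditions + [f'{name} <= {thr:.3f}'])
        recurse(t.children_right[node], conditions + [f'{name} > {thr:.3f}'])

    recurse(0, [])
    return rules

for rule in extract_rules(tree):
    print(rule)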

The util folder contains many useful and customizable functions for rule-based learning, including functions and classes for rule deduplication, rule screening, and converting between trees, rule sets, and neural networks. One example is sketched below.
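
For instance, rule screening can be sketched as filtering candidate rules by their empirical precision (a conceptual sketch with hypothetical names, not the package's exact util API):

# conceptual sketch of rule screening by precision
import pandas as pd

def screen_rules(rules, X_df, y, min_precision=0.8):
    """Keep rules whose empirical precision on (X_df, y) meets a threshold.

    rules: pandas-query strings, e.g. 'X1 > 5 and X2 <= 3'
    X_df: DataFrame of features; y: binary labels aligned with X_df
    """
    kept = []
    for rule in rules:
        covered = X_df.query(rule).index       # samples satisfying the rule
        if len(covered) == 0:
            continue                           # rule covers nothing; skip it
        precision = y.loc[covered].mean()      # positive rate among covered
        if precision >= min_precision:
            kept.append((rule, precision))
    return kept

X_df = pd.DataFrame({'X1': [1, 6, 7, 2], 'X2': [3, 8, 9, 1]})
y = pd.Series([0, 1, 1, 0])
print(screen_rules(['X1 > 5', 'X2 <= 3'], X_df, y))  # [('X1 > 5', 1.0)]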

Demo notebooks

Demos are contained in the notebooks folder.

  • imodels demo: shows how to fit, predict, and visualize with different interpretable models
  • imodels colab demo: the same demo, in a version that runs on Google Colab
  • clinical decision rule notebook: shows an example of using imodels to derive a clinical decision rule
  • posthoc analysis: demos of analyses that occur after fitting models; posthoc.ipynb shows different simple analyses to interpret a trained model, and uncertainty.ipynb contains basic code to get uncertainty estimates for a model

Support for different tasks

Different models support different machine-learning tasks. Current support for each model is given below; each of these models can be imported directly from imodels (e.g. from imodels import RuleFitClassifier):

| Model | Binary classification | Regression |
| --- | --- | --- |
| Rulefit rule set | RuleFitClassifier | RuleFitRegressor |
| Skope rule set | SkopeRulesClassifier | |
| Boosted rule set | BoostedRulesClassifier | |
| SLIPPER rule set | SlipperClassifier | |
| BOA rule set | BayesianRuleSetClassifier | |
| Optimal rule list (CORELS) | OptimalRuleListClassifier | |
| Bayesian rule list | BayesianRuleListClassifier | |
| Greedy rule list | GreedyRuleListClassifier | |
| OneR rule list | OneRClassifier | |
| Optimal rule tree (GOSDT) | OptimalTreeClassifier | |
| Greedy rule tree | GreedyTreeClassifier | |
| Iterative random forest | | |
| Sparse integer linear model | SLIMClassifier | SLIMRegressor |
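
Regressors follow the same pattern as classifiers (a sketch; the dataset choice is illustrative):

# regression with an imodels estimator
from sklearn.datasets import load_diabetes
from imodels import RuleFitRegressor

X, y = load_diabetes(return_X_y=True)
reg = RuleFitRegressor().fit(X, y)  # standard sklearn-style fit
print(reg.predict(X[:5]))           # continuous predictions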

References

Readings
  • Interpretable ML good quick overview: murdoch et al. 2019, pdf
  • Interpretable ML book: molnar 2019, pdf
  • Case for interpretable models rather than post-hoc explanation: rudin 2019, pdf
  • Review on evaluating interpretability: doshi-velez & kim 2017, pdf
Reference implementations (also linked above): the code here heavily derives from the wonderful work of previous projects. We seek to extract out, unify, and maintain key parts of these projects.
Related packages
Updates
  • For updates, star the repo, see this related repo, or follow @csinva_
  • Please make sure to give authors of original methods / base implementations appropriate credit!
  • Contributing: pull requests very welcome!

If it's useful for you, please star/cite the package:

@software{
    imodels2021,
    title        = {{imodels: a python package for fitting interpretable models}},
    journal      = {Journal of Open Source Software},
    publisher    = {The Open Journal},
    year         = {2021},
    author       = {Singh, Chandan and Nasseri, Keyan and Tan, Yan Shuo and Tang, Tiffany and Yu, Bin},
    volume       = {6},
    number       = {61},
    pages        = {3192},
    doi          = {10.21105/joss.03192},
    url          = {https://doi.org/10.21105/joss.03192},
}
