A PyTorch Implementation Of Lattice Modeling Techniques
Getting Started with PyTorch Lattice
A PyTorch implementation of constrained optimization and modeling techniques
- Transparent Models: Glassbox models that provide increased interpretability and insight into your ML models.
- Shape Constraints: Embed domain knowledge directly into the model through feature constraints.
- Rate Constraints (Coming soon...): Optimize any PyTorch model under a set of constraints on rates (e.g. FPR < 1%). Rates can be calculated for the entire dataset as well as for specific slices.
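The calibrator and shape-constraint ideas above can be illustrated outside the library. Below is a minimal sketch (plain Python, not the library's actual implementation) of a monotonic piecewise-linear calibrator: monotonicity is guaranteed by construction, because output keypoints are a cumulative sum of non-negative increments.

```python
import bisect

def make_monotonic_calibrator(input_keypoints, raw_heights):
    """Build a monotonically increasing piecewise-linear calibrator.

    `input_keypoints` are sorted x-positions; `raw_heights` are the learned
    (possibly negative) segment heights between consecutive keypoints.
    Projecting each height to >= 0 before accumulating enforces the
    monotonicity shape constraint by construction.
    """
    increments = [max(h, 0.0) for h in raw_heights]  # project to >= 0
    outputs = [0.0]
    for inc in increments:
        outputs.append(outputs[-1] + inc)  # cumulative sum: never decreases

    def calibrate(x):
        # Clamp outside the keypoint range; interpolate linearly inside.
        if x <= input_keypoints[0]:
            return outputs[0]
        if x >= input_keypoints[-1]:
            return outputs[-1]
        i = bisect.bisect_right(input_keypoints, x) - 1
        t = (x - input_keypoints[i]) / (input_keypoints[i + 1] - input_keypoints[i])
        return outputs[i] + t * (outputs[i + 1] - outputs[i])

    return calibrate
```

For example, with keypoints `[0, 1, 2]` and raw heights `[1.0, -0.5]`, the negative second height is projected to zero, so the calibrator rises from 0 to 1 on the first segment and stays flat on the second.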
Installation
Install PyTorch Lattice and start training and analyzing calibrated models in minutes.
$ pip install pytorch-lattice
Quickstart
Step 1. Import the package
First, import the PyTorch Lattice library:
import pytorch_lattice as pyl
Step 2. Load data and fit a classifier
Load the UCI Statlog (Heart) dataset. Then create a base classifier and fit it to the data. Creating the base classifier requires only the feature names.
X, y = pyl.datasets.heart()
clf = pyl.Classifier(X.columns).fit(X, y)
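Under the hood, a calibrated lattice model feeds each calibrated feature into an interpolated look-up table (the "lattice"). Below is a minimal sketch of 2D multilinear interpolation over unit-spaced vertices (plain Python, for illustration only; not the library's actual code):

```python
def interpolate_2d(lattice, x, y):
    """Interpolate a 2D look-up table at point (x, y).

    `lattice[i][j]` holds the learned parameter at integer vertex (i, j);
    inputs are clamped into the lattice's domain [0, rows-1] x [0, cols-1].
    """
    rows, cols = len(lattice), len(lattice[0])
    x = min(max(x, 0.0), rows - 1)
    y = min(max(y, 0.0), cols - 1)
    i, j = min(int(x), rows - 2), min(int(y), cols - 2)
    tx, ty = x - i, y - j  # fractional position inside the cell
    # Blend the four surrounding vertices (multilinear interpolation).
    return ((1 - tx) * (1 - ty) * lattice[i][j]
            + tx * (1 - ty) * lattice[i + 1][j]
            + (1 - tx) * ty * lattice[i][j + 1]
            + tx * ty * lattice[i + 1][j + 1])
```

Because the output is a linear blend of the vertex parameters, constraints such as monotonicity can be imposed by constraining the vertex parameters themselves (e.g. requiring them to increase along a dimension).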
Step 3. Plot a feature calibrator
Now that you've trained a classifier, you can plot the feature calibrators to better understand how the model interprets each feature.
pyl.plots.calibrator(clf.model, "thal")
Step 4. What's Next?
- Check out the Concepts section to dive deeper into the library and the core features that make it powerful, such as calibrators and shape constraints.
- Follow along with more detailed walkthroughs to better understand how to use the library to model your data effectively. You can also take a look at code examples in the repo.
- The API Reference contains full details on all classes, methods, functions, etc.
Related Research
- Monotonic Kronecker-Factored Lattice, William Taylor Bakst, Nobuyuki Morioka, Erez Louidor, International Conference on Learning Representations (ICLR), 2021
- Multidimensional Shape Constraints, Maya Gupta, Erez Louidor, Oleksandr Mangylov, Nobu Morioka, Taman Narayan, Sen Zhao, Proceedings of the 37th International Conference on Machine Learning (PMLR), 2020
- Deontological Ethics By Monotonicity Shape Constraints, Serena Wang, Maya Gupta, International Conference on Artificial Intelligence and Statistics (AISTATS), 2020
- Shape Constraints for Set Functions, Andrew Cotter, Maya Gupta, H. Jiang, Erez Louidor, Jim Muller, Taman Narayan, Serena Wang, Tao Zhu, International Conference on Machine Learning (ICML), 2019
- Diminishing Returns Shape Constraints for Interpretability and Regularization, Maya Gupta, Dara Bahri, Andrew Cotter, Kevin Canini, Advances in Neural Information Processing Systems (NeurIPS), 2018
- Deep Lattice Networks and Partial Monotonic Functions, Seungil You, Kevin Canini, David Ding, Jan Pfeifer, Maya R. Gupta, Advances in Neural Information Processing Systems (NeurIPS), 2017
- Fast and Flexible Monotonic Functions with Ensembles of Lattices, Mahdi Milani Fard, Kevin Canini, Andrew Cotter, Jan Pfeifer, Maya Gupta, Advances in Neural Information Processing Systems (NeurIPS), 2016
- Monotonic Calibrated Interpolated Look-Up Tables, Maya Gupta, Andrew Cotter, Jan Pfeifer, Konstantin Voevodski, Kevin Canini, Alexander Mangylov, Wojciech Moczydlowski, Alexander van Esbroeck, Journal of Machine Learning Research (JMLR), 2016
- Optimized Regression for Efficient Function Evaluation, Eric Garcia, Raman Arora, Maya R. Gupta, IEEE Transactions on Image Processing, 2012
- Lattice Regression, Eric Garcia, Maya Gupta, Advances in Neural Information Processing Systems (NeurIPS), 2009
Contributing
PyTorch Lattice welcomes contributions from the community! See the contribution guide for more information on the development workflow. For bugs and feature requests, visit our GitHub Issues and check out our templates.
How To Help
Any and all help is greatly appreciated! Check out our page on how you can help.
Roadmap
Check out our roadmap to see what's planned. If there's an item you really want that isn't assigned or in progress, take a stab at it!
Versioning
PyTorch Lattice uses Semantic Versioning.
License
This project is licensed under the terms of the MIT License.
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file pytorch_lattice-0.2.0.tar.gz
File metadata
- Download URL: pytorch_lattice-0.2.0.tar.gz
- Upload date:
- Size: 33.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.10.13 Linux/6.2.0-1019-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | ddbc9c8fdf76bf35ddd3037ac506aef4e431608e8d52110b363eb44a7bb79bf4
MD5 | b9976362d1054b480017c760a5f30f1a
BLAKE2b-256 | 0c071ed72d8efd337564d5fa17f1df4ec8c61dbdc17ae753b921121e86cf64d8
File details
Details for the file pytorch_lattice-0.2.0-py3-none-any.whl
File metadata
- Download URL: pytorch_lattice-0.2.0-py3-none-any.whl
- Upload date:
- Size: 42.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.10.13 Linux/6.2.0-1019-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | 50bcfed886fc1e1f9e610f3b4de2087b84e11e5698333e4c075189a5ea98952e
MD5 | 93bc3a2e1862cca00833856f552219a6
BLAKE2b-256 | dd2b202d0f580c35b98d7d8c8c32a10239b1dce8d53807b15fb007f517c07ea8