GAM (Global Attribution Mapping)
Global Explanations for Deep Neural Networks
GAM explains the landscape of neural network predictions across subpopulations.
This implementation is based on "Global Explanations for Neural Networks: Mapping the Landscape of Predictions" (AAAI/ACM AIES 2019).
Installation
python3 -m pip install gam
Get Started
First generate local attributions using your favorite technique and save them as a CSV (a sketch of producing such a file follows the example below), then:
>>> from gam.gam import GAM
>>> # for a quick example use `attributions_path="tests/test_attributes.csv"`
>>> # Input/Output: csv (columns: features, rows: local/global attribution)
>>> gam = GAM(attributions_path="<path_to_your_attributes>.csv", distance="spearman", k=2)
>>> gam.generate()
>>> gam.explanations
[[('height', .6), ('weight', .3), ('hair color', .1)],
[('weight', .9), ('height', .05), ('hair color', .05)]]
>>> gam.subpopulation_sizes
[90, 10]
>>> gam.subpopulations
# global explanation assignment
[0, 1, 0, 0,...]
>>> gam.plot()
# bar chart of feature importance with subpopulation size
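The CSV passed to `attributions_path` has one column per feature and one row of local attributions per sample, as noted in the comments above. The sketch below is one illustrative way to produce such a file; the data, feature names, and linear model are assumptions for this example only, and any local attribution technique (e.g. SHAP values or integrated gradients for a neural network) can be substituted.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Illustrative data reusing the feature names from the example above
rng = np.random.default_rng(0)
X = pd.DataFrame({
    "height": rng.normal(170, 10, size=100),
    "weight": rng.normal(70, 15, size=100),
    "hair color": rng.integers(0, 3, size=100).astype(float),
})
y = 0.5 * X["height"] + 2.0 * X["weight"] + rng.normal(0, 1, size=100)

model = LinearRegression().fit(X, y)

# For a linear model, coefficient * (feature value - feature mean) is a
# simple per-sample attribution of each feature's contribution
attributions = model.coef_ * (X - X.mean())

# One column per feature, one row per local attribution
attributions.to_csv("local_attributions.csv", index=False)
The resulting file can then be supplied to GAM via `attributions_path="local_attributions.csv"` exactly as in the session above.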
Tests
To run tests:
$ python -m pytest tests/
Contributors
We welcome Your interest in Capital One’s Open Source Projects (the “Project”). Any Contributor to the Project must accept and sign an Agreement indicating agreement to the license terms below. Except for the license granted in this Agreement to Capital One and to recipients of software distributed by Capital One, You reserve all right, title, and interest in and to Your Contributions; this Agreement does not impact Your rights to use Your own Contributions for any other purpose.
Code of Conduct
This project adheres to the Open Code of Conduct. By participating, you are expected to honor this code.