Project description

LAS

This package is a lightweight wrapper toolkit built on top of two explanation packages, LIME and SHAP. It provides two explainers, LIMEBAG and SHAP, which take data and a fitted model as input and return explanations of feature importance ranks and/or weights (i.e., which attributes matter most to the prediction model).
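Assuming the package installs from PyPI under the name used in the release files below (pip install LASExplanation), a minimal usage sketch might look like the following. The import path and the LIMEBAG constructor and explain call are illustrative assumptions; only the explainer names and the data-plus-fitted-model interface are documented here.

    # Minimal sketch (assumed API): hand training data and a fitted model to the
    # explainer, read back feature-importance ranks/weights for the test points.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    from LASExplanation import LIMEBAG  # import path is an assumption

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

    # Hypothetical constructor and method; see rq1.py / rq2.py below for the
    # documented demo entry points.
    explainer = LIMEBAG(X_train, y_train, model)
    ranks, weights = explainer.explain(X_test)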

rq1.py

This demo runs LIMEBAG on a default dataset and generates explanations of feature importance ranks and weights for every test data point. It can be called via LIMEBAG.demo1(), as shown below.
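A minimal way to run this demo, assuming the LIMEBAG module imports from the LASExplanation package (only the call LIMEBAG.demo1() is documented here):

    # Run the rq1.py demo: LIMEBAG explanations on the bundled default dataset.
    from LASExplanation import LIMEBAG  # import path is an assumption

    LIMEBAG.demo1()  # prints feature-importance ranks and weights per test point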

rq2.py

This demo feeds the explanations returned by LIMEBAG into an effect size test and outputs a summary of feature importance ranks and weights. It can be called via LIMEBAG.demo2(), as shown below.
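The corresponding call for this demo, under the same import assumption (only LIMEBAG.demo2() is documented here):

    # Run the rq2.py demo: effect-size test over the LIMEBAG explanations.
    from LASExplanation import LIMEBAG  # import path is an assumption

    LIMEBAG.demo2()  # prints a summary of feature-importance ranks and weights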

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Files for LASExplanation, version 0.0.1

Filename                                 Size      File type   Python version
LASExplanation-0.0.1-py3-none-any.whl    30.0 kB   Wheel       py3
LASExplanation-0.0.1.tar.gz              12.7 kB   Source      None
