Approximate Text Explanation
Project description
Approximate Text Explanation (ATE)
Transforms TensorFlow text classification models into locally interpretable surrogate models that explain the base model's decisions via token effects.
Related work and base for this idea:
- Paper: "Why Should I Trust You?": Explaining the Predictions of Any Classifier by Marco Tulio Ribeiro, Sameer Singh, Carlos Guestrin
- GitHub: marcotcr/lime
Approximate Local Text Explanation (ALTE)
"Approximate Local Text Explanation" is based on the LIME method. The goal of the approach is the derivation of effects or influences of the input components (textual data) on the respective output (classification label). This is a local explanation procedure in which a single input data point is analyzed. The components of this input data point (token) are activated or deactivated by permutations of a binary vector of the same size as the number of components of the input data point. All permutations are classified by the original classification model and stored in a meta dataset. This meta dataset is then used to train a linear classification model, thus linearly approximating the original classification function. Since the computation of all permutations of the components of the input data point is very computationally expensive, the permutation upper bound, the permutation repetitions and the epochs of the linear model can be defined via configuration parameters. In addition, the permutation process is iteratively repeated and the permutations are randomized, which also makes it possible to refine the linear model in the long run.
Approximate Global Text Explanation (AGTE)
"Approximate Global Text Explanation" is also based on the LIME method. The goal of the approach is to infer effects or influences of the input components (textual data) of all input data points in the data set on the respective output (classification label). This is a global explanation procedure in which the effects of all input data points are analyzed. AGTE uses the approach of ALTE for this and extends it with an N-fold execution. Accordingly, several linear models are trained which represent the original classification function at different points. The more data points exist and the more computational capacity is available (i.e. the more permutations can be calculated and classified), the better the approximation of the original model. In addition, a pipeline is implemented, with the help of which the effect representation (token and its effect on certain classes) can be converted into a decision rule set (DecisionTree). MIT License
Copyright (c) 2023 Tjark Prokoph
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
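To make the AGTE procedure described above concrete, here is a minimal, self-contained sketch of the idea: a LIME-style local surrogate is fitted per data point, and the resulting token effects are aggregated across the data set into global effects. The function names, the toy classifier, and the mean-aggregation step are illustrative assumptions, not the tf-ate API (which additionally offers the DecisionTree rule-set pipeline).

```python
from collections import defaultdict

import numpy as np

def local_effects(text, classify, n_samples=100):
    """LIME-style local surrogate: least-squares fit of label ~ token mask."""
    rng = np.random.default_rng(0)
    tokens = text.split()
    masks = rng.integers(0, 2, size=(n_samples, len(tokens)))
    masks[0] = 1  # always include the unperturbed input
    y = np.array([
        classify(" ".join(t for t, keep in zip(tokens, m) if keep))
        for m in masks
    ])
    X = np.column_stack([masks, np.ones(n_samples)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return dict(zip(tokens, coef[:len(tokens)]))

def agte_effects(texts, classify):
    """Global explanation: average each token's local effects over the data set."""
    sums, counts = defaultdict(float), defaultdict(int)
    for text in texts:  # N-fold execution: one local surrogate per data point
        for tok, eff in local_effects(text, classify).items():
            sums[tok] += eff
            counts[tok] += 1
    return {tok: sums[tok] / counts[tok] for tok in sums}

# Toy "model": the score is 1 exactly when the word "good" is present.
clf = lambda s: 1.0 if "good" in s.split() else 0.0
glob = agte_effects(
    ["a good movie", "good plot bad pacing", "a dull film"], clf
)
```

With the toy classifier, "good" receives a high global effect while tokens that never change the prediction stay near zero; the global picture improves with more data points and more sampled permutations per point.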
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file tf-ate-1.0.5.tar.gz.
File metadata
- Download URL: tf-ate-1.0.5.tar.gz
- Upload date:
- Size: 6.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `2352e50cb2186f372565dbcdc0f7f9be9335a3948e727243f48c988cdd07f689` |
| MD5 | `43510779e987523b8f1416552739cee8` |
| BLAKE2b-256 | `e070d07de8382688e6727bd56885bb4d51edbf1079240801c5a02b3a3d3230ef` |
File details
Details for the file tf_ate-1.0.5-py3-none-any.whl.
File metadata
- Download URL: tf_ate-1.0.5-py3-none-any.whl
- Upload date:
- Size: 6.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.9
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
55c18f95f6596029590de9349aa15f04631eb2086df03db69260da7c5595f268
|
|
| MD5 |
5b21edf9b84309849f896c2e9474529a
|
|
| BLAKE2b-256 |
17d4a60813c581e70cd5208c7cf8171d6b2af2ebe6b02e3f5de5c5998cd1665d
|