python-trustyai
Python bindings to TrustyAI's explainability library.
Setup
PyPI
Install from PyPI with:
pip install trustyai
Local
The minimum dependencies can be installed with
pip install -r requirements.txt
If running the examples or developing, also install the development dependencies:
pip install -r requirements-dev.txt
Docker
Alternatively, build a container image and run it with:
$ docker build -f Dockerfile -t ruivieira/python-trustyai:latest .
$ docker run --rm -it -p 8888:8888 ruivieira/python-trustyai:latest
The Jupyter server will be available at localhost:8888.
Binder
You can also run the example Jupyter notebooks using mybinder.org.
Documentation
Check out the ReadTheDocs page for API references and examples.
Getting started
To get started, import the module and initialise it. For instance,
import trustyai
trustyai.init()
If the dependencies are not in the default dep sub-directory, or you want to use a custom classpath, you can specify it with:
import trustyai
trustyai.init(path="/foo/bar/explainability-core-2.0.0-SNAPSHOT.jar")
To fetch all of the project's dependencies, run the deps.sh script; the dependencies will be stored locally under ./dep.
trustyai.init() must be the very first call, before any other TrustyAI method. After that, all other methods can be used, as shown in the examples.
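The init-before-use contract can be pictured with a small, purely illustrative sketch in plain Python (this is not TrustyAI's actual implementation; init and explain below are hypothetical stand-ins):

```python
# Hypothetical sketch of the init-before-use contract
# (illustrative only -- not TrustyAI's actual implementation).
_initialized = False

def init():
    """Stand-in for trustyai.init(): in TrustyAI this starts the JVM."""
    global _initialized
    _initialized = True

def explain(model):
    """Stand-in for any later TrustyAI call."""
    if not _initialized:
        raise RuntimeError("call init() first")
    return f"explaining {model}"

init()                        # must come first
result = explain("my-model")  # safe only after init()
```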
Writing your model in Python
To code a model in Python, write it as a function that takes a (Python) list of PredictionInput objects and returns a (Python) list of PredictionOutput objects. This function is then passed as an argument to the Python PredictionProvider, which takes care of wrapping it in a Java CompletableFuture for you. For instance,
from trustyai.model import Model

def myModelFunction(inputs):
    # do something with the inputs
    output = [predictionOutput1, predictionOutput2]
    return output

model = Model(myModelFunction)

inputs = [predictionInput1, predictionInput2]

prediction = model.predictAsync(inputs).get()
For a working example, see the sumSkipModel in the LIME tests.
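The wrapping pattern described above can be mimicked in plain Python. The sketch below uses concurrent.futures.Future as a rough analogue of Java's CompletableFuture; ToyModel and my_model_function are illustrative names, not the real TrustyAI API:

```python
from concurrent.futures import Future

class ToyModel:
    """Illustrative stand-in for trustyai.model.Model (not the real API)."""

    def __init__(self, predict_fn):
        self._predict_fn = predict_fn

    def predictAsync(self, inputs):
        # Mirrors PredictionProvider wrapping the call in a future;
        # Python Futures expose .result() where Java's
        # CompletableFuture exposes .get().
        future = Future()
        future.set_result(self._predict_fn(inputs))
        return future

def my_model_function(inputs):
    # Toy "model": sum the features of each input.
    return [sum(features) for features in inputs]

model = ToyModel(my_model_function)
prediction = model.predictAsync([[1, 2], [3, 4]]).result()
```

The key design point this mirrors is that the prediction function itself stays synchronous; the asynchrony is added by the wrapper.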
Examples
You can look at the tests for working examples.
There are also Jupyter notebooks available.
Contributing
To install trustyai for local development, use:
$ cd scripts
$ ./build.sh
This will compile the necessary Java libraries and install the TrustyAI Python package locally.