Holistic AI Library
The Holistic AI library is an open-source tool to assess and improve the trustworthiness of AI systems.
Currently, the library offers a set of techniques to easily measure and mitigate Bias across numerous tasks. In the future, it will be extended to include tools for Efficacy, Robustness, Privacy and Explainability as well. This will allow a holistic assessment of AI systems.
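As a sketch of the kind of quantity such bias metrics capture, the snippet below computes the statistical parity difference (the gap in favorable-outcome rates between two protected groups) in plain Python. The function name and toy data are illustrative only, not the library's API; see the documentation for the actual metric functions.

```python
# Illustrative sketch (NOT the holisticai API): statistical parity
# difference, a common group-fairness metric — the difference between
# the favorable-outcome rates of two protected groups.

def statistical_parity_difference(group_a, group_b, y_pred):
    """group_a / group_b: 0/1 group-membership flags; y_pred: binary predictions."""
    rate_a = sum(p for g, p in zip(group_a, y_pred) if g) / sum(group_a)
    rate_b = sum(p for g, p in zip(group_b, y_pred) if g) / sum(group_b)
    return rate_a - rate_b

# Toy data: group A receives the favorable outcome less often than group B.
group_a = [1, 1, 1, 1, 0, 0, 0, 0]
group_b = [0, 0, 0, 0, 1, 1, 1, 1]
y_pred  = [1, 0, 0, 0, 1, 1, 1, 0]

print(statistical_parity_difference(group_a, group_b, y_pred))  # -0.5
```

A value of 0 indicates parity between the groups; here the negative value reflects that group A is favored less often than group B.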
- Documentation: https://holistic-ai.readthedocs.io/en/latest/
- Tutorials: https://github.com/holistic-ai/holisticai/tree/main/tutorials
- Source code: https://github.com/holistic-ai/holisticai/tree/main
- Holistic AI website: https://holisticai.com
Installation:
For bias metrics alone, the default installation is sufficient; optional extras add mitigation and explainability support:
pip install holisticai # basic installation
pip install holisticai[bias] # bias mitigation support
pip install holisticai[explainability] # for explainability metrics and plots
pip install holisticai[all] # install all packages for bias and explainability
Troubleshooting
On macOS, some packages may need to be installed before installing the holisticai library:
brew install cbc pkg-config
python -m pip install cylp
brew install cmake
Explainability Visualization Tools
Install Graphviz (on Debian/Ubuntu):
sudo apt-get update
sudo apt-get install graphviz
Contributing
We welcome contributions to the Holistic AI library. If you are interested in contributing, please refer to our contributing guide.
Hashes for holisticai-0.7.3-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | dceab73b6886d5fc92353fdd008e17ccbaf180567bbf6edd3f4989f5fd2694bb
MD5 | 5dff9f5b09af3874433103a88588be72
BLAKE2b-256 | deb8efd449564b43a2b2f8f6cd20246b432d35ca435b8893d2c0ed24478f0646