fastinference
A collection of inference modules for fastai including inference speedup and interpretability
Install
We have 4 modules you can install, depending on your use case:
- Base library (just the inference stuff): `pip install fastinference`
- ONNX inference: `pip install fastinference[onnx-cpu]` or `pip install fastinference[onnx-gpu]`
- Interpretability (Class Confusion + SHAP): `pip install fastinference[interp]`
- Everything: `pip install fastinference[all]`
Wonderful Contributors:
Directions for Contributing:
- Fork this repository into your GitHub account
- Ensure that `nbdev` is installed on your system
- Make any changes, and ensure that you run the following before committing:
  - `nbdev_build_lib`
  - `nbdev_clean_nbs`
- Open a Pull Request with the library, and choose "From fork" to open one with the main repository.
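The contributing steps above can be collected into a rough shell workflow. This is only a sketch: the fork URL and branch name are placeholders, and it assumes the `nbdev` v1-era commands named above are on your `PATH`.

```shell
# Clone your fork (placeholder URL) and enter it
git clone https://github.com/<your-username>/fastinference.git
cd fastinference

# Make sure nbdev is available
pip install nbdev

# ...edit the notebooks...

# Regenerate the library and clean notebook metadata before committing
nbdev_build_lib
nbdev_clean_nbs

# Commit and push to your fork, then open the PR on GitHub
git add -A
git commit -m "Describe your change"
git push origin master
```

Running `nbdev_build_lib` keeps the exported Python modules in sync with the notebooks, and `nbdev_clean_nbs` strips execution metadata so diffs stay reviewable.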