Project description
fastinference
A collection of inference modules for fastai2, including inference speed-ups and interpretability tools.
Install
There are four install options, depending on your use case:
- Base Library (Just inference stuff):
pip install fastinference
- ONNX Inference:
pip install fastinference[onnx-cpu]
or pip install fastinference[onnx-gpu]
- Interpretability (Class Confusion + SHAP):
pip install fastinference[interp]
- Everything:
pip install fastinference[all]
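To confirm which optional extras actually landed in an environment, the presence of their dependencies can be checked at runtime. A minimal sketch, assuming onnxruntime is the module pulled in by the ONNX extras (check the project's setup for the exact dependency names):

```python
import importlib.util

def has_module(name):
    """Return True if a module is importable without importing it."""
    return importlib.util.find_spec(name) is not None

# onnxruntime is assumed to back the onnx-cpu / onnx-gpu extras
print("onnxruntime available:", has_module("onnxruntime"))
```

This avoids a try/except import at startup and lets code fall back to plain PyTorch inference when an extra is missing.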
Wonderful Contributors:
Directions for Contributing:
- Fork this repository into your GitHub Account
- Ensure that nbdev is installed on your system
- Make any changes and ensure that you run the following before committing:
nbdev_build_lib
nbdev_clean_nbs
- Open a Pull Request from your fork against the main repository.
Project details
Download files
Download the file for your platform.
Source Distribution
fastinference-0.0.21.tar.gz (29.2 kB)
Built Distribution
fastinference-0.0.21-py3-none-any.whl
Hashes for fastinference-0.0.21-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | d5ce4b3b2123f58389ffc09b0fde5e1efa30e19d6882926ae3e6bf79f9721ef9
MD5 | cc6e3c35aac7a4a82f582fb8fa9ce7bf
BLAKE2b-256 | 63c723458e22b3c8c2db06fc0d45e1abe9ea6018d5c77105db049d8957672b71