bpm-ai-inference
Inference and server for local AI implementations of bpm-ai-core abstractions.
Extension to the bpm-ai project for local AI inference.
Install platform-specific dependencies
Linux
Install PyTorch (with CUDA GPU support) and spaCy:
$ pip install -r requirements.txt
or CPU-only:
$ pip install -r requirements.linux-cpu.txt
Apple Silicon:
$ pip install -r requirements.apple.txt
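After installing the platform requirements, a quick sanity check can confirm that the PyTorch backend is present and whether CUDA is usable. This is a minimal sketch, not part of the project itself:

```python
# Verify the PyTorch install and report CUDA availability.
try:
    import torch
    print(f"PyTorch {torch.__version__}, CUDA available: {torch.cuda.is_available()}")
except ImportError:
    # PyTorch missing: rerun the requirements install step for your platform.
    print("PyTorch is not installed - rerun the requirements install step")
```

On the CPU-only and Apple Silicon installs, `torch.cuda.is_available()` is expected to report `False`.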
Install from PyPI
$ pip install bpm-ai-inference
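To confirm the install succeeded, you can check that the package is importable. The module name `bpm_ai_inference` is assumed here from the wheel filename; this is a sketch, not project documentation:

```python
# Check that the installed package is importable without actually importing it.
import importlib.util

spec = importlib.util.find_spec("bpm_ai_inference")
if spec is None:
    print("bpm_ai_inference not found - check your virtual environment")
else:
    print(f"bpm_ai_inference installed at {spec.origin}")
```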
License
This project is developed under a license dictated by the licenses of its dependencies.
Sponsors and Customers