bpm-ai-inference
Inference and server for local AI implementations of bpm-ai-core abstractions. An extension to the bpm-ai project for local AI inference.
Install platform-specific dependencies
Linux
Install PyTorch (with CUDA GPU support) and spaCy:
$ pip install -r requirements.txt
or CPU-only:
$ pip install -r requirements.linux-cpu.txt
Apple Silicon:
$ pip install -r requirements.apple.txt
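After installing the requirements for your platform, a quick sanity check can confirm that PyTorch and spaCy are importable and whether the expected accelerator is available (a sketch; the exact versions come from the requirements files above):

```shell
# Check whether PyTorch was installed with CUDA support
# (prints True on a CUDA build, False on the CPU-only install)
python -c "import torch; print(torch.cuda.is_available())"

# Confirm spaCy imports cleanly and report its version
python -c "import spacy; print(spacy.__version__)"
```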
Install from PyPI
$ pip install bpm-ai-inference
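Once installed, the package should import without errors. A minimal check (the module name `bpm_ai_inference` is assumed here from the distribution name):

```shell
# Show the installed distribution's metadata
pip show bpm-ai-inference

# Verify the module imports (module name assumed from the package name)
python -c "import bpm_ai_inference"
```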
License
This project is developed under
due to dependency licenses.
Sponsors and Customers