bpm-ai-inference
Inference and server for local AI implementations of bpm-ai-core abstractions.
An extension to the bpm-ai project for local AI inference.
Install platform-specific dependencies
Linux
Install PyTorch (with CUDA GPU support) and spaCy:
$ pip install -r requirements.txt
or CPU-only:
$ pip install -r requirements.linux-cpu.txt
Apple Silicon:
$ pip install -r requirements.apple.txt
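After installing the platform-specific PyTorch build, a quick way to confirm which accelerator was actually picked up is to query torch directly. This is a generic sketch, not part of the bpm-ai-inference API; the import guard keeps it safe on machines where torch is not installed:

```python
def best_device() -> str:
    """Report the best available PyTorch device ("cuda", "mps", or "cpu")."""
    try:
        import torch
    except ImportError:
        # torch not installed yet; fall back to CPU
        return "cpu"
    if torch.cuda.is_available():
        return "cuda"  # NVIDIA GPU via CUDA (Linux requirements.txt path)
    mps = getattr(torch.backends, "mps", None)
    if mps is not None and mps.is_available():
        return "mps"   # Apple Silicon GPU (requirements.apple.txt path)
    return "cpu"

print(best_device())
```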
Install from PyPI
$ pip install bpm-ai-inference
License
This project is developed under
due to dependency licenses.
Sponsors and Customers
Download files
Source Distribution
bpm_ai_inference-0.3.5.tar.gz (32.1 kB)
Built Distribution
Hashes for bpm_ai_inference-0.3.5-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | d0169faacb079c407b5e0083a27530addf6ce9515a227800c89b0f41cdbf16ea
MD5 | 8dae6d97a00f049eaaef638e22f53808
BLAKE2b-256 | 4834daf2f55729c267d6dcdd351e30a7e679349156a5b18151ad8b3108f691a5
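To verify a downloaded distribution against the SHA256 digest listed above, hashing the file in chunks with the standard library is sufficient; the helper name here is illustrative:

```python
import hashlib

def sha256_hex(path: str) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large wheels don't need to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the SHA256 value in the table above:
# sha256_hex("bpm_ai_inference-0.3.5-py3-none-any.whl") == "d0169faa..."
```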