Inference engine and server for local AI implementations of the bpm-ai-core abstractions.
bpm-ai-inference
Extension to the bpm-ai project for local AI inference.
Install platform-specific dependencies
Linux
Install PyTorch (with CUDA GPU support) and spaCy:
$ pip install -r requirements.txt
or CPU-only:
$ pip install -r requirements.linux-cpu.txt
Apple Silicon
$ pip install -r requirements.apple.txt
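After installing the platform requirements, a quick stdlib-only sanity check can confirm the heavyweight dependencies are importable. This is a minimal sketch, not part of the project; the import names `torch` and `spacy` are assumptions based on the packages named in the instructions above:

```python
import importlib.util

def check_installed(*packages):
    """Return a mapping of package name -> whether it is importable."""
    return {name: importlib.util.find_spec(name) is not None for name in packages}

# Import names assumed from the requirements files above.
for name, ok in check_installed("torch", "spacy").items():
    print(f"{name}: {'installed' if ok else 'MISSING'}")
```

`importlib.util.find_spec` only locates the package without importing it, so the check is fast even for large libraries like PyTorch.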
Install from PyPI
$ pip install bpm-ai-inference
License
This project is developed under … due to dependency licenses.