bpm-ai-inference

Extension to the bpm-ai project for local AI inference. Provides inference and serving for local AI implementations of the bpm-ai-core abstractions.
Install platform-specific dependencies
Linux
Install PyTorch (with CUDA GPU support) and spaCy:
$ pip install -r requirements.txt
or CPU-only:
$ pip install -r requirements.linux-cpu.txt
Apple Silicon

Install PyTorch and spaCy for Apple Silicon:
$ pip install -r requirements.apple.txt
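After installing the platform requirements above, a quick sanity check can confirm that PyTorch and spaCy are importable and which accelerator backend the installed PyTorch build can use. This is a hypothetical helper, not part of bpm-ai-inference:

```python
# Hypothetical post-install sanity check (not part of bpm-ai-inference):
# reports which PyTorch accelerator backend is usable in this environment.
import importlib.util


def available_backend() -> str:
    """Return 'cuda', 'mps', or 'cpu' depending on what PyTorch reports,
    or 'missing' if PyTorch is not installed at all."""
    if importlib.util.find_spec("torch") is None:
        return "missing"
    import torch
    if torch.cuda.is_available():
        return "cuda"
    mps = getattr(torch.backends, "mps", None)
    if mps is not None and mps.is_available():
        return "mps"
    return "cpu"


if __name__ == "__main__":
    print("PyTorch backend:", available_backend())
    print("spaCy installed:", importlib.util.find_spec("spacy") is not None)
```

On a Linux machine installed from requirements.txt this should report "cuda" (given a working GPU driver), on Apple Silicon "mps", and "cpu" for the CPU-only requirements file.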
Install from PyPI
$ pip install bpm-ai-inference
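Once installed, the package's presence and version can be verified from the standard library alone. A minimal sketch, assuming only the distribution name "bpm-ai-inference" from the install command above:

```python
# Hypothetical install check using only the standard library.
from importlib.metadata import PackageNotFoundError, version


def installed_version(dist: str = "bpm-ai-inference"):
    """Return the installed version string of a distribution, or None
    if it is not installed in the current environment."""
    try:
        return version(dist)
    except PackageNotFoundError:
        return None


if __name__ == "__main__":
    v = installed_version()
    print("bpm-ai-inference:", v if v else "not installed")
```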
License
Due to its dependencies' licenses, this project is developed under a different license than the core bpm-ai project.