Python-based Machine Learning Interface
mlipy
Pure Python-based Machine Learning Interface for multiple engines with multi-modal support.
Python HTTP Server/Client (including WebSocket streaming support) for engines such as llama.cpp.
Prerequisites
Debian/Ubuntu
sudo apt update -y
sudo apt install build-essential git curl libssl-dev libffi-dev pkg-config
Python
- Install Python from the distribution's default repository:
sudo apt install python3.11 python3.11-dev python3.11-venv
- Install Python from the deadsnakes PPA (external repository):
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update -y
sudo apt install python3.11 python3.11-dev python3.11-venv
llama.cpp
cd ~
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
make -j
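The name of the binary produced by the build varies across llama.cpp versions (`main` in older checkouts, `llama-cli` in newer ones, sometimes under `build/bin/`). A small helper sketch, not part of mlipy, to confirm the build produced a CLI binary:

```python
from pathlib import Path
from typing import Optional

def find_llama_binary(repo: Path) -> Optional[Path]:
    """Return the first llama.cpp CLI binary found under repo, or None."""
    for name in ("main", "llama-cli", "build/bin/llama-cli"):
        candidate = repo / name
        if candidate.is_file():
            return candidate
    return None

# After `make -j` finishes, this should print the path to the built binary
print(find_llama_binary(Path.home() / "llama.cpp"))
```

If this prints None, check the build output for errors and the binary name for your llama.cpp revision.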
Run Development Server
Setup virtualenv and install requirements:
git clone https://github.com/mtasic85/mlipy.git
cd mlipy
python3.11 -m venv venv
source venv/bin/activate
pip install poetry
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
poetry install
Run server:
python -B -m mli.server --llama-cpp-path='~/llama.cpp'
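The listening host and port depend on the server's configuration and are not shown here; assuming local values (the host and port below are placeholders), a plain TCP connect is enough to confirm the server is up:

```python
import socket

def is_listening(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical values: adjust host/port to match your server's config
print(is_listening("127.0.0.1", 5000))
```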
Run Examples
Using GPU:
NGL=99 python -B examples/sync_demo.py
Using CPU:
python -B examples/sync_demo.py
python -B examples/async_demo.py
python -B examples/langchain_sync_demo.py
python -B examples/langchain_async_demo.py
Run Production Server
Generate a self-signed SSL certificate:
openssl req -x509 -nodes -newkey rsa:4096 -keyout key.pem -out cert.pem -days 365
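As written, `openssl req` prompts interactively for the certificate fields; adding `-subj` makes it non-interactive (the `/CN=localhost` subject below is an example value), and `openssl x509` confirms the result parses:

```shell
# Same command as above, made non-interactive with -subj
openssl req -x509 -nodes -newkey rsa:4096 -keyout key.pem -out cert.pem -days 365 -subj "/CN=localhost"
# Sanity-check: print the new certificate's subject and expiry date
openssl x509 -in cert.pem -noout -subject -enddate
```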
Run
python3.11 -m venv venv
source venv/bin/activate
pip install -U mlipy
python -B -m mli.server
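To keep the production server running across restarts, a systemd unit is one common option. This is a sketch only; the user, paths, and working directory below are placeholders, not values from mlipy:

```ini
# /etc/systemd/system/mlipy.service  (hypothetical path and values)
[Unit]
Description=mlipy server
After=network.target

[Service]
# Placeholder user and paths; point WorkingDirectory at your mlipy checkout
User=mlipy
WorkingDirectory=/opt/mlipy
ExecStart=/opt/mlipy/venv/bin/python -B -m mli.server
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable --now mlipy`.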
Latest release: mlipy 0.1.56 (source distribution and pure-Python wheel available on PyPI).