mlipy


Pure Python-based Machine Learning Interface for multiple engines with multi-modal support.

Python HTTP server/client (including WebSocket streaming support) for the llama.cpp and candle engines.
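
The server accepts generation requests over plain HTTP as well as WebSocket. A minimal client-side sketch of building such a request; the endpoint path, port, and JSON field names below are assumptions for illustration, not the documented mli API:

```python
# Hypothetical sketch of preparing a completion request for the mli server.
# URL, port, and field names are assumptions, not the actual mli API.
import json
from urllib.request import Request

def build_completion_request(prompt: str, engine: str, model: str) -> Request:
    # Serialize the generation parameters as a JSON body.
    body = json.dumps({
        "prompt": prompt,
        "engine": engine,   # e.g. "llama.cpp" or "candle"
        "model": model,     # e.g. a GGUF file name
    }).encode("utf-8")
    return Request(
        "http://127.0.0.1:5000/api/1.0/text/completions",  # assumed URL
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_completion_request("Hello", "llama.cpp", "llama-2-7b.Q4_K_M.gguf")
```

Sending it with `urllib.request.urlopen(req)` would require the development server from the sections below to be running.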

Prerequisites

Debian/Ubuntu

sudo apt update -y
sudo apt install build-essential git curl libssl-dev libffi-dev pkg-config

Rust

  1. Using the latest system repository:
sudo apt install rustc cargo
  2. Install rustup using the official instructions:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"
rustup default stable

Python

  1. Install Python from the default system repository:
sudo apt install python3.11 python3.11-dev python3.11-venv
  2. Install Python from the deadsnakes external repository:
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update -y
sudo apt install python3.11 python3.11-dev python3.11-venv

Arch/Manjaro

Rust

  1. Using latest system-wide rust/cargo:
sudo pacman -Sy base-devel openssl libffi git rust cargo rust-wasm wasm-bindgen
  2. Using the latest rustup:
sudo pacman -Sy base-devel openssl libffi git rustup
rustup default stable

macOS

brew update
brew install rustup
rustup default stable

llama.cpp

cd ~
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
make

candle

cd ~
git clone https://github.com/huggingface/candle.git
cd candle
find candle-examples -type f -exec sed -i 's/println/eprintln/g' {} +
cargo clean
cargo build -r --bins --examples
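
The sed pass above rewrites println! to eprintln! in the candle examples, so their logging goes to stderr and stdout carries only generated text. A minimal Python illustration of why that separation matters when a server wraps a CLI; a stand-in child process is used here instead of a real candle binary:

```python
# Demonstrate reading clean output from stdout while logs go to stderr.
# The child process below is a stand-in for a candle example binary.
import subprocess
import sys

child = (
    "import sys\n"
    "print('generated token')\n"                        # model output -> stdout
    "print('log: loaded weights', file=sys.stderr)\n"   # logging -> stderr
)
result = subprocess.run(
    [sys.executable, "-c", child],
    capture_output=True,
    text=True,
)
print(result.stdout.strip())  # only the generated text
print(result.stderr.strip())  # only the log line
```

Without the sed rewrite, log lines and generated tokens would be interleaved on stdout and a wrapper would have to parse them apart.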

Run Development Server

Set up a virtualenv and install requirements:

git clone https://github.com/mtasic85/mlipy.git
cd mlipy

python3.11 -m venv venv
source venv/bin/activate
pip install poetry
poetry install

Download models:

# NOTE: log in if you need to accept terms and conditions for some models
# huggingface-cli login

# phi
huggingface-cli download microsoft/phi-1_5
huggingface-cli download Open-Orca/oo-phi-1_5
huggingface-cli download lmz/candle-quantized-phi

# stable-lm
huggingface-cli download stabilityai/stablelm-3b-4e1t
huggingface-cli download lmz/candle-stablelm-3b-4e1t

# mistral
huggingface-cli download TheBloke/Mistral-7B-v0.1-GGUF mistral-7b-v0.1.Q4_K_M.gguf
huggingface-cli download TheBloke/Mistral-7B-Instruct-v0.1-GGUF mistral-7b-instruct-v0.1.Q4_K_M.gguf
huggingface-cli download TheBloke/zephyr-7B-beta-GGUF zephyr-7b-beta.Q4_K_M.gguf
huggingface-cli download TheBloke/Yarn-Mistral-7B-128k-GGUF yarn-mistral-7b-128k.Q4_K_M.gguf
huggingface-cli download lmz/candle-mistral

# llama2
# huggingface-cli download meta-llama/Llama-2-7b-hf
huggingface-cli download TheBloke/Orca-2-7B-GGUF orca-2-7b.Q4_K_M.gguf
huggingface-cli download TheBloke/Llama-2-7B-GGUF llama-2-7b.Q4_K_M.gguf
huggingface-cli download TheBloke/Llama-2-7B-Chat-GGUF llama-2-7b-chat.Q4_K_M.gguf
huggingface-cli download TheBloke/Yarn-Llama-2-7B-128K-GGUF yarn-llama-2-7b-128k.Q4_K_M.gguf
huggingface-cli download afrideva/TinyLlama-1.1B-Chat-v0.6-GGUF tinyllama-1.1b-chat-v0.6.q4_k_m.gguf
huggingface-cli download afrideva/TinyLlama-1.1B-intermediate-step-955k-token-2T-GGUF tinyllama-1.1b-intermediate-step-955k-token-2t.q4_k_m.gguf

# stable-lm
huggingface-cli download afrideva/stablelm-3b-4e1t-GGUF stablelm-3b-4e1t.q4_k_m.gguf
huggingface-cli download TheBloke/rocket-3B-GGUF rocket-3b.Q4_K_M.gguf

# code models
# huggingface-cli download codellama/CodeLlama-7b-Python-hf
huggingface-cli download TheBloke/sqlcoder-7B-GGUF sqlcoder-7b.Q4_K_M.gguf
huggingface-cli download TheBloke/deepseek-coder-1.3b-instruct-GGUF deepseek-coder-1.3b-instruct.Q4_K_M.gguf
huggingface-cli download TheBloke/deepseek-coder-6.7B-instruct-GGUF deepseek-coder-6.7b-instruct.Q4_K_M.gguf
huggingface-cli download TheBloke/tora-code-7B-v1.0-GGUF tora-code-7b-v1.0.Q4_K_M.gguf
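
huggingface-cli places these files in the Hugging Face hub cache. A small helper sketch for locating a downloaded repo on disk; the default cache location is an assumption (it can be moved with HF_HOME), while the models--{org}--{name} layout follows the documented hub cache structure:

```python
# Compute the default Hugging Face hub cache directory for a repo id.
# Default cache path is an assumption; override via hf_home if needed.
import os

def cached_repo_dir(repo_id: str, hf_home: str = "~/.cache/huggingface") -> str:
    # The hub cache stores each repo under hub/models--{org}--{name}.
    org, name = repo_id.split("/", 1)
    return os.path.join(
        os.path.expanduser(hf_home), "hub", f"models--{org}--{name}"
    )

path = cached_repo_dir("TheBloke/Llama-2-7B-GGUF")
```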

Run server:

python -B -m mli.server

Run Examples

python -B examples/sync_demo.py
python -B examples/async_demo.py
python -B examples/langchain_sync_demo.py
python -B examples/langchain_async_demo.py
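
The async demos consume completions as a stream of tokens. The consumption pattern can be sketched as follows, with an async generator standing in for the real WebSocket stream (the token source here is fabricated for illustration):

```python
# Streaming-consumption pattern: tokens arrive incrementally and are
# accumulated into the full completion. A fake async generator stands
# in for the WebSocket stream the async demos would read from.
import asyncio

async def fake_stream():
    for token in ["Hello", ", ", "world", "!"]:
        await asyncio.sleep(0)  # stand-in for network latency
        yield token

async def collect() -> str:
    chunks = []
    async for token in fake_stream():
        chunks.append(token)  # or print(token, end="") for live output
    return "".join(chunks)

text = asyncio.run(collect())
```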

Run Production Server

python3.11 -m venv venv
source venv/bin/activate
pip install -U mlipy
python -B -m mli.server

mlipy 0.1.7 is distributed on PyPI as a source archive, mlipy-0.1.7.tar.gz (10.6 kB), and a Python 3 wheel, mlipy-0.1.7-py3-none-any.whl (10.5 kB).