
AI Handler: An engine which wraps certain huggingface models

Project description

AI Handler


This is a simple framework for running AI models. It makes use of the huggingface API and gives you a queue, threading, a simple interface, and the ability to run Stable Diffusion and LLMs seamlessly on your local hardware.

This is not intended to be used as a standalone application.

It can easily be extended to power an interface, or it can be run from the command line.

AI Handler is a work in progress. It powers two projects at the moment, but may not be ready for general use.

Installation

This is a work in progress.

Pre-requisites

System requirements

  • Windows 10+
  • Python 3.10.8
  • pip 23.0.1
  • CUDA toolkit 11.7
  • cuDNN 8.6.0.163
  • CUDA-capable GPU
  • 16 GB+ RAM
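
A minimal sketch for sanity-checking these requirements before installing; it assumes the CUDA toolkit's nvcc binary is on your PATH (adjust the check if you installed CUDA elsewhere):

# Quick prerequisite check for the requirements listed above
import platform
import shutil
import sys

print("OS:", platform.system(), platform.release())      # expecting Windows 10+
print("Python:", sys.version.split()[0])                  # expecting 3.10.8
print("nvcc on PATH:", shutil.which("nvcc") is not None)  # CUDA toolkit 11.7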

Install

pip install torch==1.13.1 torchvision==0.14.1 torchaudio==0.13.1 --index-url https://download.pytorch.org/whl/cu117
pip install https://github.com/w4ffl35/diffusers/archive/refs/tags/v0.14.0.ckpt_fix.tar.gz
pip install https://github.com/w4ffl35/transformers/archive/refs/tags/tensor_fix-v1.0.2.tar.gz
pip install https://github.com/acpopescu/bitsandbytes/releases/download/v0.37.2-win.0/bitsandbytes-0.37.2-py3-none-any.whl
pip install aihandlerwindows
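
After installation, a quick way to confirm that the pinned torch build can see your GPU (a minimal check using standard torch calls, not part of AI Handler itself):

# Verify the CUDA-enabled torch install
import torch

print("torch:", torch.__version__)                 # expecting 1.13.1+cu117
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))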

Optional

These are optional instructions for installing TensorRT and DeepSpeed on Windows.

Install TensorRT:
  1. Download TensorRT-8.4.3.1.Windows10.x86_64.cuda-11.6.cudnn8.4
  2. Git clone TensorRT 8.4.3.1
  3. Follow their instructions to build the TensorRT-8.4.3.1 Python wheel
  4. Install TensorRT: pip install tensorrt-*.whl
Install DeepSpeed:
  1. Git clone DeepSpeed 0.8.1
  2. Follow their instructions to build the DeepSpeed Python wheel
  3. Install DeepSpeed: pip install deepspeed-*.whl
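
If you install either of these optional wheels, a small import check confirms they are usable; both names simply report "not installed" if you skipped the optional steps:

# Optional: confirm the TensorRT and DeepSpeed wheels are importable
for name in ("tensorrt", "deepspeed"):
    try:
        module = __import__(name)
        print(name, getattr(module, "__version__", "unknown version"))
    except ImportError:
        print(name, "not installed (optional)")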

Environment variables

  • AIRUNNER_ENVIRONMENT - dev or prod. Defaults to dev. This controls the default LOG_LEVEL.
  • LOG_LEVEL - FATAL for production, DEBUG for development. Override this to force a specific log level.
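
One way to set these before importing the package, shown here from Python (the variable names and values are as listed above; setting them in your shell works just as well):

# Force a production-style configuration
import os

os.environ["AIRUNNER_ENVIRONMENT"] = "prod"   # "dev" is the default
os.environ["LOG_LEVEL"] = "FATAL"             # override to force a specific log level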

Huggingface variables

Offline mode

These environment variables keep you offline until you need to download a model. This prevents unwanted online access and speeds up usage of huggingface libraries.

  • DISABLE_TELEMETRY - Keep this set to 1 at all times. Huggingface collects minimal telemetry when downloading a model from their repository, but this keeps it disabled. See more info in this github thread.
  • HF_HUB_OFFLINE - When loading a diffusers model, the huggingface libraries will attempt to download an updated cache before running the model. This prevents that check from happening (along with a boolean passed to load_pretrained; see the runner.py file for examples).
  • TRANSFORMERS_OFFLINE - Similar to HF_HUB_OFFLINE, but for transformers models.
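
A minimal sketch of setting the offline variables before any huggingface import, using the names listed above:

# Stay offline until a model download is actually needed
import os

os.environ["DISABLE_TELEMETRY"] = "1"      # keep huggingface telemetry disabled
os.environ["HF_HUB_OFFLINE"] = "1"         # skip the hub cache-update check for diffusers models
os.environ["TRANSFORMERS_OFFLINE"] = "1"   # same behaviour for transformers models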

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aihandlerwindows-1.8.18.tar.gz (40.4 kB)

Uploaded Source

Built Distribution

aihandlerwindows-1.8.18-py3-none-any.whl (42.4 kB)

Uploaded Python 3

File details

Details for the file aihandlerwindows-1.8.18.tar.gz.

File metadata

  • Download URL: aihandlerwindows-1.8.18.tar.gz
  • Upload date:
  • Size: 40.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for aihandlerwindows-1.8.18.tar.gz
  • SHA256: 66266f691d330ec990a05272aed583fc6f40101cf188b86d07fda1483032c140
  • MD5: 68c34d723f6bc1500b7fafcf612a66a7
  • BLAKE2b-256: ab65bcda22e75cc0ed81a5111cd23e57d67314a224122ed20b77a04ed1929a27

See more details on using hashes here.

File details

Details for the file aihandlerwindows-1.8.18-py3-none-any.whl.

File metadata

File hashes

Hashes for aihandlerwindows-1.8.18-py3-none-any.whl
  • SHA256: 4283ca9b4240b1f024d34989b7164c5d3174ec8ba3dac62749f1dddb561b0a88
  • MD5: 0628ebc32cabe31f21353a3e882deb9c
  • BLAKE2b-256: ccbdb11a378cf7faef1f91aea00c7ef21cfc70f3dd6434b6a80c8e607434587b

See more details on using hashes here.
