GBA Model Toolkit for MLX

Project description

Introduction

Welcome to the GreenBitAI (GBA) Model Toolkit for MLX! This Python package converts GreenBitAI's low-bit language models (LLMs) into an MLX-compatible format and provides scripts for generation, model loading, and other essential tasks for GBA quantized models. It is designed to ease the integration and deployment of GBA models within the MLX ecosystem, enabling efficient execution on a variety of platforms, with particular optimizations for Apple devices to support local inference and natural language content generation.

Features

This toolkit represents a significant step forward in the usability of GreenBitAI models, making it simpler for developers and researchers to incorporate these models into their MLX-based projects.

  • Conversion: Use gba2mlx.py (invoked as python -m gbx_lm.gba2mlx) to convert models from GBA format to an MLX-compatible format, ensuring smooth integration and optimal performance.
  • Generation: Scripts for generating natural language content with GBA quantized models within the MLX environment, letting users apply the capabilities of GBA models to content creation.

Installation

To get started with this package, simply run:

pip install gbx-lm

or clone the repository and install the required dependencies (for Python >= 3.9):

git clone https://github.com/GreenBitAI/gbx-lm.git
pip install -r requirements.txt

Alternatively, you can use the provided conda environment configuration:

conda env create -f environment.yml
conda activate gbai_mlx_lm
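
To confirm the installation, a quick sanity check is to import the package from the command line (this assumes nothing beyond the gbx_lm module used in the commands below being importable):

python -c "import gbx_lm"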

Usage

Converting Models

To convert one of GreenBitAI's low-bit LLMs to the MLX format, run:

python -m gbx_lm.gba2mlx --hf-path <input file path or a Hugging Face repo> --mlx-path <output file path> --hf-token <your huggingface token> --upload-repo <a Hugging Face repo name>

Generating Content

To generate natural language content using a converted model:

python -m gbx_lm.generate --model <path to a converted model or a Hugging Face repo name>
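
Converted models can also be used programmatically. The snippet below is a minimal sketch that assumes gbx_lm exposes load and generate helpers analogous to those in mlx-lm; the exact names and signatures may differ between releases, so check the installed version's API.

# Minimal sketch, assuming gbx_lm mirrors mlx-lm's load/generate helpers.
from gbx_lm import load, generate

# Load a converted model from a local path or a Hugging Face repo name.
model, tokenizer = load("GreenBitAI/yi-6b-chat-w4a16g128-mlx")

# Generate a completion for a prompt.
text = generate(model, tokenizer, prompt="calculate 4*8+1024=", max_tokens=100)
print(text)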

Requirements

  • Python >= 3.9
  • See requirements.txt or environment.yml for a complete list of dependencies

Examples

In this example, the pretrained 4-bit model "yi-6b-chat-w4a16g128" is downloaded from GreenBitAI's Hugging Face repository, converted into an MLX-compatible format, and saved in the local directory "yi-6b-chat-w4a16g128-mlx". The optional "--upload-repo" parameter takes a Hugging Face repo with valid write permissions; when it is provided, the locally converted model is uploaded directly to that repo.

python -m gbx_lm.gba2mlx --hf-path GreenBitAI/yi-6b-chat-w4a16g128 --mlx-path yi-6b-chat-w4a16g128-mlx/ --hf-token <your huggingface token> --upload-repo GreenBitAI/yi-6b-chat-w4a16g128-mlx
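
As a quick check that the conversion produced output, you can list the contents of the target directory with Python's standard library (the exact file names depend on the model and toolkit version):

from pathlib import Path

# List the files written by the conversion step.
out_dir = Path("yi-6b-chat-w4a16g128-mlx")
print(sorted(p.name for p in out_dir.iterdir()))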

The following command downloads the converted model (if it is not already available locally) and runs it to generate natural language content from a user-provided prompt:

python -m gbx_lm.generate --model GreenBitAI/yi-6b-chat-w4a16g128-mlx  --max-tokens 100 --prompt "calculate 4*8+1024=" --eos-token '<|im_end|>'

License

The original code was released under its respective license and copyrights, i.e.:

Download files

Download the file for your platform.

Source Distribution

gbx-lm-0.1.2.tar.gz (24.8 kB)

Uploaded Source

Built Distribution

gbx_lm-0.1.2-py3-none-any.whl (29.9 kB)

Uploaded Python 3

File details

Details for the file gbx-lm-0.1.2.tar.gz.

File metadata

  • Download URL: gbx-lm-0.1.2.tar.gz
  • Upload date:
  • Size: 24.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.9.18

File hashes

Hashes for gbx-lm-0.1.2.tar.gz

  • SHA256: df0c8375f531146e6e556862d5baed7d8648389943c66b0c02538a3cff0ac839
  • MD5: 55d89f7508707c80a7d11151e45b257a
  • BLAKE2b-256: 957bd433ea2a6b3965d9b0f604066d5ab3c5c65aba422210384198d7c0ce7977

File details

Details for the file gbx_lm-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: gbx_lm-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 29.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.9.18

File hashes

Hashes for gbx_lm-0.1.2-py3-none-any.whl

  • SHA256: a5257f19c062435fd484575b5e45d0dc57ad63192d1480dc7b733ddd3d5a77bd
  • MD5: 7b5bcbf0fb287703b08d067bc058488a
  • BLAKE2b-256: 0a21ea24d7e0731fae09f46a8b0d8f881c7c3a3191d6806d968015d894458c0e
