
Exa - Pytorch

Project description


Exa

Ultra-optimized fast inference library for running exascale LLMs locally on modern consumer-class GPUs.

Principles

  • Radical Simplicity (Use super-powerful LLMs with as little code as possible)
  • Ultra-Optimized (High-performance classes that extract all the power from these LLMs)
  • Fluidity & Shapelessness (Plug and play, and re-architect as you please)

🤝 Schedule a 1-on-1 Session

Book a 1-on-1 Session with Kye, the Creator, to discuss any issues, provide feedback, or explore how we can improve Exa for you.


📦 Installation 📦

You can install the package using pip:

pip install exxa

Usage

Inference

from exa import Inference

# Load the model with quantization enabled so it fits on a consumer GPU
model = Inference(
    model_id="georgesung/llama2_7b_chat_uncensored",
    quantized=True,
)

# Generate a completion for the prompt and print it
output = model.run("What is your name?")
print(output)

GPTQ Inference

from exa import GPTQInference

model_id = "facebook/opt-125m"
model = GPTQInference(model_id=model_id, max_length=400)

prompt = "in a land far far away"
result = model.run(prompt)
print(result)
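
The README does not say how GPTQInference works internally. Assuming it wraps the GPTQ integration in Hugging Face transformers/optimum, loading an already-quantized checkpoint directly would look roughly like the sketch below (the TheBloke/Llama-2-7B-Chat-GPTQ repository and the generation call are illustrative assumptions, not part of Exa's API):

# Hypothetical plain-transformers equivalent; assumes optimum and auto-gptq are installed
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Llama-2-7B-Chat-GPTQ"  # an already GPTQ-quantized checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("in a land far far away", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_length=400)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))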

Quantize

from exa import Quantize

# Quantize the model to 8 bits, offloading fp32 modules to the CPU when needed
quantize = Quantize(
    model_id="bigscience/bloom-1b7",
    bits=8,
    enable_fp32_cpu_offload=True,
)

quantize.load_model()

# Push the quantized model to the Hugging Face Hub, then load it back
quantize.push_to_hub("my-model")
quantize.load_from_hub("my-model")
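
For context, the bits=8 and enable_fp32_cpu_offload=True options read like a thin wrapper around the bitsandbytes integration in Hugging Face transformers. A hedged sketch of the same 8-bit load done directly (an assumption about how Quantize maps onto BitsAndBytesConfig, not Exa's documented behavior) would be:

# Direct 8-bit load with transformers + bitsandbytes (illustrative, not Exa's API)
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_8bit=True,
    llm_int8_enable_fp32_cpu_offload=True,  # keep overflow modules in fp32 on the CPU
)

model = AutoModelForCausalLM.from_pretrained(
    "bigscience/bloom-1b7",
    quantization_config=bnb_config,
    device_map="auto",
)

# Pushing the result to the Hugging Face Hub is then a standard transformers call
model.push_to_hub("my-model")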

🎉 Features 🎉

  • World-Class Quantization: Get the most out of your models with top-tier performance and preserved accuracy! 🏋️‍♂️

  • Automated PEFT: Simplify your workflow! Let our toolkit handle the optimizations. 🛠️

  • LoRA Configuration: Dive into the potential of flexible LoRA configurations, a game-changer for performance (see the sketch after this list)! 🌌

  • Seamless Integration: Designed to work seamlessly with popular models like LLAMA, Falcon, and more! 🤖
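
The LoRA bullet above is abstract, so here is a hedged illustration of what a LoRA configuration typically involves, written against Hugging Face's peft library rather than any Exa-specific API (the model name, target modules, and hyperparameters are assumptions chosen for the example):

from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Load a small base model to adapt (illustrative choice, not part of Exa)
base = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

lora_config = LoraConfig(
    r=8,                                   # rank of the low-rank update matrices
    lora_alpha=32,                         # scaling factor applied to the update
    target_modules=["q_proj", "v_proj"],   # attention projections to adapt in OPT
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)

# Wrap the base model; only the LoRA weights remain trainable
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()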


💌 Feedback & Contributions 💌

We're excited about the journey ahead and would love to have you with us! For feedback, suggestions, or contributions, feel free to open an issue or a pull request. Let's shape the future of fine-tuning together! 🌱


License

MIT



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

exxa-0.1.7.tar.gz (10.2 kB)

Uploaded Source

Built Distribution

exxa-0.1.7-py3-none-any.whl (11.1 kB)

Uploaded Python 3

File details

Details for the file exxa-0.1.7.tar.gz.

File metadata

  • Download URL: exxa-0.1.7.tar.gz
  • Upload date:
  • Size: 10.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.2 CPython/3.11.0 Darwin/22.4.0

File hashes

Hashes for exxa-0.1.7.tar.gz
Algorithm Hash digest
SHA256 e57cff9bc54324b144be501b623b78b5846969e4d24534272030dcf402c8e3fa
MD5 cea362fe0b114214411810f2cf7c8bee
BLAKE2b-256 46fb1f3b356dbb5b0bd5bffbd7b720b9327df952eb91577d8221c638a5fd56d7

See more details on using hashes here.

File details

Details for the file exxa-0.1.7-py3-none-any.whl.

File metadata

  • Download URL: exxa-0.1.7-py3-none-any.whl
  • Upload date:
  • Size: 11.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.2 CPython/3.11.0 Darwin/22.4.0

File hashes

Hashes for exxa-0.1.7-py3-none-any.whl
Algorithm Hash digest
SHA256 834906e4e925955adcf4e24d586a69ca8dae794e58423941dc12b8bafc8b54eb
MD5 8cf856858bb4b14be1903930fce04756
BLAKE2b-256 a5cfcb0a3389854c3c8981b3516b50019abd81467933b20810f02f1025bd5e43

See more details on using hashes here.
