Andromeda - Pytorch

Project description

Andromeda: Ultra-Fast and Ultra-Intelligent SOTA Language Model 🚀🌌

Welcome to Andromeda, the fastest, most creative, and most reliable language model ever built. Train your own version, run inference, and finetune it with simple plug-and-play scripts. Get started in seconds:

Features

  • 💼 Handle Ultra Long Sequences (32,000-200,000+ context lengths)
  • ⚡ Ultra Fast Processing (32,000+ tokens in under 100ms)
  • 🎓 Superior Reasoning Capabilities

🎯 Principles

  • Efficiency: Optimized with techniques like flash attention, rotary position embeddings, and deep normalization (a rotary-embedding sketch follows this list).
  • Flexibility: Adapt to various tasks and domains for wide applications.
  • Scalability: Designed to scale with resources and data sizes.
  • Community-Driven: Thrives on contributions from the open-source community.
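
As a rough illustration of one of the techniques named above, here is a minimal, self-contained sketch of rotary position embeddings (RoPE). It is a generic rotate-half variant, not Andromeda's internal implementation:

import torch

# Rotate pairs of feature dimensions by a position-dependent angle so that
# attention scores depend on relative position.
def rotary_embed(x: torch.Tensor) -> torch.Tensor:
    b, n, d = x.shape                      # (batch, seq_len, dim), dim even
    half = d // 2
    freqs = 1.0 / (10000 ** (torch.arange(half, dtype=torch.float32) / half))
    angles = torch.arange(n, dtype=torch.float32)[:, None] * freqs[None, :]
    cos, sin = angles.cos(), angles.sin()  # each (seq_len, half)
    x1, x2 = x[..., :half], x[..., half:]
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)

x = torch.randn(1, 16, 64)
print(rotary_embed(x).shape)  # torch.Size([1, 16, 64])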

💻 Install

python3.11 -m pip install --upgrade andromeda-torch

Usage

  • Forward pass with random inputs
import torch

from andromeda_torch.configs import Andromeda1Billion

model = Andromeda1Billion()

# Random token ids of shape (batch, seq_len); requires a CUDA device
x = torch.randint(0, 256, (1, 1024)).cuda()

out = model(x)  # logits of shape (1, 1024, 20000), i.e. (batch, seq_len, vocab_size)
print(out)
  • Tokenized inputs
from andromeda_torch import Tokenizer
from andromeda_torch.configs import Andromeda1Billion

model = Andromeda1Billion()
tokenizer = Tokenizer()

# Encode a prompt into token ids, then run the forward pass
encoded_text = tokenizer.encode("Hello world!")
out = model(encoded_text)
print(out)
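
Assuming the output is a logits tensor of shape (batch, seq_len, vocab_size), as the shape comment in the first example indicates, greedy next-token selection is a one-liner in plain PyTorch (this is a sketch, not an Andromeda API):

import torch

# `out` is the logits tensor from the forward pass above
next_token_logits = out[:, -1, :]                     # logits at the final position
next_token = torch.argmax(next_token_logits, dim=-1)  # greedy pick over the vocab
print(next_token)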

📚 Training

  1. Set the environment variables (a sketch of how they are consumed follows these steps):

    • ENTITY_NAME: Your wandb project name
    • OUTPUT_DIR: Directory to save the weights (e.g., ./weights)
    • MASTER_ADDR: Address of the master node for distributed training
    • MASTER_PORT: Port of the master node for distributed training
    • RANK: Rank of the current node
    • WORLD_SIZE: Total number of processes (GPUs)
  2. Configure the training:

    • Run accelerate config
    • Enable DeepSpeed stage 3
    • Launch with accelerate launch train_distributed_accelerate.py
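
As a rough, hypothetical illustration (not the project's actual training script), the distributed variables above follow the convention that torch.distributed uses when initializing from the environment:

import os
import torch.distributed as dist

# With MASTER_ADDR, MASTER_PORT, RANK, and WORLD_SIZE exported,
# init_method="env://" reads them from the environment.
dist.init_process_group(backend="nccl", init_method="env://")
print(f"rank {dist.get_rank()} of {dist.get_world_size()} initialized")

output_dir = os.environ.get("OUTPUT_DIR", "./weights")  # where checkpoints are saved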

For more information, refer to the Training SOP.


Todo

  • Add Yarn Embeddings from zeta

📈 Benchmarks

Speed

  • Andromeda is built on one of the most reliable attention implementations available, Flash Attention 2.0 with Triton kernels. It consumes 50x less memory than GPT-3 and 10x less than LLaMA.

  • We can speed this up even more with dynamic sparse flash attention 2.0.
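
For a feel of what a fused flash-attention kernel looks like from PyTorch, here is a generic illustration (not Andromeda's Triton implementation): scaled_dot_product_attention dispatches to a FlashAttention-style kernel on supported GPUs.

import torch
import torch.nn.functional as F

# (batch, heads, seq_len, head_dim) in half precision on a CUDA device
q = torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.float16)

# The fused kernel avoids materializing the full (seq_len x seq_len) score matrix
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
print(out.shape)  # torch.Size([1, 8, 1024, 64])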

License

Apache License

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

andromeda_torch-0.0.6.tar.gz (96.4 kB, Source)

Built Distribution

andromeda_torch-0.0.6-py3-none-any.whl (55.8 kB, Python 3)

File details

Details for the file andromeda_torch-0.0.6.tar.gz.

File metadata

  • Download URL: andromeda_torch-0.0.6.tar.gz
  • Upload date:
  • Size: 96.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.2 CPython/3.11.0 Darwin/22.4.0

File hashes

Hashes for andromeda_torch-0.0.6.tar.gz

  • SHA256: 3db468e86bd5ccf9b9558d512558ed77209a2e7c56ac9bc89ba010c001b54433
  • MD5: 814a48ab472e8d884a14c02c2e57be06
  • BLAKE2b-256: 9a94ba1f0cf1e83046bfc50f32b0deb1f46470abdecc1df0554bc9a6d6fcb484

See more details on using hashes here.
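
For example, a downloaded file can be checked against the SHA256 digest above with Python's standard hashlib (the digest below is copied from the table):

import hashlib

expected = "3db468e86bd5ccf9b9558d512558ed77209a2e7c56ac9bc89ba010c001b54433"
with open("andromeda_torch-0.0.6.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
assert digest == expected, "hash mismatch: do not install this file"
print("sha256 verified")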

File details

Details for the file andromeda_torch-0.0.6-py3-none-any.whl.

File metadata

  • Download URL: andromeda_torch-0.0.6-py3-none-any.whl
  • Upload date:
  • Size: 55.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.2 CPython/3.11.0 Darwin/22.4.0

File hashes

Hashes for andromeda_torch-0.0.6-py3-none-any.whl

  • SHA256: 5db7197c592f5504321cbe8939c52d09f268e05ed18cd5b4aaef1ae61fbf8006
  • MD5: 456dfd754693b6888a923df538f4e953
  • BLAKE2b-256: 6dca2870be26c7b3445d1b57dcf6306c321ab44b13c86d4ca0b470bf30fa03b9

See more details on using hashes here.
