
llama-toolchain


This repo contains the API specifications for various components of the Llama Stack, as well as implementations of some of those APIs, such as model inference.

The Llama Stack consists of toolchain-apis and agentic-apis. This repo contains the toolchain-apis.

Installation

You can install this repository as a package with pip:

pip install llama-toolchain
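To confirm the install, here is a minimal sketch using only the Python standard library (it assumes the package is published under the distribution name llama-toolchain, as in the pip command above):

```python
# Check whether llama-toolchain is installed, and which version,
# using only the standard library (Python 3.8+).
from importlib.metadata import version, PackageNotFoundError

def installed_version(dist_name):
    """Return the installed version of dist_name, or None if it is absent."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None

if __name__ == "__main__":
    print(installed_version("llama-toolchain") or "llama-toolchain is not installed")
```
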

If you want to install from source:

mkdir -p ~/local
cd ~/local
git clone git@github.com:meta-llama/llama-toolchain.git

conda create -n toolchain python=3.10
conda activate toolchain

cd llama-toolchain
pip install -e .

The Llama CLI

The llama CLI makes it easy to configure and run the Llama toolchain. Read the CLI reference for details.

Appendix: Running FP8

If you want to run FP8, you need the fbgemm-gpu package, which requires torch >= 2.4.0 (currently available only in nightly builds, with a stable release expected shortly).

ENV=fp8_env
conda create -n $ENV python=3.10
conda activate $ENV

pip3 install -r fp8_requirements.txt
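Before installing, you can check that your torch build meets the >= 2.4.0 floor. The helper below is an illustrative sketch, not a full PEP 440 parser; it assumes simple dotted version strings, optionally with a dev or local suffix such as "2.5.0.dev20240801+cu121":

```python
# Illustrative check that an installed torch version satisfies the
# >= 2.4.0 floor required by fbgemm-gpu. Compares only the first three
# numeric components of the version string.
def version_tuple(v):
    core = v.split("+")[0]  # drop any local suffix, e.g. "+cu121"
    return tuple(int(part) for part in core.split(".")[:3])

def meets_torch_floor(installed, required="2.4.0"):
    return version_tuple(installed) >= version_tuple(required)

if __name__ == "__main__":
    try:
        import torch  # may be a nightly build
        print(meets_torch_floor(torch.__version__))
    except ImportError:
        print("torch is not installed")
```
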

Download files

Source Distribution

llama_cmdline-0.0.1rc1.tar.gz (31.3 kB)

Built Distribution

llama_cmdline-0.0.1rc1-py3-none-any.whl (51.4 kB)
