
vibe-llama

vibe-llama is a set of tools designed to help developers build working, reliable applications with LlamaIndex, LlamaCloud Services, and llama-index-workflows.

This command-line tool adds the relevant context, as rules, to the coding agent of your choice (think Cursor, Claude Code, GitHub Copilot, etc.):

  1. You select a coding agent.
  2. You select the LlamaIndex service (such as LlamaCloud, the LlamaIndex framework, or the Workflows package).

Once you've made your choice, vibe-llama will generate a rule file for your coding agent. For example, if you selected Cursor, a new rule will be added to .cursor/rules. Now, all of the context and instructions about your chosen LlamaIndex service will be available to your coding agent of choice.
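The agent-to-rule-file mapping can be sketched as below. Only the Cursor location (.cursor/rules) comes from the description above; the other entries are hypothetical placeholders for illustration, not vibe-llama's confirmed behavior:

```python
from pathlib import Path

# Where a generated rule file might land for each agent. Only the Cursor
# entry (.cursor/rules) is stated in the text above; the others are
# hypothetical placeholders.
RULE_LOCATIONS = {
    "Cursor": Path(".cursor/rules"),
    "Claude Code": Path("CLAUDE.md"),                    # hypothetical
    "GitHub Copilot": Path(".github/copilot-rules.md"),  # hypothetical
}

def rule_path(agent: str) -> Path:
    """Look up where the rule file for a given agent would be written."""
    return RULE_LOCATIONS[agent]

print(rule_path("Cursor"))  # .cursor/rules
```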

Installation

User settings

You can install and run vibe-llama using uv:

uvx vibe-llama@latest --help

Or you can install it with pip first and run it later:

pip install vibe-llama

Developer settings

Clone the GitHub repository:

git clone https://github.com/run-llama/vibe-llama
cd vibe-llama

Build and install the project:

uv build

For regular installation:

uv pip install dist/*.whl

For editable installation (development):

# Activate virtual environment first
uv venv
source .venv/bin/activate  # On Unix/macOS

# Then install in editable mode
uv pip install -e .

Usage

vibe-llama is a CLI command, and has the following subcommands:

starter

starter provides your coding agents with up-to-date documentation about LlamaIndex, LlamaCloud Services and llama-index-workflows, so that they can build reliable, working applications! You can launch a terminal user interface by running vibe-llama starter and select your desired coding agents and services from there, or you can pass your agent (-a, --agent flag) and chosen service (-s, --service flag) directly on the command line.

Use the -v/--verbose flag (with either the TUI or the CLI) if you want verbose logging of the processes being executed while the application runs.

Example usage

vibe-llama starter # launch a TUI
vibe-llama starter -a 'GitHub Copilot' -s LlamaIndex -v # Select GitHub Copilot and LlamaIndex and enable verbose logging

More commands coming soon!🎉

SDK

vibe-llama also comes with a programmatic interface that you can call from your Python scripts.

VibeLlamaStarter

To replicate the CLI starter command and fetch all the instructions your coding agents need, you can use the following code:

import asyncio

from vibe_llama.sdk import VibeLlamaStarter

starter = VibeLlamaStarter(
    agents=["GitHub Copilot", "Cursor"],
    services=["LlamaIndex", "llama-index-workflows"],
)

# write_instructions is a coroutine: run it with asyncio.run, or await it
# from within an existing event loop
asyncio.run(
    starter.write_instructions(verbose=True, max_retries=20, retry_interval=0.7)
)
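The max_retries and retry_interval parameters suggest a simple retry loop around the documentation fetch: try, and on failure wait retry_interval seconds before trying again, up to max_retries attempts. The sketch below illustrates that pattern in isolation; it is a hypothetical illustration, not the library's actual implementation:

```python
import asyncio

async def fetch_with_retries(fetch, max_retries: int = 20, retry_interval: float = 0.7):
    """Retry an async fetch up to max_retries times, sleeping retry_interval
    seconds between attempts. Re-raises the last error if all attempts fail."""
    last_error = None
    for _ in range(max_retries):
        try:
            return await fetch()
        except Exception as exc:  # in practice, catch the specific network error
            last_error = exc
            await asyncio.sleep(retry_interval)
    raise last_error

# Demo with a stand-in fetcher that fails twice before succeeding
calls = {"n": 0}

async def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "instructions fetched"

result = asyncio.run(fetch_with_retries(flaky_fetch, max_retries=5, retry_interval=0.01))
print(result)  # instructions fetched
```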

Contributing

We welcome contributions! Please read our Contributing Guide to get started.

License

This project is licensed under the MIT License.
