
CLI tool for Luma Dream Machine AI Video Generation via AceDataCloud API


Luma CLI


A command-line tool for AI video generation using Luma through the AceDataCloud API.

Generate AI videos directly from your terminal — no MCP client required.

Features

  • Video Generation — Generate videos from text prompts with multiple models
  • Image-to-Video — Create videos from reference images
  • Video Extension — Extend existing videos
  • Multiple Models — Choose the Luma model version with --model (default: luma)
  • Task Management — Query tasks, batch query, wait with polling
  • Rich Output — Beautiful terminal tables and panels via Rich
  • JSON Mode — Machine-readable output with --json for piping

Quick Start

1. Get API Token

Get your API token from AceDataCloud Platform:

  1. Sign up or log in
  2. Navigate to the Luma API page
  3. Click "Acquire" to get your token

2. Install

# Install with pip
pip install luma-pro-cli

# Or with uv (recommended)
uv pip install luma-pro-cli

# Or from source
git clone https://github.com/AceDataCloud/LumaCli.git
cd LumaCli
pip install -e .

3. Configure

# Set your API token
export ACEDATACLOUD_API_TOKEN=your_token_here

# Or use .env file
cp .env.example .env
# Edit .env with your token

4. Use

# Generate a video
luma generate "A test video"

# Generate from reference image
luma image-to-video "Animate this scene" -i https://example.com/photo.jpg

# Extend a video
luma extend <video-id>

# Check task status
luma task <task-id>

# Wait for completion
luma wait <task-id> --interval 5

# List available models
luma models
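Under the hood, `luma wait` repeatedly queries the task until it reaches a terminal state, sleeping `--interval` seconds between polls. A minimal sketch of that loop in Python (the `fetch_status` callable and the status names here are illustrative placeholders, not the CLI's actual internals or the API's real state values):

```python
import time

def wait_for_task(fetch_status, task_id, interval=5, timeout=1800):
    """Poll fetch_status(task_id) until a terminal state or timeout.

    fetch_status is any callable returning a status string such as
    "processing", "succeeded", or "failed" (names are illustrative --
    check the AceDataCloud API response for the real values).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status(task_id)
        if status in ("succeeded", "failed"):
            return status
        time.sleep(interval)
    raise TimeoutError(f"task {task_id} did not finish within {timeout}s")

# Simulated backend: reports "processing" once, then "succeeded".
states = iter(["processing", "succeeded"])
print(wait_for_task(lambda _: next(states), "task-123", interval=0))  # prints "succeeded"
```

The real command also honors `LUMA_REQUEST_TIMEOUT` as an overall deadline, which is what the `timeout` parameter stands in for here.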

Commands

Command                                  Description
luma generate <prompt>                   Generate a video from a text prompt
luma image-to-video <prompt> -i <url>    Generate a video from reference image(s)
luma extend <video_id>                   Extend an existing video
luma task <task_id>                      Query a single task's status
luma tasks <id1> <id2>...                Query multiple tasks at once
luma wait <task_id>                      Wait for task completion with polling
luma models                              List available Luma models
luma config                              Show current configuration
luma aspect-ratios                       List available aspect ratios

Global Options

--token TEXT    API token (or set ACEDATACLOUD_API_TOKEN env var)
--version       Show version
--help          Show help message

Most commands support:

--json          Output raw JSON (for piping/scripting)
--model TEXT    Luma model version (default: luma)
--timeout INT   Request timeout in seconds (how long to wait for the API response)
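With --json, a command's output becomes machine-readable and can be captured by a script. The exact response schema is defined by the AceDataCloud API; the field names below (`task_id`, `state`) are placeholders for illustration only:

```python
import json

# Stand-in for output captured from e.g. `luma generate "..." --json`.
# The real field names depend on the AceDataCloud response schema.
raw = '{"task_id": "abc123", "state": "pending"}'

payload = json.loads(raw)
print(payload["task_id"])  # the ID you would then pass to `luma wait`
```

In a shell pipeline, the same extraction is typically done with a tool like jq.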

Available Models

Model   Version    Notes
luma    Standard   Standard-quality video generation (default)

Configuration

Environment Variables

Variable                    Description                    Default
ACEDATACLOUD_API_TOKEN      API token from AceDataCloud    Required
ACEDATACLOUD_API_BASE_URL   API base URL                   https://api.acedata.cloud
LUMA_DEFAULT_MODEL          Default model                  luma
LUMA_REQUEST_TIMEOUT        Request timeout in seconds     1800
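Resolving these variables with their documented defaults amounts to a few `os.environ` lookups. A sketch of that precedence (the names and defaults mirror the table above; the `load_config` helper itself is illustrative, not the CLI's actual config module):

```python
import os

def load_config(env=os.environ):
    """Resolve settings from environment variables, applying the documented defaults."""
    token = env.get("ACEDATACLOUD_API_TOKEN")
    if not token:
        raise RuntimeError("ACEDATACLOUD_API_TOKEN is required")
    return {
        "token": token,
        "base_url": env.get("ACEDATACLOUD_API_BASE_URL", "https://api.acedata.cloud"),
        "model": env.get("LUMA_DEFAULT_MODEL", "luma"),
        "timeout": int(env.get("LUMA_REQUEST_TIMEOUT", "1800")),
    }

cfg = load_config({"ACEDATACLOUD_API_TOKEN": "your_token_here"})
print(cfg["base_url"], cfg["model"], cfg["timeout"])
```

Values from a `.env` file are loaded into the environment first, so they flow through the same lookups.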

Development

Setup Development Environment

git clone https://github.com/AceDataCloud/LumaCli.git
cd LumaCli
python -m venv .venv
source .venv/bin/activate
pip install -e ".[dev,test]"

Run Tests

pytest
pytest --cov=luma_cli
pytest tests/test_integration.py -m integration

Code Quality

ruff format .
ruff check .
mypy luma_cli

Docker

docker pull ghcr.io/acedatacloud/luma-pro-cli:latest
docker run --rm -e ACEDATACLOUD_API_TOKEN=your_token \
  ghcr.io/acedatacloud/luma-pro-cli generate "A test video"

Project Structure

LumaCli/
├── luma_cli/                # Main package
│   ├── __init__.py
│   ├── __main__.py            # python -m luma_cli entry point
│   ├── main.py                # CLI entry point
│   ├── core/                  # Core modules
│   │   ├── client.py          # HTTP client for Luma API
│   │   ├── config.py          # Configuration management
│   │   ├── exceptions.py      # Custom exceptions
│   │   └── output.py          # Rich terminal formatting
│   └── commands/              # CLI command groups
│       ├── video.py           # Video generation commands
│       ├── task.py            # Task management commands
│       └── info.py            # Info & utility commands
├── tests/                     # Test suite
├── .github/workflows/         # CI/CD (lint, test, publish to PyPI)
├── Dockerfile                 # Container image
├── deploy/                    # Kubernetes deployment configs
├── .env.example               # Environment template
├── pyproject.toml             # Project configuration
└── README.md

Luma CLI vs Luma MCP

Feature      Luma CLI                        Luma MCP
Interface    Terminal commands               MCP protocol
Usage        Direct shell, scripts, CI/CD    Claude, VS Code, MCP clients
Output       Rich tables / JSON              Structured MCP responses
Automation   Shell scripts, piping           AI agent workflows
Install      pip install luma-pro-cli        pip install mcp-luma

Both tools use the same AceDataCloud API and share the same API token.

Contributing

Contributions are welcome! Please:

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing)
  5. Open a Pull Request

Development Requirements

  • Python 3.10+
  • Dependencies: pip install -e ".[all]"
  • Lint: ruff check . && ruff format --check .
  • Test: pytest

License

This project is licensed under the MIT License — see the LICENSE file for details.
