
Terminus CLI: An autonomous AI agent for terminal-based task execution


Terminus CLI

Terminus CLI is a command-line agent for terminal-based task execution. It can be used as a standalone tool, although it was primarily designed as a research-preview agent for evaluating how well language models can power autonomous agents in the terminal.

Note that Terminus CLI is a fork of the Terminus-2 agent, a built-in agent in terminal-bench. With the openai/gpt-5 model, Terminus CLI 2.0.0 performs slightly better than Terminus-2 on the terminal-bench@2.0.0 benchmark. Unlike Terminus-2, which by design runs its control logic outside the task's container, Terminus CLI is a standalone library that runs entirely inside the task's container.

Installation

Prerequisites

  • Python >=3.12
  • tmux (required for terminal session management)

Installing tmux

Terminus requires tmux to manage terminal sessions. Install it using your system's package manager:

macOS:

brew install tmux

Ubuntu/Debian:

sudo apt-get install tmux

Fedora:

sudo dnf install tmux

Arch Linux:

sudo pacman -S tmux
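Since tmux is a hard requirement, it can be worth verifying it is on your PATH before running Terminus. The snippet below is a plain standard-library check, not part of Terminus itself:

```python
import shutil


def has_tmux() -> bool:
    """Return True if the tmux binary is discoverable on PATH."""
    return shutil.which("tmux") is not None


if __name__ == "__main__":
    if has_tmux():
        print("tmux found at:", shutil.which("tmux"))
    else:
        print("tmux not found; install it with your package manager")
```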

Install Terminus

uv tool install terminus-ai

or

pip install terminus-ai

Usage

Command Line Interface

Terminus provides a CLI for quick testing and demonstration:

# Basic usage
terminus "Create a file hello.txt with Hello World"

# With options
terminus "Create a file hello.txt" \
  --model openai/gpt-4o \
  --logs-dir ./logs \
  --parser json \
  --temperature 0.7

# Show help
terminus --help

Note:

  • The CLI runs directly on your local system using tmux (no Docker required)
  • Perfect for quick tasks, testing, and automation
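The model names above follow the "provider/model" convention, which suggests API keys are read from the usual provider environment variables. This is an assumption, not something the Terminus documentation states; a hypothetical setup might look like:

```shell
# Assumption: keys are picked up from the standard provider environment
# variables (as is common for tools using "provider/model" strings).
export OPENAI_API_KEY="sk-your-key-here"        # for openai/... models
export ANTHROPIC_API_KEY="your-anthropic-key"   # for anthropic/... models
```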

Programmatic Usage

You can also use Terminus programmatically in Python:

from terminus import Terminus
from pathlib import Path

agent = Terminus(
    logs_dir=Path("./logs"),
    model_name="anthropic/claude-sonnet-4",
    parser_name="json",  # or "xml"
    temperature=0.7,
    max_turns=100,
    enable_summarize=True,
)

Configuration Options

  • model_name: The LLM model to use (required)
  • parser_name: Response format - "json" or "xml" (default: "json")
  • temperature: Sampling temperature (default: 0.7)
  • max_turns: Maximum number of agent turns (default: 1000000)
  • enable_summarize: Enable context summarization when limits are reached (default: True)
  • api_base: Custom API base URL (optional)
  • collect_rollout_details: Collect detailed token-level rollout data (default: False)



Download files

Download the file for your platform.

Source Distribution

terminus_ai-2.0.4.tar.gz (34.0 kB)


Built Distribution


terminus_ai-2.0.4-py3-none-any.whl (38.3 kB)


File details

Details for the file terminus_ai-2.0.4.tar.gz.

File metadata

  • Download URL: terminus_ai-2.0.4.tar.gz
  • Size: 34.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.8

File hashes

Hashes for terminus_ai-2.0.4.tar.gz:

  • SHA256: 261e40aa7a82bfc9fc38cfdc057e32d423889b2602485d4bdd2aa8ab949c5dc7
  • MD5: 38d9c931f343916f4a7eaeebbccff66b
  • BLAKE2b-256: 8be636c6da8ec7cd44762bc4dad3737a3e3f03be0dd9f0cea2a5bb2e0e9d0b01


File details

Details for the file terminus_ai-2.0.4-py3-none-any.whl.

File hashes

Hashes for terminus_ai-2.0.4-py3-none-any.whl:

  • SHA256: 6298a78fdf85418e132674157132311ea26f14fd7891baae71c1d84186e921ac
  • MD5: 942791dc9aa3b79e1b286abc79b46d13
  • BLAKE2b-256: 15fb31488e5a3b081017e729b185f40ab9846dad1b1b68b13cb954785e36d7f4

