Terminus CLI: An autonomous AI agent for terminal-based task execution

Project description

Terminus

Terminus is an autonomous agent for terminal-based task execution. It integrates with the Harbor framework for agent evaluation.

Installation

Prerequisites

  • Python >=3.12
  • tmux (required for terminal session management)
  • Harbor framework installed

Installing tmux

Terminus requires tmux to manage terminal sessions. Install it using your system's package manager:

macOS:

brew install tmux

Ubuntu/Debian:

sudo apt-get install tmux

Fedora:

sudo dnf install tmux

Arch Linux:

sudo pacman -S tmux

Install Terminus

For local development:

pip install -e /path/to/terminus

Usage

Command Line Interface

Terminus provides a CLI for quick testing and demonstration:

# Basic usage
terminus "Create a file hello.txt with Hello World"

# With options
terminus "Create a file hello.txt" \
  --model openai/gpt-4o \
  --logs-dir ./logs \
  --parser json \
  --temperature 0.7

# Show help
terminus --help

Note:

  • The CLI runs directly on your local system using tmux (no Docker required)
  • For testing and simple tasks, the CLI is quick and convenient
  • For production use cases and complex evaluations, integration with Harbor is recommended (see below)

Usage with Harbor

Terminus is designed to work as an external agent with Harbor. You can use it in several ways:

Option 1: Using an import path (recommended)

When configuring your Harbor task, use the import path to load the Terminus agent:

agent:
  import_path: "terminus:Terminus"
  model_name: "anthropic/claude-sonnet-4"
  kwargs:
    parser_name: "json"  # or "xml"
    temperature: 0.7
    max_turns: 100
    enable_summarize: true

Or in Python:

from harbor import AgentFactory
from pathlib import Path

agent = AgentFactory.create_agent_from_import_path(
    import_path="terminus:Terminus",
    logs_dir=Path("./logs"),
    model_name="anthropic/claude-sonnet-4",
    parser_name="json",
    temperature=0.7,
)

Option 2: Direct instantiation

from terminus import Terminus
from pathlib import Path

agent = Terminus(
    logs_dir=Path("./logs"),
    model_name="anthropic/claude-sonnet-4",
    parser_name="json",  # or "xml"
    temperature=0.7,
    max_turns=100,
    enable_summarize=True,
)

Configuration Options

  • model_name: The LLM model to use (required)
  • parser_name: Response format - "json" or "xml" (default: "json")
  • temperature: Sampling temperature (default: 0.7)
  • max_turns: Maximum number of agent turns (default: 1000000)
  • enable_summarize: Enable context summarization when limits are reached (default: True)
  • api_base: Custom API base URL (optional)
  • collect_rollout_details: Collect detailed token-level rollout data (default: False)
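The parser_name option controls how the agent's model responses are interpreted. The exact response schema is internal to Terminus, but as a rough sketch of what the two formats imply (the `command`/`done` payload below is a hypothetical illustration, not the agent's actual schema), the difference can be shown with the standard library:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical agent responses; the real Terminus schema may differ.
json_response = '{"command": "ls -la", "done": false}'
xml_response = "<response><command>ls -la</command><done>false</done></response>"

# parser_name="json": the model is expected to reply with a JSON object.
parsed = json.loads(json_response)
print(parsed["command"])  # ls -la

# parser_name="xml": the model is expected to reply with XML tags instead.
root = ET.fromstring(xml_response)
print(root.find("command").text)  # ls -la
```

JSON is the default; the XML parser is an alternative for models that produce tagged output more reliably than strict JSON.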

Download files

Download the file for your platform.

Source Distribution

terminus_ai-2.0.0.tar.gz (33.8 kB)

Built Distribution


terminus_ai-2.0.0-py3-none-any.whl (38.2 kB)

File details

Details for the file terminus_ai-2.0.0.tar.gz.

File metadata

  • Download URL: terminus_ai-2.0.0.tar.gz
  • Size: 33.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.8

File hashes

Hashes for terminus_ai-2.0.0.tar.gz:

  • SHA256: 014ea46ad98511f8c4852e4c787f75ba68194ab1a9da65c0aa9d2ebeaeb02f74
  • MD5: 1f03c48d7314655def1431e4a5e680e5
  • BLAKE2b-256: ede4f4766ac6483057844121fab0181da761e472a84625def554f6abd40d7a34
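A downloaded artifact can be checked against the published SHA256 digest with Python's standard library. The file path below is a placeholder for wherever you saved the archive:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "014ea46ad98511f8c4852e4c787f75ba68194ab1a9da65c0aa9d2ebeaeb02f74"
# assert sha256_of("terminus_ai-2.0.0.tar.gz") == expected
```

Alternatively, pip's hash-checking mode (`pip install --require-hashes -r requirements.txt`) performs the same verification automatically.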


File details

Details for the file terminus_ai-2.0.0-py3-none-any.whl.

File hashes

Hashes for terminus_ai-2.0.0-py3-none-any.whl:

  • SHA256: f89a97504ae192d95e6154fb049249eccae41b96ed045ecc6e9ce1756798cfc1
  • MD5: 62a9d810dc387716aa00bf01b7bedc2b
  • BLAKE2b-256: bcde28b4559b6e9bf6d6b8fbf0b197d7efd877c0db6e085e016342db10a890e3

