
Terminus CLI: An autonomous AI agent for terminal-based task execution


Terminus is an autonomous agent that executes tasks in a terminal. It integrates with the Harbor framework for agent evaluation and task execution.

Installation

Prerequisites

  • Python >=3.12
  • tmux (required for terminal session management)
  • Harbor framework installed

Installing tmux

Terminus requires tmux to manage terminal sessions. Install it using your system's package manager:

macOS:

brew install tmux

Ubuntu/Debian:

sudo apt-get install tmux

Fedora:

sudo dnf install tmux

Arch Linux:

sudo pacman -S tmux

Install Terminus

For local development:

pip install -e /path/to/terminus

Usage

Command Line Interface

Terminus provides a CLI for quick testing and demonstration:

# Basic usage
terminus "Create a file hello.txt with Hello World"

# With options
terminus "Create a file hello.txt" \
  --model openai/gpt-4o \
  --logs-dir ./logs \
  --parser json \
  --temperature 0.7

# Show help
terminus --help

Note:

  • The CLI runs directly on your local system using tmux (no Docker required)
  • For testing and simple tasks, the CLI is quick and convenient
  • For production use cases and complex evaluations, integration with Harbor is recommended (see below)

Usage with Harbor

Terminus is designed to work as an external agent with Harbor. You can use it in several ways:

Option 1: Using import path (Recommended)

When configuring your Harbor task, use the import path to load the Terminus agent:

agent:
  import_path: "terminus:Terminus"
  model_name: "anthropic/claude-sonnet-4"
  kwargs:
    parser_name: "json"  # or "xml"
    temperature: 0.7
    max_turns: 100
    enable_summarize: true

Or in Python:

from harbor import AgentFactory
from pathlib import Path

agent = AgentFactory.create_agent_from_import_path(
    import_path="terminus:Terminus",
    logs_dir=Path("./logs"),
    model_name="anthropic/claude-sonnet-4",
    parser_name="json",
    temperature=0.7,
)

Option 2: Direct instantiation

from terminus import Terminus
from pathlib import Path

agent = Terminus(
    logs_dir=Path("./logs"),
    model_name="anthropic/claude-sonnet-4",
    parser_name="json",  # or "xml"
    temperature=0.7,
    max_turns=100,
    enable_summarize=True,
)

Configuration Options

  • model_name: The LLM to use (required)
  • parser_name: Response format, either "json" or "xml" (default: "json")
  • temperature: Sampling temperature (default: 0.7)
  • max_turns: Maximum number of agent turns (default: 1000000)
  • enable_summarize: Summarize the conversation when the model's context limit is reached (default: True)
  • api_base: Custom API base URL (optional)
  • collect_rollout_details: Collect detailed token-level rollout data (default: False)
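For reference, the documented defaults above can be gathered into a single kwargs dict and spread into either the Harbor agent config or a direct `Terminus(...)` call. The dict below is a hypothetical convenience for illustration, not part of the Terminus API; only the option names and defaults come from the list above.

```python
# Documented Terminus defaults, collected for reuse. Illustrative only:
# the keys mirror the Configuration Options list, nothing more.
TERMINUS_DEFAULTS = {
    "parser_name": "json",             # response format: "json" or "xml"
    "temperature": 0.7,                # sampling temperature
    "max_turns": 1_000_000,            # maximum number of agent turns
    "enable_summarize": True,          # summarize when the context limit is reached
    "collect_rollout_details": False,  # token-level rollout data off by default
}

# Override only the options you care about; the rest keep their defaults.
my_kwargs = {**TERMINUS_DEFAULTS, "temperature": 0.2, "max_turns": 100}
```

You could then pass these through as `Terminus(logs_dir=..., model_name=..., **my_kwargs)` or as the `kwargs` mapping in a Harbor agent config.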


