Terminus CLI: An autonomous AI agent for terminal-based task execution

Project description

Terminus

Terminus is an autonomous agent for terminal-based task execution, and integrates with the Harbor framework for agent evaluation.

Installation

Prerequisites

  • Python >=3.12
  • tmux (required for terminal session management)
  • Harbor framework installed

Installing tmux

Terminus requires tmux to manage terminal sessions. Install it using your system's package manager:

macOS:

brew install tmux

Ubuntu/Debian:

sudo apt-get install tmux

Fedora:

sudo dnf install tmux

Arch Linux:

sudo pacman -S tmux
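If you are not sure whether tmux is already installed, you can check from Python's standard library before running Terminus (the helper below is illustrative, not part of Terminus):

```python
import shutil

def has_command(name: str) -> bool:
    """Return True if an executable with the given name is on PATH."""
    return shutil.which(name) is not None

if not has_command("tmux"):
    print("tmux not found; install it with your system package manager.")
```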

Install Terminus

From PyPI:

pip install terminus-ai

For local development (editable install):

pip install -e /path/to/terminus

Usage

Command Line Interface

Terminus provides a CLI for quick testing and demonstration:

# Basic usage
terminus "Create a file hello.txt with Hello World"

# With options
terminus "Create a file hello.txt" \
  --model openai/gpt-4o \
  --logs-dir ./logs \
  --parser json \
  --temperature 0.7

# Show help
terminus --help

Note:

  • The CLI runs directly on your local system using tmux (no Docker required)
  • For testing and simple tasks, the CLI is quick and convenient
  • For production use cases and complex evaluations, integration with Harbor is recommended (see below)

Usage with Harbor

Terminus is designed to work as an external agent with Harbor. You can load it in either of two ways:

Option 1: Using an import path (recommended)

When configuring your Harbor task, use the import path to load the Terminus agent:

agent:
  import_path: "terminus:Terminus"
  model_name: "anthropic/claude-sonnet-4"
  kwargs:
    parser_name: "json"  # or "xml"
    temperature: 0.7
    max_turns: 100
    enable_summarize: true

Or in Python:

from harbor import AgentFactory
from pathlib import Path

agent = AgentFactory.create_agent_from_import_path(
    import_path="terminus:Terminus",
    logs_dir=Path("./logs"),
    model_name="anthropic/claude-sonnet-4",
    parser_name="json",
    temperature=0.7,
)

Option 2: Direct instantiation

from terminus import Terminus
from pathlib import Path

agent = Terminus(
    logs_dir=Path("./logs"),
    model_name="anthropic/claude-sonnet-4",
    parser_name="json",  # or "xml"
    temperature=0.7,
    max_turns=100,
    enable_summarize=True,
)

Configuration Options

  • model_name: The LLM model to use (required)
  • parser_name: Response format, "json" or "xml" (default: "json")
  • temperature: Sampling temperature (default: 0.7)
  • max_turns: Maximum number of agent turns (default: 1000000)
  • enable_summarize: Enable context summarization when context limits are reached (default: True)
  • api_base: Custom API base URL (optional)
  • collect_rollout_details: Collect detailed token-level rollout data (default: False)
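As a convenience, these options can be gathered from the environment before instantiating the agent. The sketch below is illustrative only: the TERMINUS_* variable names are assumptions, not part of Terminus, and the fallbacks mirror the defaults listed above.

```python
import os

def terminus_kwargs_from_env() -> dict:
    """Build Terminus keyword arguments from (hypothetical) TERMINUS_* env
    vars, falling back to the defaults documented above."""
    kwargs = {
        "model_name": os.environ["TERMINUS_MODEL"],  # required, no default
        "parser_name": os.environ.get("TERMINUS_PARSER", "json"),
        "temperature": float(os.environ.get("TERMINUS_TEMPERATURE", "0.7")),
        "max_turns": int(os.environ.get("TERMINUS_MAX_TURNS", "1000000")),
        "enable_summarize": os.environ.get("TERMINUS_SUMMARIZE", "1") == "1",
    }
    api_base = os.environ.get("TERMINUS_API_BASE")  # optional
    if api_base:
        kwargs["api_base"] = api_base
    return kwargs
```

The resulting dict can then be passed as keyword arguments to Terminus(...) or AgentFactory.create_agent_from_import_path(...).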

Project details


Download files

Source Distribution

terminus_ai-2.0.1.tar.gz (33.8 kB)

Built Distribution

terminus_ai-2.0.1-py3-none-any.whl (38.1 kB)

File details

Details for the file terminus_ai-2.0.1.tar.gz.

File metadata

  • Filename: terminus_ai-2.0.1.tar.gz
  • Size: 33.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.8

File hashes

Hashes for terminus_ai-2.0.1.tar.gz:

  • SHA256: e7ba1f4845c789b248221d968b9f31e88994764b6d17137f7e5b4b9c717053aa
  • MD5: bf4fb0a48b29b07dc8e265459426f273
  • BLAKE2b-256: 45afac8318d643e970c3f6626e0b966364f76777cce322b26deac7fb476811a2
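To check a downloaded file against the SHA256 digest above, the standard library suffices (a generic sketch, not Terminus-specific):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in 8 KiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result for terminus_ai-2.0.1.tar.gz with the SHA256 value listed above.
```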

File details

Details for the file terminus_ai-2.0.1-py3-none-any.whl.

File hashes

Hashes for terminus_ai-2.0.1-py3-none-any.whl:

  • SHA256: d33a3f57fe4f15cadf46f29caec0b52cbf9ecb0fba3a1043784e145352287b6c
  • MD5: af78e948996c6ca4ae1776feec87801b
  • BLAKE2b-256: b2624d94d5e3e3f7b472e22753393f34297c7b0625d2e8aa9731336900fe7195
