Terminus CLI: An autonomous AI agent for terminal-based task execution

Project description

Terminus

Terminus is an autonomous agent for terminal-based task execution. It integrates with the Harbor framework for agent evaluation.

Installation

Prerequisites

  • Python >=3.12
  • tmux (required for terminal session management)
  • Harbor framework installed

Installing tmux

Terminus requires tmux to manage terminal sessions. Install it using your system's package manager:

macOS:

brew install tmux

Ubuntu/Debian:

sudo apt-get install tmux

Fedora:

sudo dnf install tmux

Arch Linux:

sudo pacman -S tmux
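After installing, you can confirm tmux is available on your PATH before running Terminus:

```shell
# Print the installed tmux version; Terminus relies on this binary being present.
tmux -V || echo "tmux not installed"
```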

Install Terminus

For local development:

pip install -e /path/to/terminus

Usage

Command Line Interface

Terminus provides a CLI for quick testing and demonstration:

# Basic usage
terminus "Create a file hello.txt with Hello World"

# With options
terminus "Create a file hello.txt" \
  --model openai/gpt-4o \
  --logs-dir ./logs \
  --parser json \
  --temperature 0.7

# Show help
terminus --help

Note:

  • The CLI runs directly on your local system using tmux (no Docker required)
  • For testing and simple tasks, the CLI is quick and convenient
  • For production use cases and complex evaluations, integration with Harbor is recommended (see below)

Usage with Harbor

Terminus is designed to work as an external agent with Harbor. You can use it in several ways:

Option 1: Using import path (Recommended)

When configuring your Harbor task, use the import path to load the Terminus agent:

agent:
  import_path: "terminus:Terminus"
  model_name: "anthropic/claude-sonnet-4"
  kwargs:
    parser_name: "json"  # or "xml"
    temperature: 0.7
    max_turns: 100
    enable_summarize: true

Or in Python:

from harbor import AgentFactory
from pathlib import Path

agent = AgentFactory.create_agent_from_import_path(
    import_path="terminus:Terminus",
    logs_dir=Path("./logs"),
    model_name="anthropic/claude-sonnet-4",
    parser_name="json",
    temperature=0.7,
)

Option 2: Direct instantiation

from terminus import Terminus
from pathlib import Path

agent = Terminus(
    logs_dir=Path("./logs"),
    model_name="anthropic/claude-sonnet-4",
    parser_name="json",  # or "xml"
    temperature=0.7,
    max_turns=100,
    enable_summarize=True,
)

Configuration Options

  • model_name: The LLM model to use (required)
  • parser_name: Response format - "json" or "xml" (default: "json")
  • temperature: Sampling temperature (default: 0.7)
  • max_turns: Maximum number of agent turns (default: 1000000)
  • enable_summarize: Enable context summarization when limits are reached (default: True)
  • api_base: Custom API base URL (optional)
  • collect_rollout_details: Collect detailed token-level rollout data (default: False)
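The defaults above can be collected in a plain dict and merged with per-run overrides before being passed to Terminus (or to Harbor's kwargs). This is only an organizational sketch; the make_agent_kwargs helper is not part of the package:

```python
from pathlib import Path

# Documented defaults from the option list above.
TERMINUS_DEFAULTS = {
    "parser_name": "json",          # or "xml"
    "temperature": 0.7,
    "max_turns": 1_000_000,
    "enable_summarize": True,
    "collect_rollout_details": False,
}

def make_agent_kwargs(model_name, logs_dir, **overrides):
    """Merge per-run overrides on top of the documented defaults."""
    kwargs = {**TERMINUS_DEFAULTS, "model_name": model_name, "logs_dir": Path(logs_dir)}
    kwargs.update(overrides)
    return kwargs

# Usage: Terminus(**make_agent_kwargs("anthropic/claude-sonnet-4", "./logs", temperature=0.2))
```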

Download files

Download the file for your platform.

Source Distribution

terminus_ai-1.0.0.tar.gz (33.6 kB)


Built Distribution


terminus_ai-1.0.0-py3-none-any.whl (38.1 kB)


File details

Details for the file terminus_ai-1.0.0.tar.gz.

File metadata

  • Download URL: terminus_ai-1.0.0.tar.gz
  • Size: 33.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.8

File hashes

Hashes for terminus_ai-1.0.0.tar.gz:

  • SHA256: f7430a6ec739a1846090d7aa9fa44d8a3b6450981b4f301ab2766f9f845bfa75
  • MD5: b0df0d6516e6da3d6d5d87eb8e81c165
  • BLAKE2b-256: b8d34971540217ea8166e14abeab2e15227f8fe7858abc80582c70a85de6d004


File details

Details for the file terminus_ai-1.0.0-py3-none-any.whl.

File hashes

Hashes for terminus_ai-1.0.0-py3-none-any.whl:

  • SHA256: 16b8d2c38de9c8f4d380efc12b6f317248ba2c405c7166c43eeae616f5df3f8c
  • MD5: 67d08862eeebbadfbcbd706393db0764
  • BLAKE2b-256: 6de68feec26f7e80ea535a283ae700f12689ecd69e455868b499337d327f1c7d

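Before installing a manually downloaded file, you can check it against the published SHA256 digest using only the standard library (a generic sketch, not Terminus-specific):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its hex SHA-256 digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the digest published above, e.g.:
# assert sha256_of("terminus_ai-1.0.0.tar.gz") == "f7430a6e..."
```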
