Terminus CLI: An autonomous AI agent for terminal-based task execution

Terminus is an autonomous AI agent for terminal-based task execution. It integrates with the Harbor framework for agent evaluation.

Installation

Prerequisites

  • Python >=3.12
  • tmux (required for terminal session management)
  • Harbor framework installed

Installing tmux

Terminus requires tmux to manage terminal sessions. Install it using your system's package manager:

macOS:

brew install tmux

Ubuntu/Debian:

sudo apt-get install tmux

Fedora:

sudo dnf install tmux

Arch Linux:

sudo pacman -S tmux

Install Terminus

From PyPI (the distribution is published as terminus_ai):

pip install terminus-ai

For local development (editable install):

pip install -e /path/to/terminus

Usage

Command Line Interface

Terminus provides a CLI for quick testing and demonstration:

# Basic usage
terminus "Create a file hello.txt with Hello World"

# With options
terminus "Create a file hello.txt" \
  --model openai/gpt-4o \
  --logs-dir ./logs \
  --parser json \
  --temperature 0.7

# Show help
terminus --help

Note:

  • The CLI runs directly on your local system using tmux (no Docker required)
  • For testing and simple tasks, the CLI is quick and convenient
  • For production use cases and complex evaluations, integration with Harbor is recommended (see below)
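Since the CLI drives a real terminal through tmux, it can help to see what that plumbing looks like. The sketch below only illustrates the general mechanism with hypothetical helper names; it is not Terminus's actual internals.

```python
# Sketch of the tmux plumbing an agent like Terminus relies on.
# These helpers only build tmux argv lists; the helper names are
# illustrative, not Terminus's real API.

def tmux_cmd(*args: str) -> list[str]:
    """Prefix arguments with the tmux binary to form an argv list."""
    return ["tmux", *args]

def new_session(name: str) -> list[str]:
    # -d starts the session detached, so no terminal needs to attach to it
    return tmux_cmd("new-session", "-d", "-s", name)

def send_keys(name: str, command: str) -> list[str]:
    # "Enter" after the command string submits it inside the session
    return tmux_cmd("send-keys", "-t", name, command, "Enter")

def capture_pane(name: str) -> list[str]:
    # -p prints the pane's current contents to stdout for the agent to read
    return tmux_cmd("capture-pane", "-t", name, "-p")

# With tmux installed, these argv lists can be executed via subprocess, e.g.:
#   import subprocess
#   subprocess.run(new_session("work"), check=True)
#   subprocess.run(send_keys("work", "echo hello"), check=True)
#   subprocess.run(capture_pane("work"), capture_output=True, text=True)

print(send_keys("work", "echo hello"))
```

The detached session is what lets the agent issue a command, wait, and then read the pane contents back as observation, without owning the user's terminal.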

Usage with Harbor

Terminus is designed to work as an external agent with Harbor. You can use it in two ways:

Option 1: Using an import path (recommended)

When configuring your Harbor task, use the import path to load the Terminus agent:

agent:
  import_path: "terminus:Terminus"
  model_name: "anthropic/claude-sonnet-4"
  kwargs:
    parser_name: "json"  # or "xml"
    temperature: 0.7
    max_turns: 100
    enable_summarize: true

Or in Python:

from harbor import AgentFactory
from pathlib import Path

agent = AgentFactory.create_agent_from_import_path(
    import_path="terminus:Terminus",
    logs_dir=Path("./logs"),
    model_name="anthropic/claude-sonnet-4",
    parser_name="json",
    temperature=0.7,
)

Option 2: Direct instantiation

from terminus import Terminus
from pathlib import Path

agent = Terminus(
    logs_dir=Path("./logs"),
    model_name="anthropic/claude-sonnet-4",
    parser_name="json",  # or "xml"
    temperature=0.7,
    max_turns=100,
    enable_summarize=True,
)

Configuration Options

  • model_name: The LLM model to use (required)
  • parser_name: Response format, either "json" or "xml" (default: "json")
  • temperature: Sampling temperature (default: 0.7)
  • max_turns: Maximum number of agent turns (default: 1000000, i.e. effectively unlimited)
  • enable_summarize: Summarize the context when the model's context limit is reached (default: True)
  • api_base: Custom API base URL (optional)
  • collect_rollout_details: Collect detailed token-level rollout data (default: False)
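The enable_summarize option matters for long-running tasks, where accumulated terminal output can exceed the model's context window. The sketch below illustrates the general idea only, under assumed names and a toy token counter; it is not Terminus's actual implementation.

```python
# Illustrative sketch of context summarization: once the accumulated
# history exceeds a token budget, older turns are collapsed into a
# single summary so the conversation can continue. All names here are
# hypothetical, not Terminus's real internals.

def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer: ~1 token per whitespace word.
    return len(text.split())

def summarize(turns: list[str]) -> str:
    # Placeholder summarizer; a real agent would call the LLM here.
    return f"[summary of {len(turns)} earlier turns]"

def maybe_summarize(history: list[str], limit: int, keep_recent: int = 2) -> list[str]:
    """Collapse old turns into one summary once the token budget is exceeded."""
    total = sum(count_tokens(turn) for turn in history)
    if total <= limit or len(history) <= keep_recent:
        return history
    old, recent = history[:-keep_recent], history[-keep_recent:]
    return [summarize(old), *recent]

history = [
    "ls -la output ...",
    "cat setup.py output ...",
    "pytest run output with many lines of text here",
    "final edit applied",
]
compact = maybe_summarize(history, limit=10)
print(compact[0])  # → [summary of 2 earlier turns]
```

Keeping the most recent turns verbatim while compressing older ones preserves the immediate working state the agent still needs.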
