Radio Astronomy AI Assistant

Astro

Beta Release — Astro is currently in active development. Core functionality is stable, but the API may evolve before the 1.0 release.

Astro is a chat-first CLI and Python library for interacting with AI agents powered by multiple LLM backends (OpenAI, Anthropic, Ollama). It provides an interactive terminal interface with rich formatting, model selection, and custom tool integration capabilities.

Installation

Astro requires Python >=3.11 and is distributed via PyPI. We recommend uv for installation and dependency management, but any standard packaging tool will work.

As a Command-Line Tool

Install Astro globally as a CLI application.

Using uv:

uv tool install astro

Using pip:

pip install astro

Using pipx:

pipx install astro

Once installed, launch the interactive shell:

astro-cli

As a Python Library

Add Astro to your project dependencies.

Using uv with pyproject.toml:

uv add astro

Using uv without pyproject.toml (bare .venv):

uv pip install astro

Using pip:

pip install astro

Environment Configuration

Astro loads LLM API keys from multiple sources (in order of precedence):

  1. Local .env file — Place a .env file in your working directory:

    OPENAI_API_KEY=sk-...
    ANTHROPIC_API_KEY=sk-ant-...
    
  2. User secrets directory — Store keys in $HOME/.astro/.secrets:

    mkdir -p ~/.astro # Astro will create this directory if it does not exist
    echo "OPENAI_API_KEY=sk-..." >> ~/.astro/.secrets
    
  3. Shell environment variables — Export keys in your shell configuration (e.g., .bashrc, .zshrc) or manually in the terminal:

    export OPENAI_API_KEY="sk-..."
    export ANTHROPIC_API_KEY="sk-ant-..."
    

Ollama Setup

For local models via Ollama, ensure the Ollama service is running:

# Install Ollama (see https://ollama.ai)
ollama serve

# Check instance status
ollama ps

Astro automatically detects local Ollama models, but only models that have already been pulled are available. To pull a specific model:

ollama pull llama3.1:latest # or any other model

See ollama.com/library for the full list of available models.
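
You can check which models Astro will find by querying Ollama's local HTTP API yourself. The sketch below uses Ollama's GET /api/tags endpoint; `parse_model_names` and `pulled_models` are illustrative helpers, not Astro functions:

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"  # Ollama's default local endpoint

def parse_model_names(tags_payload: dict) -> list[str]:
    """Extract model names from the JSON returned by GET /api/tags."""
    return [m["name"] for m in tags_payload.get("models", [])]

def pulled_models() -> list[str]:
    """Return locally pulled Ollama model names, or [] if the service is down."""
    try:
        with urlopen(OLLAMA_TAGS_URL, timeout=2) as resp:
            return parse_model_names(json.load(resp))
    except (URLError, OSError):
        return []
```

An empty list from `pulled_models()` means either nothing has been pulled yet or `ollama serve` is not running.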

Usage

Astro offers three primary usage patterns depending on your needs:

1. Interactive CLI

Launch the interactive shell after installation:

# If installed as a tool
astro-cli

# Or run directly with uv
uv run astro-cli

Inside the shell, you can:

  • Chat with AI models using natural language
  • Switch models with /model
  • View available commands with /help
  • Exit with /quit or Ctrl+C

NOTE: Hashtag commands (e.g. #history) exist but are not yet implemented; at the moment they do nothing.

2. Custom Tools Integration

Use run_astro_with to launch the interactive chat with your own Python functions injected as agent tools:

from astro import run_astro_with

def calculate_orbit(period_days: float, semi_major_axis_au: float) -> dict:
    """Calculate orbital parameters for a celestial body."""
    # Your implementation here
    return {"period": period_days, "axis": semi_major_axis_au}

def list_telescopes() -> list[str]:
    """Return available telescope identifiers."""
    return ["VLT", "ALMA", "JWST"]

# Launch interactive CLI with custom tools
run_astro_with(
    items=[calculate_orbit, list_telescopes],
    instructions="You are an astronomy assistant with access to orbital calculations."
)

See examples/custom_tools/ for a complete example.

3. Direct Agent API

Build custom applications using the agent primitives directly:

import asyncio

from astro.agents.chat import create_astro_stream
from astro.agents.base import create_agent
from astro.contexts import ChatContext

# Create a streaming chat agent
stream_fn, message_history = create_astro_stream(
    identifier="openai:gpt-4o",
    tools=None,  # or pass your tools
    instructions="Custom system instructions here"
)

# Stream responses
async def chat():
    async for output in stream_fn("What is the diameter of Mars?"):
        # Handle different output types (text, tool calls, etc.)
        print(output)

asyncio.run(chat())

# Or create a custom agent from scratch
agent = create_agent(
    identifier="ollama:llama3.1:latest",
    context_type=ChatContext,
    tools=[your_tools],
    agent_name="my_agent"
)

Examples

The examples/ directory contains working demonstrations:

Custom Tools Demo

Clone the repository and run the observatory tools example:

git clone https://github.com/kwazzi-jack/astro.git
cd astro
uv sync
uv run python examples/custom_tools/run_custom_cli.py

This demo shows how to:

  • Define custom tool functions
  • Register observatories with site metadata
  • Inject tools into the Astro CLI
  • Use natural language to query domain-specific data

Try these prompts in the demo CLI:

  1. "List available observatories and their specialties"
  2. "Describe the summit-array site"
  3. "Schedule a 40-minute observation of Vega with two exposures"

Beta Limitations

This is an early beta release focused on core functionality:

  • ✅ Multi-model LLM support (OpenAI, Anthropic, Ollama)
  • ✅ Interactive CLI with rich formatting
  • ✅ Custom tool integration
  • ✅ Direct agent API access
  • 🚧 Full API documentation (TODO)
  • 🚧 Comprehensive test coverage (planned)
  • 🚧 CI/CD automation (planned)

Download files

Download the file for your platform.

Source Distribution

ratt_astro-0.1.0b1.tar.gz (91.8 kB)

Built Distribution

ratt_astro-0.1.0b1-py3-none-any.whl (81.9 kB)

File details

Details for the file ratt_astro-0.1.0b1.tar.gz.

File metadata

  • Download URL: ratt_astro-0.1.0b1.tar.gz
  • Upload date:
  • Size: 91.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.11 {"installer":{"name":"uv","version":"0.9.11"},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}

File hashes

Hashes for ratt_astro-0.1.0b1.tar.gz:

  • SHA256: 6b76952fe81c1b54d780f35d20714472f4a458e82b8ecfea96bf97620eec33fa
  • MD5: 162c13c3c0354d6bebffd829e39e3147
  • BLAKE2b-256: c293d1278b1004666387a136f9d04a2207151e0f9e2342a5beb70882bf5beed5

File details

Details for the file ratt_astro-0.1.0b1-py3-none-any.whl.

File metadata

  • Download URL: ratt_astro-0.1.0b1-py3-none-any.whl
  • Upload date:
  • Size: 81.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.11 {"installer":{"name":"uv","version":"0.9.11"},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}

File hashes

Hashes for ratt_astro-0.1.0b1-py3-none-any.whl:

  • SHA256: 520f0ad86467e32b878672820f8ab2f98b273110776e5abf2c7166b814539e64
  • MD5: 6aba06a0cb32cf5ea535f726dadbd614
  • BLAKE2b-256: babde0dfd84a4170be7af02f5f0c69e4d93a82c3c42d94882e40c165e7c13d4a
