ChatATP CLI - Terminal Interface for ChatATP API

ChatATP CLI

 ██████╗██╗  ██╗ █████╗ ████████╗ █████╗ ████████╗██████╗ 
██╔════╝██║  ██║██╔══██╗╚══██╔══╝██╔══██╗╚══██╔══╝██╔══██╗
██║     ███████║███████║   ██║   ███████║   ██║   ██████╔╝
██║     ██╔══██║██╔══██║   ██║   ██╔══██║   ██║   ██╔═══╝ 
╚██████╗██║  ██║██║  ██║   ██║   ██║  ██║   ██║   ██║     
 ╚═════╝╚═╝  ╚═╝╚═╝  ╚═╝   ╚═╝   ╚═╝  ╚═╝   ╚═╝   ╚═╝     

A powerful terminal interface for the ChatATP API, built with Python. Interact with ChatATP's AI models, manage chatrooms, toolkits, integrations, and more directly from your command line.

✨ What's New (v1.1.1)

🤖 Agent Mode Support

  • Device Agent: CLI can now act as an agent on your local device
  • MCP Tool Execution: Automatically discovers and executes local MCP tools
  • Device Tool Calls: AI can request local tool execution via execution_type='device'
  • Seamless Integration: Tool results flow back to AI for continued conversation

🚀 Agentic Loop Support

  • Interactive Conversations: Back-and-forth chat with persistent context
  • Real-time Streaming: Responses appear as they're generated with tool call visualization
  • Auto-enter Chatrooms: chat new now creates AND enters chat automatically
  • In-chat Commands: Rich command system during conversations (/exit, /help, /clear, /history)

🔄 Chat Evolution

  • Before: One-shot messages that exit immediately
  • Now: Persistent interactive sessions with agentic conversations
  • Streaming: Tool calls, thinking blocks, and responses all visualized in real-time

Features

  • Account Management: View your ChatATP account information
  • Model Management: List and manage AI models
  • Interactive Chat: Agentic loop support with back-and-forth conversations
  • Toolkit Management: Browse and manage your toolkits and collections
  • Integration Management: Manage OAuth and custom integrations
  • AI Configuration: Configure AI providers, models, and settings
  • Media Management: Browse and manage uploaded media files
  • Store Access: Browse featured, popular, and recommended toolkits
  • MCP Support: Manage MCP servers and connections

Installation

From PyPI (Recommended)

pip install chatatp-cli

From Source

  1. Clone or download this repository

  2. Install dependencies:

    pip install -r requirements.txt
    
  3. Make the script executable (optional):

    chmod +x main.py
    

Quick Start

When you run the CLI, you'll see the beautiful ASCII banner:

 ██████╗██╗  ██╗ █████╗ ████████╗ █████╗ ████████╗██████╗ 
██╔════╝██║  ██║██╔══██╗╚══██╔══╝██╔══██╗╚══██╔══╝██╔══██╗
██║     ███████║███████║   ██║   ███████║   ██║   ██████╔╝
██║     ██╔══██║██╔══██║   ██║   ██╔══██║   ██║   ██╔═══╝ 
╚██████╗██║  ██║██║  ██║   ██║   ██║  ██║   ██║   ██║     
 ╚═════╝╚═╝  ╚═╝╚═╝  ╚═╝   ╚═╝   ╚═╝  ╚═╝   ╚═╝   ╚═╝     

Then get help with:

chatatp --help
# or
python main.py --help

Configuration

Before using the CLI, you need to configure your API token:

chatatp config set-token YOUR_API_TOKEN_HERE

You can also configure other settings:

# Set custom API base URL (default: https://api.chat-atp.com)
chatatp config set-base-url https://your-custom-api-url.com

# Set default model
chatatp config set-default-model gpt-oss-120b

# View current configuration
chatatp config show

🔔 Desktop Notifications

The ChatATP CLI now supports cross-platform desktop notifications with sound alerts! Get notified when AI starts responding, without keeping your terminal window focused.

✅ Cross-Platform Support

Works on any operating system:

  • Windows: Native toast notifications + system sounds
  • macOS: Native notifications + system sounds
  • Linux: System tray notifications + audio playback
  • Fallback: System bell on unsupported systems

🔊 Notification Triggers

  • 🤖 AI Response Start: Desktop notification when AI begins typing/responding
  • 🔧 Tool Completion: Optional notifications for tool execution results
  • ❌ Error Alerts: Sound notifications for connection issues

⚙️ Configuration

# Enable desktop notifications
chatatp config enable-notifications

# Disable desktop notifications  
chatatp config disable-notifications

# Enable sound alerts (with notifications)
chatatp config enable-sound

# Disable sound alerts
chatatp config disable-sound

📱 Notification Experience

When chatting interactively:

$ chatatp chat new "Tell me about AI"

Chatroom created: abc123
Entered chatroom: abc123
Type your message or '/exit' to quit, '/help' for commands

You: Tell me about AI

# 🔔 Desktop notification appears: "ChatATP is thinking..."
# 🔊 Sound plays (if enabled)
# 💻 Terminal shows response as it streams

ChatATP ·

AI, or Artificial Intelligence, refers to the simulation of human intelligence...

🔇 Privacy & Control

  • Opt-in by default: Notifications disabled until you enable them
  • Sound optional: Enable notifications without sound, or both together
  • Non-intrusive: Only triggers on AI responses, not every message
  • Cross-platform: Same experience regardless of your OS

🛠️ Technical Details

  • Uses plyer library for cross-platform compatibility
  • Sound support via system audio (Windows/macOS/Linux)
  • Fallback to terminal bell if audio unavailable
  • No external dependencies for basic notifications
  • Configuration persists across sessions
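
As a rough illustration of this fallback behavior, a notification helper might try plyer first and ring the terminal bell otherwise. The `notify` function below is a hypothetical sketch, not the CLI's actual implementation:

```python
import sys


def notify(title: str, message: str) -> str:
    """Send a desktop notification, falling back to the terminal bell.

    Returns the backend used ("plyer" or "bell"), for illustration.
    """
    try:
        from plyer import notification  # optional cross-platform backend
        notification.notify(title=title, message=message, timeout=5)
        return "plyer"
    except Exception:
        # No notification backend available: ring the terminal bell instead.
        sys.stdout.write("\a")
        sys.stdout.flush()
        return "bell"
```

Catching a broad `Exception` here is deliberate: plyer raises `NotImplementedError` on platforms without a backend, and the bell fallback should cover that case too.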

Usage

Getting Help

chatatp --help

Account Information

chatatp account

Models

# List available models
chatatp models

# List models for a specific provider
chatatp ai provider-models PROVIDER_ID

🎯 Interactive Chat System

The ChatATP CLI now supports true agentic conversations with persistent context, just like chatting with ChatGPT or Claude!

New Interactive Commands

Start New Interactive Chat

chatatp chat new "Tell me about machine learning"
  • Creates a new chatroom
  • Automatically enters interactive mode
  • Sends your initial message
  • Starts back-and-forth conversation loop

Enter Existing Chatroom

chatatp chat converse ROOM_ID
  • Enter any existing chatroom for interactive chat
  • Continue conversations where you left off
  • Same interactive experience as new chats

In-Chat Commands

While in interactive mode, you have access to these commands:

  • /exit, /quit, /q - Exit the current chat session
  • /help, /h - Show available commands
  • /clear - Clear the screen and show current chat info
  • /history - Show recent messages from chat history
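
A dispatch table for these commands could look like the following sketch (the mapping and function names are illustrative, not the CLI's internals):

```python
# Map each slash command (and its aliases) to an action name.
COMMANDS = {
    "/exit": "exit", "/quit": "exit", "/q": "exit",
    "/help": "help", "/h": "help",
    "/clear": "clear",
    "/history": "history",
}


def dispatch(line: str) -> str:
    """Return the action for a line of input; anything else is a chat message."""
    token = line.strip().split(" ", 1)[0].lower()
    return COMMANDS.get(token, "message")
```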

Interactive Chat Flow

$ chatatp chat new "Hello, let's discuss AI"

Chatroom created: abc123
Entered chatroom: abc123
Type your message or '/exit' to quit, '/help' for commands

You: Hello, let's discuss AI

ChatATP ·

Hello! I'd be happy to discuss AI with you. What specific aspects of artificial intelligence are you interested in? Whether it's machine learning algorithms, current trends, ethical considerations, or practical applications, I'm here to help!

─────────────────────────────────
You: Tell me about neural networks

ChatATP ·

Neural networks are fascinating! Let me explain them step by step:

A neural network is a computational model inspired by the human brain's neural structure. It consists of interconnected nodes (neurons) organized in layers...

─────────────────────────────────
You: /help

Available commands:
  /exit, /quit, /q  - Exit the chat
  /help, /h         - Show this help
  /clear            - Clear the screen
  /history          - Show chat history

You: /exit
Exiting chat...

Legacy Commands (Still Supported)

One-shot Message Sending

# Send single message (legacy - exits immediately)
chatatp chat send ROOM_ID "Your message here"

# Send with specific model and toolkits
chatatp chat send ROOM_ID "Analyze this data" --model gpt-oss-120b --toolkits TOOLKIT_ID1 TOOLKIT_ID2

# Debug mode to see raw chunks
chatatp chat send ROOM_ID "Debug message" --debug

Chatroom Management

# List your chatrooms
chatatp chat rooms

# Show details of a specific chatroom
chatatp chat show ROOM_ID

Toolkits

# List your toolkits
chatatp toolkits

# Browse featured toolkits
chatatp store featured

# Browse popular toolkits
chatatp store popular

# Browse recommended toolkits
chatatp store recommended

Integrations

# List OAuth integrations
chatatp integrations list

# List custom integrations
chatatp integrations custom

MCP Management

# List MCP connections
chatatp mcp connections

# List MCP servers
chatatp mcp servers

🔧 Local MCP Client

ChatATP CLI now includes a built-in MCP (Model Context Protocol) client that can directly connect to local MCP servers configured in standard config files. This allows you to use MCP tools, resources, and prompts directly from your terminal without going through the ChatATP API.

📁 Supported Configuration Files

The MCP client automatically discovers and loads configurations from:

  • ~/mcp.json - Standard MCP configuration
  • ~/mcp_config.json - Alternative config format
  • ~/.chatatp/mcp.json - ChatATP-specific config (auto-created)
  • ~/Library/Application Support/Claude/claude_desktop_config.json - macOS Claude config
  • ~/AppData/Roaming/Claude/claude_desktop_config.json - Windows Claude config
  • ~/.cursor/mcp.json - Cursor editor config
  • ~/.gemini/antigravity/mcp_server.json - Gemini config

📋 Configuration Format

Create an mcp.json file in your home directory:

{
    "mcpServers": {
        "context7": {
          "command": "npx",
          "args": ["-y", "@upstash/context7-mcp@latest"]
        },
        "tavily-search": {
          "command": "npx",
          "args": ["-y", "tavily-mcp@0.1.2"],
          "env": {
            "TAVILY_API_KEY": "your-api-key-here"
          }
        },
        "my-custom-server": {
          "command": "python",
          "args": ["-m", "my_mcp_server"],
          "env": {
            "CUSTOM_ENV_VAR": "value"
          }
        }
    }
}
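
A quick sanity check for this format might parse the file and verify that each server defines a command. This is only a sketch; the key names match the example above:

```python
import json


def load_servers(text: str) -> dict:
    """Parse an mcp.json document and return the mcpServers mapping."""
    config = json.loads(text)
    servers = config.get("mcpServers", {})
    for name, spec in servers.items():
        if "command" not in spec:
            raise ValueError(f"server {name!r} is missing a 'command'")
    return servers
```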

🚀 Local MCP Commands

List Configured Servers

chatatp mcp local-servers

Connect to a Server

chatatp mcp local-connect context7

Shows server capabilities, version info, and available features.

List Available Tools

chatatp mcp local-tools context7

Call a Tool

# Call a tool with JSON arguments
chatatp mcp local-call-tool context7 search_web --args '{"query": "latest AI news"}'

# Call without arguments
chatatp mcp local-call-tool context7 get_weather --args '{}'

List Resources

chatatp mcp local-resources context7

Read a Resource

chatatp mcp local-read-resource context7 file:///path/to/resource

List Prompts

chatatp mcp local-prompts context7

💡 Example Usage

# First, create a config file
echo '{
        "mcpServers": {
              "context7": {
                "command": "npx",
                "args": ["-y", "@upstash/context7-mcp@latest"]
              }
        }
      }' > ~/mcp.json

# List configured servers
$ chatatp mcp local-servers

Local MCP Servers (1 configured)
┏━━━━━━━━━━┳━━━━━━━━━┳━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Name     ┃ Command ┃ Transport ┃ URL/Args                        ┃
┡━━━━━━━━━━╇━━━━━━━━━╇━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩
│ context7 │ npx     │ STDIO     │ -y @upstash/context7-mcp@latest │
└──────────┴─────────┴───────────┴─────────────────────────────────┘

# Connect and explore
$ chatatp mcp local-connect context7

Connected to context7
─────────────────────
Server: Context7 MCP Server
Version: 1.0.1
Title: Context7
Capabilities:
   tools, resources, prompts, experimental, completions, streaming

# List tools
$ chatatp mcp local-tools context7

Tools on context7
┏━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Name             ┃ Title            ┃ Description                    ┃
┡━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩
│ search_web       │ Web Search       │ Search the web for information │
│ get_page_content │ Get Page Content │ Get content from a web page    │
└──────────────────┴──────────────────┴────────────────────────────────┘

# Call a tool
$ chatatp mcp local-call-tool context7 search_web --args '{"query": "latest AI developments"}'

Tool executed successfully
Content:
# AI Developments 2024

Recent advancements in artificial intelligence include...

[Results continue...]

🔧 Transport Types

The MCP client supports all standard MCP transport types:

  • STDIO: Command-line tools that communicate via stdin/stdout
  • HTTP: Web-based MCP servers
  • SSE: Server-sent events for streaming responses
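
As a hedged sketch, transport selection from a server spec could follow this rule of thumb: entries with a command run over STDIO, URL entries over HTTP or SSE. The field names here are assumptions for illustration:

```python
def pick_transport(spec: dict) -> str:
    """Guess the transport for a server spec: STDIO, SSE, or HTTP."""
    if "command" in spec:
        return "STDIO"  # spawn the process, talk over stdin/stdout
    url = spec.get("url", "")
    if spec.get("transport") == "sse" or url.endswith("/sse"):
        return "SSE"  # server-sent events for streaming responses
    if url:
        return "HTTP"
    raise ValueError("server spec needs a 'command' or 'url'")
```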

🛡️ Security & Isolation

  • No API Keys Required: Direct local connections to your MCP servers
  • Environment Variables: Securely pass credentials via env config
  • Process Isolation: Each MCP server runs in its own process
  • No Data Leakage: All communication stays local unless using remote servers

🔗 Integration with ChatATP

While the local MCP client works independently, you can also use MCP connections through ChatATP's API for cloud-hosted MCP servers. The local client provides direct access for local development and testing.

🐛 Troubleshooting

  • Server not found: Ensure your config file exists and is properly formatted
  • Connection failed: Check that the MCP server command is installed and available
  • Tool errors: Verify tool arguments match the expected schema from local-tools

🤖 Agent Mode

ChatATP CLI can now act as a full agent on your local device, automatically discovering and executing MCP tools when requested by AI. This transforms your CLI into an intelligent assistant that can perform actions on your computer.

🎯 What is Agent Mode?

Agent mode enables the CLI to:

  • Discover Local Tools: Automatically scans configured MCP servers for available tools
  • Execute Device Actions: Runs tools locally when AI requests device-based operations
  • Seamless Conversation: Tool results flow back to AI for continued intelligent responses
  • Secure Local Execution: All tool execution happens on your device, not in the cloud

⚙️ Enabling Agent Mode

# Enable agent mode
chatatp agent-mode --enable

# Disable agent mode
chatatp agent-mode --disable

# Check current status
chatatp agent-mode

🔄 How Agent Mode Works

  1. Tool Discovery: When agent mode is enabled, CLI collects schemas from all configured MCP servers
  2. API Integration: Tool schemas are sent to ChatATP API in device_tools payload
  3. AI Requests: AI can request tool execution with execution_type='device'
  4. Local Execution: CLI executes the tool locally on your device
  5. Result Relay: Execution results are sent back to AI for continued conversation
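
The loop might be sketched as below; the registry, field names, and message shapes are illustrative assumptions, not ChatATP's actual wire format:

```python
from typing import Callable, Dict

# Hypothetical local registry, built from MCP tool discovery (step 1).
LOCAL_TOOLS: Dict[str, Callable[[dict], str]] = {
    "filesystem.list_directory": lambda args: "README.md\nrequirements.txt",
}


def handle_tool_call(call: dict) -> dict:
    """Execute a device tool call locally and package the result (steps 3-5)."""
    if call.get("execution_type") != "device":
        return {"status": "skipped", "reason": "not a device call"}
    tool = LOCAL_TOOLS.get(call.get("name", ""))
    if tool is None:
        return {"status": "error", "reason": f"unknown tool {call.get('name')}"}
    return {"status": "ok", "result": tool(call.get("arguments", {}))}
```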

📋 Example Agent Conversation

# Enable agent mode
$ chatatp agent-mode --enable
Agent mode ENABLED

# Start a conversation where AI can use local tools
$ chatatp chat new "List the files in my current directory and analyze what you see"

Chatroom created: abc123
Entered chatroom: abc123

You: List the files in my current directory and analyze what you see

ChatATP ·

I need to see what files are in your current directory. Let me use the filesystem tool to list them.

✓ Device tool executed: filesystem.list_directory
   done

Based on the files I can see in your directory, you have a Python project with:
- README.md: Project documentation
- requirements.txt: Python dependencies
- chatatp_cli/: Main package directory
- pyproject.toml: Project configuration

This appears to be the ChatATP CLI project itself! The structure suggests...

─────────────────────────────────
You: Can you read the README and tell me what's new?

🛠️ Supported Device Tools

Agent mode works with any MCP-compatible tools, including:

  • Filesystem Tools: Read, write, list, and manage files and directories
  • Web Tools: Browser automation, web scraping, search capabilities
  • Development Tools: Code analysis, documentation lookup, repository management
  • System Tools: Process management, system information, configuration
  • Custom Tools: Any MCP server you configure locally

🔐 Security & Privacy

  • Local Execution Only: Tools run on your device, not sent to external servers
  • No Data Leakage: Conversations and tool results stay local until you share them
  • Configurable Access: MCP servers define their own access controls
  • Process Isolation: Each MCP server runs in its own secure process

⚠️ Important Notes

  • Agent mode is disabled by default for security
  • Requires MCP server configuration (see Local MCP Client section above)
  • Tool execution may take time depending on the operation
  • Results are sent back to AI for continued intelligent responses
  • All execution happens locally on your device

🔧 Configuration Requirements

Before using agent mode, ensure you have:

  1. MCP servers configured in one of the supported config files
  2. Agent mode enabled via chatatp agent-mode --enable
  3. Valid API token for ChatATP authentication

📊 Status Indicators

When agent mode is active, you'll see:

  • [green]✓[/green] for successful tool execution
  • [red]✗[/red] for failed tool execution
  • [dim]Device tool executed: server_name.tool_name[/dim] status messages

🚀 Advanced Usage

# Combine with specific toolkits for enhanced capabilities
chatatp chat new "Analyze my project structure" --toolkits code-analysis-toolkit

# Use with custom MCP servers for specialized tasks
# Configure your MCP server in ~/mcp.json, then:
chatatp chat new "Run my custom analysis tool on this data"

🌐 MCP Proxy Server

The CLI includes a built-in HTTP/HTTPS proxy server that exposes your local MCP servers as REST API endpoints. This allows external clients (like ChatATP Proxy MCP Client) to interact with your local MCP servers via HTTP/HTTPS requests, while the actual tool execution happens securely on your device.

🚀 One-Time Setup (Auto-Start Mode)

Enable automatic proxy startup with your ngrok token once:

chatatp config enable-proxy-auto-start --ngrok-token NGROK_AUTH_TOKEN

That's it! The proxy server will now start automatically every time you run any CLI command, providing continuous public HTTPS access to your MCP servers.

🎮 Manual Control

# Start manually (traditional mode)
chatatp mcp proxy --ngrok --ngrok-token YOUR_TOKEN

# HTTPS mode with self-signed certificate
chatatp mcp proxy --https --ngrok --ngrok-token YOUR_TOKEN

# HTTPS with custom certificates
chatatp mcp proxy --https --cert-file /path/to/cert.pem --key-file /path/to/key.pem --ngrok --ngrok-token YOUR_TOKEN

# Check proxy status and configuration
chatatp config proxy-status

⚙️ Auto-Start Configuration

Enable Auto-Start:

chatatp config enable-proxy-auto-start --ngrok-token YOUR_TOKEN --host 127.0.0.1 --port 8001

Disable Auto-Start:

chatatp config disable-proxy-auto-start

Check Status:

chatatp config proxy-status

🔀 Hybrid Mode (Smart Routing)

Hybrid Mode enables intelligent routing between direct proxy calls and traditional polling, automatically registering your proxy with the ChatATP API for optimal performance.

How It Works

  • Automatic Registration: When the proxy starts with ngrok, it registers with ChatATP API
  • Smart Routing: ChatATP backend can now route tool calls directly through your proxy (~200ms)
  • Graceful Fallback: If proxy is unavailable, automatically falls back to polling mode
  • Zero Configuration: Works automatically once auto-start is enabled

🚀 Performance Benefits

✅ Proxy Available: Agent → POST proxy_url/tools/filesystem → ~200ms ⚡
❌ Proxy Down:      Agent → stream tool_call → CLI executes → ~2-5s 📡

🔧 Registration Process

  • On Startup: POST /api/v1/mcp/device/register-proxy/ with proxy URL and server list
  • On Shutdown: DELETE /api/v1/mcp/device/unregister-proxy/ for cleanup
  • Error Handling: Registration failures don't break the proxy - it still works locally

🎯 Use Cases

  • Continuous Availability: Remote clients get fast direct access when proxy is running
  • Reliable Fallback: System automatically degrades gracefully when proxy is down
  • Mixed Environments: Some clients use proxy, others use polling - both work seamlessly
  • Development: Test hybrid routing without manual proxy management

📡 API Endpoints

Once running, the proxy server provides the following REST endpoints:

Server Management

  • GET / - Server information and available endpoints (includes public URL if using ngrok)
  • GET /servers - List all configured MCP servers
  • GET /servers/{server_name} - Get detailed info about a specific server

Tool Operations

  • GET /servers/{server_name}/tools - List tools available on a server
  • POST /tools/{server_name}/{tool_name} - Call a tool on a server
// POST /tools/tavily-mcp/tavily-search
{
  "arguments": {
    "query": "latest AI news",
    "search_depth": "advanced"
  }
}
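
From the client side, that call could be constructed with this stdlib-only sketch (the base URL is an assumption for a locally running proxy; the request is built but not sent):

```python
import json
import urllib.request


def build_tool_request(base: str, server: str, tool: str, arguments: dict):
    """Construct a POST /tools/{server_name}/{tool_name} request."""
    body = json.dumps({"arguments": arguments}).encode("utf-8")
    return urllib.request.Request(
        f"{base}/tools/{server}/{tool}",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Against a running proxy, `urllib.request.urlopen(req)` would then send it.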

Resource Operations

  • GET /servers/{server_name}/resources - List resources on a server
  • GET /resources/{server_name}/{uri} - Read a specific resource

Prompt Operations

  • GET /servers/{server_name}/prompts - List prompts on a server

🔒 Security & HTTPS

Self-Signed Certificates:

  • When using --https without custom certificates, the server automatically generates a self-signed SSL certificate
  • The certificate is valid for localhost and 127.0.0.1
  • Browsers will show security warnings for self-signed certificates

Custom Certificates:

  • Use --cert-file and --key-file to provide your own SSL certificates
  • Required for production deployments

Remote Access:

  • Use --ngrok to create a secure public tunnel
  • Requires ngrok authentication token (--ngrok-token)
  • Provides HTTPS URL automatically
  • Perfect for sharing with external services

🎯 Use Cases

  • Remote API Access: Allow ChatATP cloud services to execute tools on your local machine
  • Secure Tool Sharing: Share filesystem tools with remote clients over HTTPS
  • Development Testing: Test MCP integrations without direct protocol knowledge
  • Production Deployments: Run proxy server with custom SSL certificates
  • Continuous Availability: Auto-start mode keeps your MCP servers always accessible

⚙️ Configuration & Setup

ngrok Setup

  1. Install ngrok: pip install pyngrok
  2. Sign up at ngrok.com and get your auth token
  3. Use the token with --ngrok-token YOUR_TOKEN

Custom SSL Certificates

# Generate certificates (example)
openssl req -x509 -newkey rsa:4096 -keyout key.pem -out cert.pem -days 365 -nodes

# Use with proxy server
chatatp mcp proxy --https --cert-file cert.pem --key-file key.pem

MCP Server Configuration

The proxy server respects the same MCP server configurations as the CLI. Make sure your mcp.json or other config files are properly set up:

{
    "mcpServers": {
        "tavily-mcp": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-tavily"],
          "env": {
            "TAVILY_API_KEY": "your-api-key"
          }
        },
        "filesystem-mcp": {
          "command": "npx", 
          "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/directory"],
          "env": {}
        }
    }
}

🌍 Remote Access Example

# One-time setup for continuous access
chatatp config enable-proxy-auto-start --ngrok-token YOUR_NGROK_TOKEN

# Now every CLI command auto-starts the proxy
chatatp models
# → MCP proxy server starts automatically in background
# → Public URL: https://your-unique-url.ngrok-free.dev

# Remote clients can now continuously access your MCP servers:
# https://your-unique-url.ngrok-free.dev/tools/context7/search_web
# https://your-unique-url.ngrok-free.dev/tools/filesystem/read_text_file
# https://your-unique-url.ngrok-free.dev/tools/tavily-mcp/tavily-search

The proxy server will automatically discover and expose all configured MCP servers, providing secure remote access to your local MCP ecosystem! Remote MCP clients can now access your local tools (context7 documentation, tavily web search, firecrawl scraping, filesystem operations) via public HTTPS endpoints with zero manual intervention.

AI Management

# List AI providers
chatatp ai providers

# List AI configurations
chatatp ai configs

# Show AI settings
chatatp ai settings

Media

# List your media files
chatatp media

# Search media
chatatp media --search "document name"

# Filter by type
chatatp media --type image

# Pagination
chatatp media --page 2 --page-size 20

Pricing

# View pricing plans
chatatp pricing

Authentication

All commands require authentication. Make sure you've set your API token using:

chatatp config set-token YOUR_TOKEN

The token will be stored securely in your home directory under ~/.chatatp/config.yaml.
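
As an illustrative sketch only: a minimal reader for a flat key: value config file. The real CLI uses pyyaml for this, and `api_token` is an assumed key name:

```python
from pathlib import Path
from typing import Optional


def read_token(path: Path) -> Optional[str]:
    """Return the stored token from a flat 'key: value' YAML-style file."""
    if not path.is_file():
        return None
    for line in path.read_text().splitlines():
        if line.strip().startswith("api_token:"):
            return line.split(":", 1)[1].strip()
    return None
```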

Advanced Features

Real-time Streaming with Tool Visualization

The CLI now provides rich visualization of:

  • Tool Calls: See when tools are being executed with timing
  • Thinking Blocks: Visual indicators when AI is reasoning
  • Progress Indicators: Spinners and status updates during processing
  • Error Handling: Graceful handling of network issues and API errors

Example with Tool Calls

You: Search for information about OpenClaw AI

ChatATP ·

I need to search for information about OpenClaw AI. Let me use the search tool.

  ┌ search_web · web_search_toolkit
  └ done 0.34s

Based on my search, OpenClaw is...

Interactive vs One-shot Mode Comparison

| Feature            | Interactive Mode (chat new/converse) | One-shot Mode (chat send)    |
|--------------------|--------------------------------------|------------------------------|
| Persistence        | ✅ Back-and-forth conversation       | ❌ Single message only       |
| Context            | ✅ Full conversation history         | ❌ No context retention      |
| Commands           | ✅ Rich in-chat commands             | ❌ None                      |
| Tool Visualization | ✅ Real-time with progress           | ✅ Basic streaming           |
| Exit Behavior      | Manual exit with /exit               | Auto-exit after response     |
| Use Case           | Deep conversations, exploration      | Quick questions, automation  |

Error Handling

The CLI provides clear error messages for common issues:

  • Missing API token
  • Invalid room IDs
  • Network errors
  • Authentication failures
  • Streaming interruptions

Dependencies

  • requests: HTTP client
  • click: Command line interface
  • rich: Beautiful terminal output with progress indicators
  • pyyaml: Configuration file handling
  • python-dotenv: Environment variable support

Migration Guide

From v1.x to v2.0

Old workflow:

# Create room
chatatp chat new "Hello"
# Copy room ID
# Send messages one by one
chatatp chat send ROOM_ID "Follow up question"
chatatp chat send ROOM_ID "Another question"

New workflow:

# Single command starts interactive session
chatatp chat new "Hello, let's have a conversation"
# Now chat back and forth naturally!
# Type /exit when done

Legacy commands still work: All v1.x commands (chat send, etc.) remain fully functional for scripts and automation.

Contributing

Feel free to submit issues and pull requests to improve the ChatATP CLI.

License

This project is licensed under the MIT License.
