# cmdop

Python SDK for CMDOP agent interaction.

**Any machine. One API.**
```python
from cmdop import CMDOPClient

with CMDOPClient.remote(api_key="cmd_xxx") as server:
    server.terminal.execute("docker restart app")
    server.files.write("/etc/nginx/nginx.conf", new_config)
    logs = server.files.read("/var/log/app.log")
```
No SSH. No VPN. No open ports.
## How

```
Your Code ──── Cloud Relay ──── Agent (on server)
                                      │
              Outbound only, works through any NAT/firewall
```

The agent connects out. Your code connects to the relay. Done.
## Install

```bash
pip install cmdop
```

```python
from cmdop import CMDOPClient, AsyncCMDOPClient

# Remote (via cloud relay)
with CMDOPClient.remote(api_key="cmd_xxx") as client:
    client.files.list("/home")

# Local (direct IPC)
with CMDOPClient.local() as client:
    client.terminal.execute("ls -la")

# Async
async with AsyncCMDOPClient.remote(api_key="cmd_xxx") as client:
    await client.files.read("/etc/hostname")
```
## Part 1: Remote Control

### Terminal

```python
session = server.terminal.create()
server.terminal.send_input(session.session_id, "kubectl get pods\n")
output = server.terminal.get_history(session.session_id)
```
| Method | Description |
|---|---|
| `create(shell)` | Start session |
| `send_input(id, data)` | Send commands |
| `get_history(id)` | Get output |
| `resize(id, cols, rows)` | Resize terminal |
| `send_signal(id, signal)` | Send SIGINT/SIGTERM |
| `close(id)` | End session |
### Files

```python
server.files.list("/var/log")
server.files.read("/etc/nginx/nginx.conf")
server.files.write("/tmp/config.json", b'{"key": "value"}')
```
| Method | Description |
|---|---|
| `list(path)` | List directory |
| `read(path)` | Read file |
| `write(path, content)` | Write file |
| `delete(path)` | Delete |
| `copy(src, dst)` | Copy |
| `move(src, dst)` | Move |
| `mkdir(path)` | Create directory |
| `info(path)` | Metadata |
### Agent

```python
from pydantic import BaseModel

class Health(BaseModel):
    status: str
    cpu: float
    issues: list[str]

result = server.agent.run("Check server health", output_schema=Health)
health: Health = result.output  # Typed!
```
| Method | Description |
|---|---|
| `run(prompt, output_schema)` | Run agent, get typed result |

Agent types: `chat`, `terminal`, `command`, `router`, `planner`
## Real World Examples

Deploy with a typed result:

```python
class DeployResult(BaseModel):
    success: bool
    version: str
    errors: list[str]

result = server.agent.run(
    "Deploy myapp:v2.1, verify containers healthy",
    output_schema=DeployResult
)
if not result.output.success:
    rollback(result.output.errors)
```
Fleet update (1000 devices):

```python
import asyncio

async def update_fleet(keys: list[str], config: bytes):
    # Note: asyncio.TaskGroup requires Python 3.11+
    async with asyncio.TaskGroup() as tg:
        for key in keys:
            tg.create_task(update_one(key, config))

async def update_one(key: str, config: bytes):
    async with AsyncCMDOPClient.remote(api_key=key) as dev:
        await dev.files.write("/etc/app/config.yml", config)
        await dev.terminal.execute("systemctl restart app")
```
Debug a customer machine:

```python
with CMDOPClient.remote(api_key=customer_key) as m:
    sid = m.terminal.create().session_id
    m.terminal.send_input(sid, "ps aux\n")
    logs = m.files.read("~/Library/Logs/MyApp/error.log")
    m.terminal.send_input(sid, "df -h\n")
```
## Part 2: Web Parsing

### Browser

DOM extraction:

```python
with server.browser.create_session() as b:
    b.navigate("https://shop.com/products")
    products = b.extract_data(
        ".product-card",
        '{"name": "h2", "price": ".price", "url": {"selector": "a", "attr": "href"}}',
        limit=100
    )["items"]
    # → [{"name": "iPhone", "price": "$999", "url": "/p/123"}, ...]
```
JS fetch injection (bypass CORS):

```python
with client.browser.create_session() as b:
    b.navigate("https://site.com")  # Get cookies/session

    # Single API call
    data = b.fetch_json("https://api.site.com/v1/items")

    # Parallel fetch with headers and credentials
    results = b.fetch_all(
        urls={
            "users": "https://api.site.com/v1/users",
            "orders": "https://api.site.com/v1/orders",
        },
        headers={"Accept": "application/json"},
        credentials=True,  # Include cookies
    )
    # → {"users": {"data": [...], "error": None}, "orders": {"data": [...], "error": None}}
```
Custom JS execution:

```python
with client.browser.create_session() as b:
    b.navigate("https://site.com")

    # Execute async JS with auto-wrap and JSON parsing
    result = b.execute_js("""
        const resp = await fetch('/api/data');
        return await resp.json();
    """)
    # → {"items": [...]}

    # Raw mode (returns JSON string)
    raw = b.execute_js("return document.title", raw=True)
```
| Method | Description |
|---|---|
| `create_session(headless)` | Start browser |
| `navigate(url)` | Go to URL |
| `click(selector)` | Click element |
| `type(selector, text)` | Type text |
| `wait_for(selector, timeout_ms)` | Wait for element |
| `extract(selector, attr)` | Get text/attribute |
| `extract_regex(pattern)` | Regex matches |
| `validate_selectors(item, fields)` | Check selectors |
| `extract_data(item, fields, limit)` | Bulk extract → list[dict] |
| `fetch_json(url)` | JS fetch → dict |
| `fetch_all(urls, headers, credentials)` | Parallel fetch → {id: {data, error}} |
| `execute_js(code, raw)` | Async JS with auto-wrap → dict |
| `execute_script(js)` | Raw JS → str |
| `screenshot()` | PNG screenshot |
| `get_cookies()` / `set_cookies()` | Get/set cookies |
### SDKBaseModel

Auto-cleaning Pydantic model for scraped data. No more manual `.strip()`, regex, or URL joining.

```python
from cmdop import SDKBaseModel

class Product(SDKBaseModel):
    __base_url__ = "https://shop.com"
    name: str = ""     # " iPhone 15 \n"  → "iPhone 15"
    price: int = 0     # "$1,299.00"      → 1299
    rating: float = 0  # "4.5 stars"      → 4.5
    url: str = ""      # "/p/123"         → "https://shop.com/p/123"

# Batch parse with auto dedupe + filter
products = Product.from_list(raw["items"])
```
| Type | Input | Output |
|---|---|---|
| `str` | `" text \n\t "` | `"text"` |
| `int` | `"$27,471"` | `27471` |
| `float` | `"4.5 out of 5"` | `4.5` |
| `str` (URL field) | `"/path"` | `"https://base.com/path"` |
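The coercion rules above can be sketched in plain Python with the standard library (a simplified illustration of the behavior, not the SDK's actual implementation):

```python
import re
from urllib.parse import urljoin

def clean_str(value: str) -> str:
    # Collapse surrounding whitespace: " text \n\t " → "text"
    return value.strip()

def clean_int(value: str) -> int:
    # Integer part of the first number, ignoring thousands separators:
    # "$1,299.00" → 1299, "$27,471" → 27471
    m = re.search(r"\d[\d,]*", value)
    return int(m.group().replace(",", "")) if m else 0

def clean_float(value: str) -> float:
    # First decimal number in the string: "4.5 out of 5" → 4.5
    m = re.search(r"\d+(?:\.\d+)?", value)
    return float(m.group()) if m else 0.0

def clean_url(value: str, base_url: str) -> str:
    # Resolve relative paths against the base: "/path" → "https://base.com/path"
    return urljoin(base_url, value)
```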
## Parsing Examples

Scrape with validation:

```python
with client.browser.create_session() as b:
    b.navigate("https://cars.com/listings")

    # 1. Validate (fail fast if the site changed)
    v = b.validate_selectors(".item", {"title": "h2", "price": ".price"})
    if not v["valid"]:
        raise Exception(v["errors"])

    # 2. Extract
    cars = b.extract_data(".item", '{"title": "h2", "price": ".price"}', limit=200)["items"]
```
Scrape with SDKBaseModel:

```python
class Product(SDKBaseModel):
    __base_url__ = "https://amazon.com"
    title: str = ""
    price: int = 0  # "$1,299" → 1299
    url: str = ""   # "/dp/..." → "https://amazon.com/dp/..."

with client.browser.create_session(headless=True) as b:
    b.navigate("https://amazon.com/s?k=laptop")
    raw = b.extract_data(
        ".s-result-item",
        '{"title": "h2", "price": ".a-price-whole", "url": {"selector": "a", "attr": "href"}}',
        limit=50,
    )
    products = Product.from_list(raw["items"])  # clean + dedupe + filter
```
Parallel API fetching:

```python
with client.browser.create_session() as b:
    b.navigate("https://api.example.com")

    # Fetch 10 pages in parallel
    urls = {f"page_{i}": f"https://api.example.com/items?page={i}" for i in range(10)}
    results = b.fetch_all(urls)

    items = []
    for key, res in results.items():
        if res["data"]:  # {"data": {...}, "error": None}
            items.extend(res["data"].get("items", []))
        elif res["error"]:
            print(f"{key} failed: {res['error']}")
```
JS fetch for protected APIs:

```python
with client.browser.create_session() as b:
    b.navigate("https://site.com")  # Get session cookies

    # Fetch JSON API (inherits cookies)
    data = b.fetch_json("https://api.site.com/v1/data")

    # Parallel fetch with custom headers
    results = b.fetch_all(
        urls={
            "inspection": f"https://api.site.com/v1/car/{car_id}/inspection",
            "options": f"https://api.site.com/v1/car/{car_id}/options",
        },
        headers={"Accept": "application/json", "Origin": "https://site.com"},
        credentials=True,
    )
    inspection = results["inspection"]["data"]
    options = results["options"]["data"]
```
## Part 3: Utilities

### Logging

Rich-powered logger with automatic project-root detection and file persistence.

```python
from cmdop import get_logger

log = get_logger(__name__)
log.info("Starting process")
log.debug("Details: %s", data)
log.warning("Something seems off")
log.error("Failed!", exc_info=True)  # With traceback
```
Features:
- Rich console output with colors and timestamps
- Auto-saves to a `logs/` folder in the project root
- Finds the project root by `pyproject.toml`, `requirements.txt`, or `.git`
- Daily log rotation by filename
Custom settings:

```python
from cmdop import get_logger, setup_logging

# Set log level
log = get_logger(__name__, level="DEBUG")

# Custom app name for log files
log = get_logger(__name__, app_name="myparser")
# → logs/myparser_2024-01-17.log

# Disable file logging
log = get_logger(__name__, log_to_file=False)
```

Log file format:

```
2024-01-17 14:30:25 | INFO | myapp.parser:42 - Starting process
2024-01-17 14:30:26 | ERROR | myapp.parser:58 - Failed to fetch
```
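The file format above matches what a standard `logging.Formatter` produces with the following pattern (a sketch of an equivalent configuration, not necessarily the SDK's internals):

```python
import logging

# "timestamp | LEVEL | module:line - message", as in the log file format above
formatter = logging.Formatter(
    fmt="%(asctime)s | %(levelname)s | %(name)s:%(lineno)d - %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S",
)

handler = logging.StreamHandler()
handler.setFormatter(formatter)

log = logging.getLogger("myapp.parser")
log.addHandler(handler)
log.setLevel(logging.INFO)
log.info("Starting process")
```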
| Function | Description |
|---|---|
| `get_logger(name, level, app_name)` | Get configured logger |
| `setup_logging(level, log_to_file)` | Configure root logger |
| `find_project_root()` | Find Python project root |
| `get_log_dir(app_name)` | Get/create logs directory |
### TOON Format (Token Optimization)

Convert JSON to TOON format, saving 30-50% of tokens for LLM consumption.

```python
from cmdop import json_to_toon, JsonCleaner

# Simple conversion
data = {"name": "Alice", "age": 25, "city": "Seoul"}
toon = json_to_toon(data)
# → "name: Alice\nage: 25\ncity: Seoul"

# With cleaning (removes nulls, empty values, noise keys)
cleaner = JsonCleaner(noise_keys={"internal_id", "metadata"})
clean_toon = cleaner.to_toon({
    "name": "BMW X5",
    "price": 50000,
    "internal_id": "abc123",  # removed
    "specs": None,            # removed
    "options": [],            # removed
})
# → "name: BMW X5\nprice: 50000"
```
`JsonCleaner` features:
- Removes `None` values and empty `[]`, `{}`, `""`
- Simplifies `{"code": "1", "title": "Good"}` → `"Good"`
- Custom noise keys per domain
| Function | Description |
|---|---|
| `json_to_toon(data)` | Convert dict/list to TOON string |
| `JsonCleaner(noise_keys)` | Cleaner with custom noise keys |
| `cleaner.compact(data)` | Clean without converting |
| `cleaner.to_toon(data)` | Clean + convert to TOON |
## Security

- TLS everywhere
- Outbound only, no open ports
- API key scoping
- Audit logs

## Requirements

- Python 3.10+
- CMDOP agent on the target machine

## License

MIT
## File details

### Source distribution: cmdop-0.1.16.tar.gz

- Size: 150.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.18

| Algorithm | Hash digest |
|---|---|
| SHA256 | `f210697048b1434795a33acebceb44e835e7c0fdf0f87150fef0f4ac6bcf3d5c` |
| MD5 | `757603e2fdcfa92e1fe9d4805f6d58c9` |
| BLAKE2b-256 | `eb8745527be082b1ffeeb87a21eb267863e20e6ed10ade43932d8df469414901` |
### Built distribution: cmdop-0.1.16-py3-none-any.whl

- Size: 258.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.18

| Algorithm | Hash digest |
|---|---|
| SHA256 | `078093ed5e9c8ec2d5a2b8fed0f2613d8c9fbce3c85639782d4658eb0a41f62d` |
| MD5 | `5b69bfeb8bc81afc90f21847a7fcfcc9` |
| BLAKE2b-256 | `6d9127cef29ef79469e702358b20d61513175dccd724e6984d71dd5f6440f540` |