
Python SDK for CMDOP agent interaction


cmdop

Any machine. One API.

from cmdop import CMDOPClient

with CMDOPClient.remote(api_key="cmd_xxx") as server:
    server.terminal.execute("docker restart app")
    server.files.write("/etc/nginx/nginx.conf", new_config)
    logs = server.files.read("/var/log/app.log")

No SSH. No VPN. No open ports.


How

Your Code ──── Cloud Relay ──── Agent (on server)
                    │
        Outbound only, works through any NAT/firewall

The agent connects out. Your code connects to the relay. Done.


Install

pip install cmdop

from cmdop import CMDOPClient, AsyncCMDOPClient

# Remote (via cloud relay)
with CMDOPClient.remote(api_key="cmd_xxx") as client:
    client.files.list("/home")

# Local (direct IPC)
with CMDOPClient.local() as client:
    client.terminal.execute("ls -la")

# Async
async with AsyncCMDOPClient.remote(api_key="cmd_xxx") as client:
    await client.files.read("/etc/hostname")
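
The async client pairs naturally with asyncio.gather for concurrent calls. A minimal sketch with a stand-in client (FakeFiles is hypothetical; the real AsyncCMDOPClient's files resource would take its place):

```python
import asyncio

class FakeFiles:
    """Stand-in for client.files so the pattern runs without an agent."""
    async def read(self, path: str) -> str:
        await asyncio.sleep(0)  # simulate the network round-trip
        return f"contents of {path}"

async def main() -> list[str]:
    files = FakeFiles()
    # Fan out several reads concurrently instead of awaiting them one by one.
    return await asyncio.gather(
        files.read("/etc/hostname"),
        files.read("/etc/os-release"),
    )

results = asyncio.run(main())
```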

Terminal

session = server.terminal.create()
server.terminal.send_input(session.session_id, "kubectl get pods\n")
output = server.terminal.get_history(session.session_id)
Method                   Description
-----------------------  -------------------
create(shell)            Start session
send_input(id, data)     Send commands
get_history(id)          Get output
resize(id, cols, rows)   Resize terminal
send_signal(id, signal)  SIGINT/SIGTERM
close(id)                End session

Files

server.files.list("/var/log")
server.files.read("/etc/nginx/nginx.conf")
server.files.write("/tmp/config.json", b'{"key": "value"}')
Method                Description
--------------------  ----------------
list(path)            List directory
read(path)            Read file
write(path, content)  Write file
delete(path)          Delete file
copy/move(src, dst)   Copy or move
mkdir(path)           Create directory
info(path)            File metadata

Agent

from pydantic import BaseModel

class Health(BaseModel):
    status: str
    cpu: float
    issues: list[str]

result = server.agent.run("Check server health", output_schema=Health)
health: Health = result.output  # Typed!
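
The typed-result idea can be sketched without the SDK using a dataclass (the real agent uses your Pydantic schema; this stand-in only illustrates validating a raw dict into a typed object):

```python
from dataclasses import dataclass, fields

@dataclass
class Health:
    status: str
    cpu: float
    issues: list

def validate(raw: dict) -> Health:
    """Check required keys before constructing, mimicking schema enforcement."""
    missing = [f.name for f in fields(Health) if f.name not in raw]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return Health(
        status=str(raw["status"]),
        cpu=float(raw["cpu"]),
        issues=list(raw["issues"]),
    )

health = validate({"status": "ok", "cpu": 12.5, "issues": []})
```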

Browser

with client.browser.create_session() as b:
    b.navigate("https://shop.com/products")

    # DOM extraction
    products = b.extract_data(".product-card", '{"name": "h2", "price": ".price"}', limit=100)

    # Get HTML for BeautifulSoup parsing
    html = b.get_html("[role='feed']")
    soup = b.parse_html(html)  # Returns BeautifulSoup object

    # Scroll & extract pattern
    for _ in range(10):
        html = b.get_html(".listings")
        # ... parse with soup ...
        b.scroll("down", 800)
        b.wait_seconds(1.0)

    # JS fetch (bypass CORS, inherit cookies)
    data = b.fetch_json("https://api.site.com/v1/items")

    # Parallel fetch
    results = b.fetch_all({
        "users": "https://api.site.com/v1/users",
        "orders": "https://api.site.com/v1/orders",
    }, credentials=True)
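
The scroll-and-extract loop above runs a fixed number of rounds; a common refinement stops once a pass yields no new items. A sketch with a stand-in page object (FakePage is hypothetical; a real session's get_html/scroll calls would take its place):

```python
class FakePage:
    """Stand-in page that grows its item list once, then stabilizes."""
    def __init__(self):
        self._batches = [["a", "b"], ["a", "b", "c"], ["a", "b", "c"]]
        self._round = 0

    def get_items(self) -> list:
        batch = self._batches[min(self._round, len(self._batches) - 1)]
        self._round += 1
        return batch

    def scroll(self) -> None:
        pass  # a real session would scroll the viewport here

def collect_until_stable(page, max_rounds: int = 10) -> list:
    # Stop early when a scroll pass adds no new items.
    seen: list = []
    for _ in range(max_rounds):
        items = page.get_items()
        if len(items) == len(seen):
            break
        seen = items
        page.scroll()
    return seen

items = collect_until_stable(FakePage())
```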
Method                             Description
---------------------------------  ------------------------
navigate(url)                      Go to URL
click(selector)                    Click element
type(selector, text)               Type text
wait_for(selector, ms)             Wait for element
wait_seconds(n)                    Sleep
extract(selector, attr)            Get text/attribute
get_html(selector)                 Get HTML
get_text(selector)                 Get text
parse_html(html)                   Parse into BeautifulSoup
extract_data(item, fields, limit)  Bulk extract
fetch_json(url)                    JS fetch → dict
fetch_all(urls, credentials)       Parallel fetch
execute_js(code)                   Run async JS
screenshot()                       PNG bytes
scroll(dir, amount)                Scroll page
scroll_to(selector)                Scroll to element
get_scroll_info()                  Position + page size
infinite_scroll(fn, limit)         Smart scroll loop
hover(selector)                    Hover
select(selector, value)            Dropdown select
close_modal()                      Close dialogs
get_cookies() / set_cookies()      Cookie management

SDKBaseModel

Auto-cleaning Pydantic model for scraped data:

from cmdop import SDKBaseModel

class Product(SDKBaseModel):
    __base_url__ = "https://shop.com"
    name: str = ""    # "  iPhone 15  \n" → "iPhone 15"
    price: int = 0    # "$1,299.00" → 1299
    rating: float = 0 # "4.5 stars" → 4.5
    url: str = ""     # "/p/123" → "https://shop.com/p/123"

products = Product.from_list(raw["items"])  # Auto dedupe + filter
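
The field coercions shown in the comments can be approximated with plain Python; this is an illustrative sketch, not the SDK's implementation:

```python
import re
from urllib.parse import urljoin

def clean_text(raw: str) -> str:
    # Collapse runs of whitespace and trim: "  iPhone 15  \n" -> "iPhone 15"
    return re.sub(r"\s+", " ", raw).strip()

def parse_price(raw: str) -> int:
    # Keep the digits before the decimal point: "$1,299.00" -> 1299
    return int(re.sub(r"[^\d]", "", raw.split(".")[0]))

def parse_rating(raw: str) -> float:
    # First numeric token in the string: "4.5 stars" -> 4.5
    match = re.search(r"\d+(?:\.\d+)?", raw)
    return float(match.group()) if match else 0.0

def absolutize(base: str, path: str) -> str:
    # Resolve relative URLs against a base, like __base_url__ does.
    return urljoin(base, path)
```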

Utilities

Logging:

from cmdop import get_logger
log = get_logger(__name__)
log.info("Starting")  # Rich console + auto file logging
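
A comparable setup with the stdlib, for reference (illustrative only; the SDK's get_logger presumably layers Rich console formatting and file output on top of something like this):

```python
import logging

def make_logger(name: str) -> logging.Logger:
    # Hypothetical stand-in for cmdop.get_logger using only the stdlib.
    logger = logging.getLogger(name)
    if not logger.handlers:  # avoid stacking duplicate handlers on repeat calls
        handler = logging.StreamHandler()
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
        )
        logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger

log = make_logger("demo")
log.info("Starting")
```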

TOON Format (30-50% token savings):

from cmdop import json_to_toon, JsonCleaner
toon = json_to_toon({"name": "Alice", "age": 25})
# → "name: Alice\nage: 25"
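
For flat dictionaries the conversion is essentially one key-value line per entry; a minimal sketch (not the library's implementation, which presumably also handles nesting and arrays):

```python
def to_toon_flat(data: dict) -> str:
    # Emit "key: value" lines, dropping JSON's braces and quotes.
    return "\n".join(f"{key}: {value}" for key, value in data.items())

toon = to_toon_flat({"name": "Alice", "age": 25})
```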

Requirements

  • Python 3.10+
  • CMDOP agent on target

Links

cmdop.com

License

MIT
