
Python SDK for CMDOP agent interaction


cmdop

Any machine. One API.

from cmdop import CMDOPClient

with CMDOPClient.remote(api_key="cmd_xxx") as server:
    server.terminal.execute("docker restart app")
    server.files.write("/etc/nginx/nginx.conf", new_config)
    logs = server.files.read("/var/log/app.log")

No SSH. No VPN. No open ports.


How

Your Code ──── Cloud Relay ──── Agent (on server)
                    │
        Outbound only, works through any NAT/firewall

Agent connects out. Your code connects to relay. Done.


Install

pip install cmdop

from cmdop import CMDOPClient, AsyncCMDOPClient

# Remote (via cloud relay)
with CMDOPClient.remote(api_key="cmd_xxx") as client:
    client.files.list("/home")

# Local (direct IPC)
with CMDOPClient.local() as client:
    client.terminal.execute("ls -la")

# Async
async with AsyncCMDOPClient.remote(api_key="cmd_xxx") as client:
    await client.files.read("/etc/hostname")

Terminal

session = server.terminal.create()
server.terminal.send_input(session.session_id, "kubectl get pods\n")
output = server.terminal.get_history(session.session_id)
Method                   Description
create(shell)            Start a session
send_input(id, data)     Send commands
get_history(id)          Get output
resize(id, cols, rows)   Resize the terminal
send_signal(id, signal)  Send SIGINT/SIGTERM
close(id)                End the session

Files

server.files.list("/var/log")
server.files.read("/etc/nginx/nginx.conf")
server.files.write("/tmp/config.json", b'{"key": "value"}')
Method                Description
list(path)            List a directory
read(path)            Read a file
write(path, content)  Write a file
delete(path)          Delete a file
copy/move(src, dst)   Copy or move
mkdir(path)           Create a directory
info(path)            File metadata

Agent

from pydantic import BaseModel

class Health(BaseModel):
    status: str
    cpu: float
    issues: list[str]

result = server.agent.run("Check server health", output_schema=Health)
health: Health = result.output  # Typed!

Browser

Capability-based API for browser automation.

from cmdop.services.browser.models import WaitUntil

with client.browser.create_session() as s:
    s.navigate("https://shop.com/products", wait_until=WaitUntil.NETWORKIDLE)
    s.dom.close_modal()  # Close popups

    # BeautifulSoup parsing
    soup = s.dom.soup()  # SoupWrapper with chainable API
    for item in soup.select(".product"):
        title = item.select_one("h2").text()
        price = item.attr("data-price")

    # Scrolling with random delays
    for _ in range(10):
        soup = s.dom.soup(".listings")
        s.scroll.js("down", 700)
        s.timing.random(0.8, 1.5)

    # Click with cursor movement
    s.click("button.buy", move_cursor=True)

    # Click all "See more" buttons
    s.input.click_all("See more")

    # Mouse operations
    s.input.mouse_move(500, 300)
    s.input.hover(".tooltip-trigger")

    # JS fetch (bypass CORS, inherit cookies)
    data = s.fetch.json("/api/items")

Core Methods (on session)

Method                        Description
navigate(url, wait_until)     Go to URL (wait_until: LOAD, DOMCONTENTLOADED, NETWORKIDLE, COMMIT)
click(selector, move_cursor)  Click an element
type(selector, text)          Type text
wait_for(selector)            Wait for an element
execute_script(js)            Run JavaScript
screenshot()                  PNG bytes
get_state()                   URL + title
get_page_info()               Full page info
get/set_cookies()             Cookie management

Capabilities

session.scroll - Scrolling

Method                Description
js(dir, amount)       JS scroll (works on complex sites)
native(dir, amount)   Browser API scroll
to_bottom()           Scroll to page bottom
to_element(selector)  Scroll element into view
info()                Get scroll position
infinite(extract_fn)  Smart infinite scroll with extraction
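infinite(extract_fn) suggests a scroll-extract-deduplicate loop that stops once a scroll yields nothing new. A pure-Python sketch of that pattern, assuming that behavior (collect_until_exhausted and batches are illustrative stand-ins, not SDK names; in the real call, each batch would come from scrolling and re-running extract_fn):

```python
def collect_until_exhausted(batches, max_rounds=50):
    # Accumulate items batch by batch; stop when a batch adds
    # nothing new (end of feed) or the batch source runs dry.
    seen, items = set(), []
    for _ in range(max_rounds):
        batch = next(batches, None)
        if batch is None:
            break
        new = [x for x in batch if x not in seen]
        if not new:  # nothing new appeared: assume end of feed
            break
        seen.update(new)
        items.extend(new)
    return items

pages = iter([["a", "b"], ["b", "c"], ["c"]])
print(collect_until_exhausted(pages))  # → ['a', 'b', 'c']
```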

session.input - Input operations

Method                 Description
click_js(selector)     JS click (reliable)
click_all(text, role)  Click all matching elements
key(key, selector)     Press a keyboard key
hover(selector)        Hover over an element (native)
hover_js(selector)     Hover via JS
mouse_move(x, y)       Move cursor to coordinates

session.timing - Delays

Method                     Description
wait(ms)                   Wait milliseconds
seconds(n)                 Wait seconds
random(min, max)           Random delay between bounds
timeout(fn, sec, cleanup)  Run fn with a timeout
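timeout(fn, sec, cleanup) presumably runs a callable against a deadline and triggers cleanup when it is missed. A minimal thread-based sketch of that contract (run_with_timeout is an illustrative name, not the SDK's implementation):

```python
import threading
import time

def run_with_timeout(fn, seconds, cleanup=None):
    # Run fn in a daemon thread; if it misses the deadline, call
    # cleanup and return None. The worker is not forcibly killed.
    result = {}
    worker = threading.Thread(
        target=lambda: result.setdefault("value", fn()), daemon=True
    )
    worker.start()
    worker.join(seconds)
    if worker.is_alive():
        if cleanup is not None:
            cleanup()
        return None
    return result.get("value")

print(run_with_timeout(lambda: 2 + 2, 1.0))           # → 4
print(run_with_timeout(lambda: time.sleep(5), 0.05))  # → None
```

Returning None on timeout keeps the sketch short; a real implementation would more likely raise, since None is also a valid return value.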

session.dom - DOM operations

Method                   Description
html(selector)           Get HTML
text(selector)           Get text content
soup(selector)           Parse into a SoupWrapper
parse(html)              Parse into BeautifulSoup
extract(selector, attr)  Get a list of text/attribute values
select(selector, value)  Select a dropdown option
close_modal()            Close dialogs

session.fetch - HTTP from browser context

Method            Description
json(url)         Fetch JSON
all(requests)     Parallel fetch
execute(js_code)  Run custom JS fetch code

session.network - Network capture (v2.19.0)

Method                                      Description
enable(max_exchanges)                       Start capturing HTTP traffic
disable()                                   Stop capturing
get_all()                                   Get all captured exchanges
filter(url_pattern, methods, status_codes)  Filter exchanges
last(url_pattern)                           Get the most recent matching exchange
api_calls(url_pattern)                      Get XHR/fetch calls matching a pattern
last_json(url_pattern)                      Get the JSON body from the last matching response
wait_for(url_pattern, timeout_ms)           Wait for a matching request
stats()                                     Capture statistics
export_har()                                Export to HAR format
clear()                                     Clear captured data

# Example: Intercept API responses
from cmdop.services.browser.models import WaitUntil

with client.browser.create_session() as s:
    s.network.enable()
    s.navigate("https://app.example.com", wait_until=WaitUntil.NETWORKIDLE)

    # Get last API response
    api = s.network.last("/api/data")
    data = api.json_body()

    # Filter by criteria
    posts = s.network.filter(
        url_pattern="/api/posts",
        methods=["GET"],
        status_codes=[200],
    )

    s.network.disable()
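The filter criteria above likely combine as a conjunction: an exchange is kept only when it matches the URL pattern, the method list, and the status list. A sketch of those semantics over plain dicts (illustrative only; the SDK returns its own exchange objects, not dicts):

```python
def filter_exchanges(exchanges, url_pattern=None, methods=None, status_codes=None):
    # Keep exchanges matching every criterion that was supplied;
    # criteria left as None match everything.
    out = []
    for ex in exchanges:
        if url_pattern and url_pattern not in ex["url"]:
            continue
        if methods and ex["method"] not in methods:
            continue
        if status_codes and ex["status"] not in status_codes:
            continue
        out.append(ex)
    return out

captured = [
    {"url": "/api/posts?page=1", "method": "GET", "status": 200},
    {"url": "/api/posts", "method": "POST", "status": 201},
    {"url": "/static/app.js", "method": "GET", "status": 200},
]
print(filter_exchanges(captured, "/api/posts", ["GET"], [200]))
# → [{'url': '/api/posts?page=1', 'method': 'GET', 'status': 200}]
```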

SDKBaseModel

Auto-cleaning Pydantic model for scraped data:

from cmdop import SDKBaseModel

class Product(SDKBaseModel):
    __base_url__ = "https://shop.com"
    name: str = ""    # "  iPhone 15  \n" → "iPhone 15"
    price: int = 0    # "$1,299.00" → 1299
    rating: float = 0 # "4.5 stars" → 4.5
    url: str = ""     # "/p/123" → "https://shop.com/p/123"

products = Product.from_list(raw["items"])  # Auto dedupe + filter
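The coercions shown in the field comments can be sketched in plain Python. These helpers are illustrative, not the SDK's code; the real SDKBaseModel presumably applies equivalent logic via Pydantic validators:

```python
import re
from urllib.parse import urljoin

def clean_str(v):
    # Collapse whitespace: "  iPhone 15  \n" -> "iPhone 15"
    return " ".join(v.split())

def clean_number(v):
    # Pull the first number out of noisy text:
    # "$1,299.00" -> 1299.0, "4.5 stars" -> 4.5
    m = re.search(r"\d+(?:,\d{3})*(?:\.\d+)?", v)
    return float(m.group().replace(",", "")) if m else 0.0

def clean_url(v, base):
    # Resolve relative paths: "/p/123" -> "https://shop.com/p/123"
    return urljoin(base, v)

assert clean_str("  iPhone 15  \n") == "iPhone 15"
assert clean_number("$1,299.00") == 1299.0
assert clean_url("/p/123", "https://shop.com") == "https://shop.com/p/123"
```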

Utilities

Logging:

from cmdop import get_logger
log = get_logger(__name__)
log.info("Starting")  # Rich console + auto file logging

TOON Format (30-50% token savings):

from cmdop import json_to_toon, JsonCleaner
toon = json_to_toon({"name": "Alice", "age": 25})
# → "name: Alice\nage: 25"
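For the flat dict shown above, the conversion reduces to one "key: value" line per entry. A minimal sketch of just that case (the real json_to_toon presumably also handles nesting and arrays, which this illustration does not):

```python
def flat_toon(obj):
    # Flat-dict case only: one "key: value" line per entry.
    return "\n".join(f"{k}: {v}" for k, v in obj.items())

print(flat_toon({"name": "Alice", "age": 25}))
# → name: Alice
#   age: 25
```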

Requirements

  • Python 3.10+
  • CMDOP agent on target

Links

cmdop.com

License

MIT
