
Python SDK for CMDOP agent interaction

Project description

cmdop

Any machine. One API.

from cmdop import CMDOPClient

with CMDOPClient.remote(api_key="cmd_xxx") as server:
    server.terminal.execute("docker restart app")
    server.files.write("/etc/nginx/nginx.conf", new_config)
    logs = server.files.read("/var/log/app.log")

No SSH. No VPN. No open ports.


How

Your Code ──── Cloud Relay ──── Agent (on server)
                    │
        Outbound only, works through any NAT/firewall

Agent connects out. Your code connects to relay. Done.


Install

pip install cmdop
from cmdop import CMDOPClient, AsyncCMDOPClient

# Remote (via cloud relay)
with CMDOPClient.remote(api_key="cmd_xxx") as client:
    client.files.list("/home")

# Local (direct IPC)
with CMDOPClient.local() as client:
    client.terminal.execute("ls -la")

# Async
async with AsyncCMDOPClient.remote(api_key="cmd_xxx") as client:
    await client.files.read("/etc/hostname")

Terminal

session = server.terminal.create()
server.terminal.send_input(session.session_id, "kubectl get pods\n")
output = server.terminal.get_history(session.session_id)
Method                   Description
create(shell)            Start session
send_input(id, data)     Send commands
get_history(id)          Get output
resize(id, cols, rows)   Resize terminal
send_signal(id, signal)  SIGINT/SIGTERM
close(id)                End session
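The session methods compose into a simple run-one-command helper. A minimal sketch; the `"/bin/bash"` shell argument, the trailing newline on input, and the cleanup order are assumptions based on the table above, not SDK-documented behavior:

```python
def run_command(terminal, command: str) -> str:
    """Run a single command in a fresh session and return the captured output.

    `terminal` is any object exposing the session methods listed above
    (e.g. server.terminal from a connected CMDOPClient).
    """
    session = terminal.create("/bin/bash")  # shell value is illustrative
    try:
        terminal.resize(session.session_id, 120, 40)  # widen for long output
        terminal.send_input(session.session_id, command + "\n")
        return terminal.get_history(session.session_id)
    finally:
        terminal.close(session.session_id)  # always release the session
```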

Files

server.files.list("/var/log")
server.files.read("/etc/nginx/nginx.conf")
server.files.write("/tmp/config.json", b'{"key": "value"}')
Method                Description
list(path)            List dir
read(path)            Read file
write(path, content)  Write file
delete(path)          Delete
copy/move(src, dst)   Copy/Move
mkdir(path)           Create dir
info(path)            Metadata
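The file methods make an edit-with-backup pattern straightforward. A hedged sketch against the table above; the `.bak` suffix is illustrative, and the assumption that `copy` raises `FileNotFoundError` for a missing source may not match the SDK's actual error type:

```python
def write_with_backup(files, path: str, content: bytes) -> None:
    """Back up `path` to `path + '.bak'` (if it exists) before overwriting.

    `files` is any object exposing the methods listed above
    (e.g. server.files from a connected CMDOPClient).
    """
    try:
        files.copy(path, path + ".bak")  # keep a rollback copy
    except FileNotFoundError:
        pass  # nothing to back up on first write
    files.write(path, content)
```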

Agent

from pydantic import BaseModel

class Health(BaseModel):
    status: str
    cpu: float
    issues: list[str]

result = server.agent.run("Check server health", output_schema=Health)
health: Health = result.output  # Typed!
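Because `result.output` is a typed model, downstream logic can branch on its fields directly. A small sketch; the report format is illustrative, not part of the SDK:

```python
def summarize(health) -> str:
    """Turn a Health result into a one-line report."""
    if health.issues:
        return f"{health.status}: " + "; ".join(health.issues)
    return f"{health.status} (cpu {health.cpu:.1f}%)"
```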

Browser

with client.browser.create_session() as b:
    b.navigate("https://shop.com/products")
    b.close_modal()  # Close popups

    # BeautifulSoup parsing
    soup = b.soup()  # SoupWrapper with chainable API
    for item in soup.select(".product"):
        title = item.select_one("h2").text()
        price = item.attr("data-price")

    # Human-like scrolling with random delays
    for _ in range(10):
        soup = b.soup(".listings")
        # ... parse ...
        b.scroll("down", 700, human_like=True)  # Natural micro-scrolls
        b.wait_random(0.8, 1.5)  # Random delay

    # Scroll inside container (Facebook, Twitter feeds)
    b.scroll("down", 800, container="[role='feed']")

    # Click all "See more" buttons
    b.click_all_by_text("See more")

    # JS fetch (bypass CORS, inherit cookies)
    data = b.fetch_json("https://api.site.com/v1/items")
Method                         Description
navigate(url)                  Go to URL
click(selector)                Click element
click_all_by_text(text, role)  Click all matching elements
type(selector, text)           Type text
wait_for(selector, ms)         Wait for element
wait_seconds(n)                Sleep
wait_random(min, max)          Random sleep
extract(selector, attr)        Get text/attr
get_html(selector)             Get HTML
soup(selector)                 → SoupWrapper
parse_html(html)               → BeautifulSoup
fetch_json(url)                JS fetch → dict
fetch_all(urls)                Parallel fetch
execute_js(code)               Run async JS
screenshot()                   PNG bytes
scroll(dir, amount, ...)       Scroll page/container
scroll_to(selector)            Scroll to element
get_scroll_info()              Position + page size
hover(selector)                Hover
select(selector, value)        Dropdown select
close_modal()                  Close dialogs
get/set_cookies()              Cookie management

scroll() parameters:

  • direction: "up", "down", "left", "right"
  • amount: pixels to scroll
  • smooth: animate scroll (default True)
  • human_like: random micro-scrolls + variation
  • container: CSS selector for scroll container
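Putting the scroll parameters together: a hedged sketch of a de-duplicating infinite-scroll collector. The `.product`/`h2` selectors and the loop bounds are illustrative; only the method names come from the tables above:

```python
def collect_titles(b, rounds: int = 10) -> list[str]:
    """Scroll `rounds` times, collecting unique product titles along the way.

    `b` is a browser session exposing the methods listed above.
    """
    seen: dict[str, None] = {}  # insertion-ordered de-dupe
    for _ in range(rounds):
        soup = b.soup(".product")
        for item in soup.select(".product"):
            seen.setdefault(item.select_one("h2").text().strip(), None)
        b.scroll("down", 700, human_like=True)  # natural micro-scrolls
        b.wait_random(0.8, 1.5)                 # avoid a fixed cadence
    return list(seen)
```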

SDKBaseModel

Auto-cleaning Pydantic model for scraped data:

from cmdop import SDKBaseModel

class Product(SDKBaseModel):
    __base_url__ = "https://shop.com"
    name: str = ""    # "  iPhone 15  \n" → "iPhone 15"
    price: int = 0    # "$1,299.00" → 1299
    rating: float = 0 # "4.5 stars" → 4.5
    url: str = ""     # "/p/123" → "https://shop.com/p/123"

products = Product.from_list(raw["items"])  # Auto dedupe + filter
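The cleaning rules in the comments above can be approximated in plain Python. This stdlib-only sketch shows the kind of normalization SDKBaseModel is described as applying (it is not the SDK's implementation):

```python
import re
from urllib.parse import urljoin

BASE_URL = "https://shop.com"

def clean_text(v: str) -> str:
    # Collapse whitespace: "  iPhone 15  \n" -> "iPhone 15"
    return " ".join(v.split())

def clean_price(v: str) -> int:
    # Keep the integer part of the first number: "$1,299.00" -> 1299
    m = re.search(r"[\d,]+", v)
    return int(m.group().replace(",", "")) if m else 0

def clean_rating(v: str) -> float:
    # First decimal number in the string: "4.5 stars" -> 4.5
    m = re.search(r"\d+(?:\.\d+)?", v)
    return float(m.group()) if m else 0.0

def clean_url(v: str) -> str:
    # Resolve relative paths: "/p/123" -> "https://shop.com/p/123"
    return urljoin(BASE_URL, v)
```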

Utilities

Logging:

from cmdop import get_logger
log = get_logger(__name__)
log.info("Starting")  # Rich console + auto file logging

TOON Format (30-50% token savings):

from cmdop import json_to_toon, JsonCleaner
toon = json_to_toon({"name": "Alice", "age": 25})
# → "name: Alice\nage: 25"

Requirements

  • Python 3.10+
  • CMDOP agent on target

Links

cmdop.com

License

MIT

Download files

Download the file for your platform.

Source Distribution

cmdop-0.1.18.tar.gz (158.0 kB, source)

Built Distribution


cmdop-0.1.18-py3-none-any.whl (266.1 kB, Python 3)

File details

Details for the file cmdop-0.1.18.tar.gz.

File metadata

  • Download URL: cmdop-0.1.18.tar.gz
  • Size: 158.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.18

File hashes

Hashes for cmdop-0.1.18.tar.gz
Algorithm    Hash digest
SHA256       6fafe5a38f3bf7d77e836f79f9a8455a155815227a6f85d05683e8be63483605
MD5          3e322000ff959c5e117357eaec7eba60
BLAKE2b-256  2d80b6cb4b51c8c010242de62835598bf0c03785127790d6df625a174eec90d8


File details

Details for the file cmdop-0.1.18-py3-none-any.whl.

File metadata

  • Download URL: cmdop-0.1.18-py3-none-any.whl
  • Size: 266.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.18

File hashes

Hashes for cmdop-0.1.18-py3-none-any.whl
Algorithm    Hash digest
SHA256       ac01bc51248d81a2608a8ee5a8f277a223e12271eaaae821b9186e34de59c75f
MD5          214efe1fa9492361e13ef821b1ddff74
BLAKE2b-256  40e61b419d933d398bb17bc16e1c12f61971d02b265470acdc47ef7cf1575f4d

