
AutomatiQ

Your activity, into automation.


Alpha (v0.1.0): work in progress. Things will break, change, and improve.

Record a browser session, and an AI agent reverse-engineers it into a standalone Python script.

Install

pip install automatiq

Install from source

git clone https://github.com/StoneSteel27/AutomatiQ.git
cd AutomatiQ
pip install -e .

Dev setup

pip install -e ".[dev]"
pre-commit install

This installs ruff, build, twine, and pre-commit hooks (lint + format on every commit).

Configuration

On first run, AutomatiQ creates ~/.automatiq/config.toml with commented defaults. Edit it to override models, timeouts, recording settings, etc.

[models]
agent    = "gemini/gemini-3-flash-preview"
recorder = "gemini/gemini-3.1-flash-lite-preview"
# base_url = "http://localhost:11434/v1"   # Ollama / LM Studio / vLLM

[agent]
max_steps       = 60
sandbox_timeout = 60

[recording]
fps                   = 3
segment_pad           = 2
merge_gap_threshold   = 1.5
max_frames_per_prompt = 8

Priority: CLI flag > ~/.automatiq/config.toml > built-in defaults.
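The precedence rule above can be sketched as a layered dict merge. The names below (`DEFAULTS`, `resolve`) are illustrative, not AutomatiQ's actual internals:

```python
# Sketch of config precedence: CLI flag > config file > built-in defaults.
DEFAULTS = {"max_steps": 60, "sandbox_timeout": 60}

def resolve(cli_flags: dict, file_config: dict) -> dict:
    merged = dict(DEFAULTS)
    # Each higher-priority layer overrides only the keys it actually sets.
    merged.update({k: v for k, v in file_config.items() if v is not None})
    merged.update({k: v for k, v in cli_flags.items() if v is not None})
    return merged

# resolve({"max_steps": 80}, {"sandbox_timeout": 120})
# keeps the CLI's max_steps and the config file's sandbox_timeout
```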

Set your API key in a .env file at the project root (any litellm-supported model works):

GEMINI_API_KEY=your-key-here
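For reference, loading such a `.env` file amounts to the sketch below. AutomatiQ presumably uses a library (e.g. python-dotenv) for this; the minimal parser here is illustrative only:

```python
import os

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: KEY=value lines, '#' comments ignored.
    Existing environment variables are not overwritten."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())
```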

Run

# Record a session, then have the agent build a scraper
automatiq run https://example.com

# Or run each step separately
automatiq record https://example.com   # just record
automatiq agent                        # build scraper from last recording

CLI flags override config:

automatiq run https://example.com --model openai/gpt-4o --max-steps 80

What it does

  1. Record: Opens Chrome, captures your browsing (video, network requests, user actions).
  2. Agent: An LLM investigator reads the session dump, experiments in a sandboxed IPython environment, and produces a working scraping script.
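The scripts the agent emits are standalone, in the spirit of the sketch below: replay a network request observed in the session and extract the fields that were actually used. The URL and field names here are placeholders, not real agent output:

```python
import json
import urllib.request

API_URL = "https://example.com/api/items"  # placeholder endpoint

def parse_items(payload: bytes) -> list[dict]:
    """Keep only the fields the recorded session actually consumed."""
    data = json.loads(payload)
    return [{"title": it["title"], "price": it["price"]} for it in data["items"]]

def fetch_items(url: str = API_URL) -> list[dict]:
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return parse_items(resp.read())

if __name__ == "__main__":
    print(json.dumps(fetch_items(), indent=2))
```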

Requirements

  • Python 3.11+
  • A supported LLM API key (Gemini, OpenAI, OpenRouter, or any OpenAI-compatible endpoint via --base-url)

License

MIT


