
An automation framework for Python using OpenCV


Pixeler


A Python framework for building game automation bots that can see, understand, and interact with Windows applications in a human-like way.


How it works

Capture screenshots  →  Train a model  →  Run a GameBot  →  Write game modules
      (CLI)               (CLI/API)        (event loop)        (plugin API)
  1. Capture — point the pixeler-capture tool at your game window and draw bounding boxes around the objects you care about (enemies, health bars, loot, UI elements). Frames are saved as a labeled dataset.
  2. Train — run pixeler-train to fine-tune a YOLOv8 nano model on your dataset and export it as an ONNX file.
  3. Detect — load the ONNX into a GameBot alongside color filters, templates, and OCR regions. Every frame, the ScreenAnalyzer runs all detectors and fires events.
  4. React — write a GameModule that subscribes to events ("detection.enemy", "color.health_low", "ocr.health_bar") and calls mouse/keyboard actions in response.

Installation

pip install pixeler

Training support (YOLOv8 via Ultralytics) is an optional extra — skip it if you only need inference at runtime:

# Runtime only
pip install pixeler

# With training support
pip install "pixeler[train]"

Additional requirement: Tesseract OCR must be installed separately, with the directory containing the tesseract executable added to your system PATH, before OCR functions will work.


Quickstart

Step 1 — Capture a dataset

pixeler-capture --game "MyGame" --dataset datasets/mygame --classes enemy,loot,health_bar

An OpenCV annotation window opens on top of a frozen game screenshot.

| Key | Action |
|---|---|
| Click + drag | Draw a bounding box |
| 0–9 | Assign class to the last box |
| s | Save this frame |
| Space | Grab a new frame (auto-saves) |
| c | Clear boxes |
| q | Quit |

Step 2 — Train

pixeler-train --dataset datasets/mygame --out models/mygame.onnx --epochs 100

Exports the dataset into a YOLO-compatible structure, trains yolov8n, and writes the final model to models/mygame.onnx. Training artifacts (charts, confusion matrices, checkpoints) land in training_runs/mygame/.
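For reference, the YOLO label format pairs each image with a .txt file containing one line per bounding box: the class index followed by the box center and size, all normalized to [0, 1]. A minimal sketch of that conversion (the helper below is illustrative, not part of Pixeler's API):

```python
def to_yolo_line(class_idx: int, x: int, y: int, w: int, h: int,
                 img_w: int, img_h: int) -> str:
    """Convert a pixel-space box (top-left x/y, width, height)
    to a normalized YOLO label line."""
    cx = (x + w / 2) / img_w   # box center, normalized to image width
    cy = (y + h / 2) / img_h   # box center, normalized to image height
    return f"{class_idx} {cx:.6f} {cy:.6f} {w / img_w:.6f} {h / img_h:.6f}"

# A 100x50 box at (270, 155) in a 640x480 frame, class 0 ("enemy"):
print(to_yolo_line(0, 270, 155, 100, 50, 640, 480))
# -> 0 0.500000 0.375000 0.156250 0.104167
```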

Step 3 — Write a bot

# my_game_bot.py
from pixeler.bot.game_bot import GameBot          # coming in phase 4
from pixeler.window.win32_window import Win32Window
from pixeler.vision.classifier import YOLOClassifier
from pixeler.vision.color import ColorFilter
from pixeler.math.rectangle import Rectangle
from my_game.combat_module import CombatModule

CLASSES = ["enemy", "loot", "health_bar"]

bot = GameBot(window=Win32Window("MyGame"))
bot.analyzer.add_yolo("detector", YOLOClassifier("models/mygame.onnx", CLASSES))
bot.analyzer.add_color("health_low", ColorFilter.from_rgb(220, 30, 30, hue_tol=10))
bot.analyzer.add_ocr("gold", Rectangle(10, 50, 120, 18), throttle_s=1.0)
bot.add_module(CombatModule())
bot.start()

Step 4 — Write a game module

# my_game/combat_module.py
from pixeler.modules.base_module import GameModule   # coming in phase 5
from pixeler.events.payloads import DetectionPayload, ColorRegionPayload

class CombatModule(GameModule):
    name = "combat"

    def on_register(self, bus, bot):
        self._bot = bot
        self.on("detection.enemy", self._attack)
        self.on("color.health_low", self._eat_food)

    def _attack(self, event):
        payload: DetectionPayload = event.data
        self.log(f"Enemy at {payload.center}")
        from pixeler.input.mouse import move_and_right_click
        move_and_right_click(*payload.center)

    def _eat_food(self, event):
        from pixeler.input.keyboard import press
        press("1")

Feature Overview

Vision

| Module | What it does |
|---|---|
| vision/color.py | ColorFilter — HSV-range pixel detection (illumination-independent). Color — BGR solid colors for drawing. |
| vision/detection.py | find_color_regions(), find_template(), find_all_templates() — results include .center, .rect, .confidence. |
| vision/classifier.py | YOLOClassifier — ONNX multi-class detection via cv2.dnn. ORBMatcher — feature-based sprite matching, no training needed. |
| vision/ocr.py | read_text(), read_number(), read_words(). Always call with preprocess=True on game screenshots. |
| vision/utils.py | preprocess_for_ocr() — upscale → CLAHE → threshold → denoise. |

Training

| Module | What it does |
|---|---|
| training/dataset.py | Dataset — manages labeled screenshot storage in YOLO format. BoundingBox — normalized coordinates with pixel ↔ YOLO conversion. |
| training/capturer.py | CaptureSession — interactive OpenCV annotation UI against a live game window. |
| training/trainer.py | YOLOTrainer — trains YOLOv8, exports to ONNX (requires [train] extra). ORBTrainer — sprite library for ORBMatcher. |

Event System

| Module | What it does |
|---|---|
| events/event_bus.py | EventBus — synchronous pub/sub. Wildcard subscriptions ("detection.*"), fault-isolated handlers, catch-all ("*"). |
| events/event_names.py | String constants and builder functions: event_names.detection("enemy") → "detection.enemy". |
| events/payloads.py | Typed payloads for every event: DetectionPayload, ColorRegionPayload, OCRPayload, etc. All have .center, .rect shortcuts. |
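The wildcard-matching and fault-isolation behavior can be sketched in a few lines. This is a simplified stand-in for illustration, not the actual EventBus implementation:

```python
from collections import defaultdict
from fnmatch import fnmatchcase

class MiniEventBus:
    """Simplified synchronous pub/sub with wildcard patterns."""
    def __init__(self):
        self._handlers = defaultdict(list)      # pattern -> [callables]

    def on(self, pattern, handler):
        self._handlers[pattern].append(handler)

    def emit(self, name, data=None):
        for pattern, handlers in self._handlers.items():
            if fnmatchcase(name, pattern):      # "detection.*" and "*" both match
                for h in handlers:
                    try:
                        h(name, data)           # fault isolation: one bad handler
                    except Exception:           # can't break the frame loop
                        pass

bus = MiniEventBus()
seen = []
bus.on("detection.*", lambda name, data: seen.append(name))
bus.emit("detection.enemy", {"center": (410, 220)})
bus.emit("color.health_low")    # no matching handler, silently ignored
```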

Input

| Module | What it does |
|---|---|
| input/mouse.py | move_to() with WindMouse algorithm. click(), right_click(), scroll(), move_and_click(). |
| input/keyboard.py | write(text, wpm=65) with human timing. press(), hotkey(). Optional mistakes=True for realistic typos. |
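The human-timing idea behind write(text, wpm=65) is roughly this: by the usual convention one "word" is five characters, so the baseline inter-keystroke delay is 60 / (wpm × 5) seconds, with per-keystroke jitter on top. A sketch of that calculation (not Pixeler's exact implementation):

```python
import random

def keystroke_delays(text: str, wpm: float = 65, jitter: float = 0.25) -> list[float]:
    """Per-character delays in seconds for a target words-per-minute rate.
    One 'word' is the conventional 5 characters."""
    base = 60.0 / (wpm * 5)                     # ~0.185 s/char at 65 wpm
    delays = []
    for _ in text:
        d = random.gauss(base, base * jitter)   # humans aren't metronomes
        delays.append(max(0.01, d))             # clamp: never instantaneous
    return delays
```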

Window

| Module | What it does |
|---|---|
| window/win32_window.py | Win32Window — Windows HWND targeting. create_overlay() factory. |
| window/overlay.py | Overlay — transparent always-on-top GDI drawing layer. draw_rect(), draw_circle(), draw_text(), draw_label(). |
| window/window.py | Window — cross-platform targeting via pywinctl. |

Math & Geometry

| Module | What it does |
|---|---|
| math/point.py | Point(x, y) — immutable. distance_to(), lerp(), normalize(), dot(). |
| math/rectangle.py | Rectangle(x, y, w, h). random_point() biased toward center. screenshot(). |
| math/bezier.py | natural_path(start, end) — human-like curved mouse paths. |
| math/circle.py | Circle — random_point_normal() for human-like aiming. |
| math/polygon.py | Polygon — ray-casting containment, random_point() via rejection sampling. |
| math/random.py | reaction_delay(), idle_delay(), gaussian_jitter() — human timing distributions. |
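The ray-casting containment test mentioned for Polygon works by casting a horizontal ray from the point and counting how many polygon edges it crosses: an odd count means inside. A standalone sketch of that standard algorithm (not Pixeler's code):

```python
def point_in_polygon(x: float, y: float, vertices: list[tuple[float, float]]) -> bool:
    """Ray casting: cast a ray to the right and count edge crossings."""
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # Does the edge straddle the ray's y, and is the crossing right of x?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside   # each crossing flips inside/outside
    return inside

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
```

Rejection sampling for random_point() then follows directly: draw uniform points from the bounding box and keep the first one this test accepts.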

Development Setup

git clone https://github.com/klobbix/Pixeler
cd Pixeler

# Install runtime dependencies
uv sync

# Install with training support
uv sync --extra train

# Run examples
uv run python examples/example_bot.py

License

MIT — see LICENSE.

Acknowledgments

Contact

klobbix@gmail.com · GitHub Issues

