
A multi-platform cron-like job runner in Python


Pycroner

Pycroner is a lightweight cron style job runner implemented in Python. Jobs are configured via a YAML file and executed by the runner once their cron schedule matches the current time.

While this Python package is easy to install and get started with, I am also actively developing its Rust counterpart, Croner. I continuously share ideas and improvements between the two, bringing new features from Pycroner into Croner and applying Rust-inspired optimizations to make the Python version faster. I do not plan to drop support for either project; my goal is to keep them at feature parity, evolving side by side.

Why Pycroner?

This started as a tool I built for myself.

I was working on a system that had to run both on Windows and Linux, and keeping cron jobs in sync with Windows Task Scheduler was a constant headache. I wanted something dead simple, predictable, and cross-platform.

And I didn’t want to write the same job schedules twice in two different formats.

So I built Pycroner.

It runs scheduled jobs from a single YAML file, no matter what OS you’re on. You just write your jobs once and they work everywhere.

Along the way, I added a few extra things:

  • Fanout support — Run the same job multiple times with different args or in parallel.
  • Hot reload — Update your config and it just picks it up live.
  • Hooks — Jobs can run with specific scheduling hooks that are not possible with regular cron patterns, like on_start and on_exit.
  • Multi-schedule configs — A job can define multiple schedules, which are merged into one unified set of rules. Hook schedules and cron schedules are not merged with each other, but they work together on the same job.

If you're building automation or ETL flows, or just want a sane way to run time-based jobs in a Python project, this might save you from the time and pain I went through managing a project on both Windows and Linux.

Features

  • Parses standard five-field cron expressions (minute, hour, day, month, weekday) using a small built-in parser.
  • Jobs can optionally be fanned out into multiple processes. Fanout may be an integer (repeat the job N times) or a list of argument strings that will be appended to the base command.
  • Configuration lives in pycroner.yml by default. The exact format is described in pycroner/spec.md.
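To illustrate what five-field matching involves, here is a minimal sketch of a cron check. This is not pycroner's actual parser (which lives in the package and supports the full cron syntax described in pycroner/spec.md); it handles only `*`, `*/N`, comma lists, and plain numbers:

```python
# Illustrative five-field cron matching; pycroner's real parser
# supports more syntax (e.g. ranges like "1-5").
from datetime import datetime


def field_matches(field: str, value: int) -> bool:
    """Check one cron field ('*', '*/N', 'a,b,c', or a number) against a value."""
    if field == "*":
        return True
    if field.startswith("*/"):
        return value % int(field[2:]) == 0
    return value in {int(part) for part in field.split(",")}


def matches(expr: str, now: datetime) -> bool:
    """True if a 'minute hour day month weekday' expression matches `now`."""
    minute, hour, day, month, weekday = expr.split()
    return (
        field_matches(minute, now.minute)
        and field_matches(hour, now.hour)
        and field_matches(day, now.day)
        and field_matches(month, now.month)
        and field_matches(weekday, now.isoweekday() % 7)  # cron: Sunday = 0
    )
```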

Installation

From PyPI:

pip install pycroner

Usage

From code

  1. Create a pycroner.yml file describing your jobs. A simple example is shown below.
  2. Run the job runner from a Python script:
from pycroner.runner import Runner

Runner("pycroner.yml").run()

The runner checks schedules every minute and spawns each job as a subprocess when its cron expression matches the current time.
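Conceptually, one scheduling pass looks like the sketch below. This is not the Runner's actual implementation; `matches(expr, now)` is an assumed helper implementing the cron check, and the real runner would call something like this once per minute:

```python
# Simplified sketch of a single scheduling pass (not pycroner's actual code).
import shlex
import subprocess
from datetime import datetime


def tick(jobs, matches, now, spawn=subprocess.Popen):
    """Spawn every job whose schedule matches `now`.

    `jobs` is a list of dicts with 'schedule' and 'command' keys;
    `matches(expr, now)` is an assumed cron-check helper.
    Returns the argv lists that were launched.
    """
    launched = []
    for job in jobs:
        if matches(job["schedule"], now):
            argv = shlex.split(job["command"])
            spawn(argv)  # each matching job runs as an independent subprocess
            launched.append(argv)
    return launched
```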

From CLI

You can also invoke the runner directly from the command line using the pycroner command. By default it looks for pycroner.yml in the current directory:

pycroner

Specify an alternative working directory with --at or a specific configuration file with --config:

pycroner --at /path/to/project
pycroner --config custom.yml

Example Configuration

jobs:
  - id: "index_articles"
    schedule: "*/15 * * * *"
    command: "python index.py"
    fanout: 4

  - id: "daily_etl"
    schedule: "0 2 * * *"
    command: "python etl.py"
    fanout:
      - "--source=internal --mode=full"
      - "--source=external --mode=delta"

  - id: "ping"
    schedule: "* * * * *"
    command: "python ping.py"

  - id: "startup"
    schedule: "on_start"
    command: "python startup.py"

  - id: "cleanup"
    schedule: "on_exit"
    command: "python cleanup.py"

  - id: "multi-conf-job"
    schedule: 
      - "on_start"
      - "*/2 * * * *"
      - "*/3 * * * *"
    command: "echo 'Hello usefulness'" 

Jobs run independently, and any output or error handling is left to your commands. For full details see pycroner/spec.md.

If the configuration file changes while the runner is active, it will be reloaded automatically so updates take effect without restarting.
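A common way to implement this kind of hot reload is to poll the file's modification time. The sketch below shows the general technique; pycroner's internal mechanism may differ:

```python
# Sketch of change detection via mtime polling (illustrative only).
import os


class ConfigWatcher:
    """Report whether a file has changed since the last check."""

    def __init__(self, path: str):
        self.path = path
        self._mtime = os.path.getmtime(path)

    def changed(self) -> bool:
        mtime = os.path.getmtime(self.path)
        if mtime != self._mtime:
            self._mtime = mtime
            return True
        return False
```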

Output from each job is streamed with a colored prefix containing the job id and, if the job is fanned out, its numeric fanout id.
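The prefixing pattern can be sketched as follows (colors omitted; this is an illustration of the technique, not pycroner's actual code):

```python
# Sketch: run a command and tag each output line with a '[prefix]' label.
import subprocess
import sys


def stream_with_prefix(argv, prefix, out=sys.stdout):
    """Echo each line of the subprocess's combined output with a prefix.

    Returns the subprocess's exit code.
    """
    proc = subprocess.Popen(
        argv, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True
    )
    for line in proc.stdout:
        out.write(f"[{prefix}] {line}")
    return proc.wait()
```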

Hooks

Hooks allow scheduling behaviors that are not possible with regular cron expressions.

List of available hooks:

  • on_start
  • on_exit

More hooks are being considered, and if you find a use case for a new hook you may open a PR or a discussion.

Startup and Shutdown Hooks

Jobs scheduled with on_start run once immediately when the runner boots.

Jobs scheduled with on_exit run once when the process is shutting down. The runner registers handlers for SIGINT and SIGTERM and also uses atexit to ensure shutdown hooks are executed on normal program termination.
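The registration pattern described above (signal handlers plus atexit) can be sketched like this; it is the general technique, not pycroner's actual code:

```python
# Sketch: guarantee exit hooks run on SIGINT, SIGTERM, and normal termination.
import atexit
import signal
import sys


def install_exit_hooks(run_exit_jobs):
    """Register `run_exit_jobs` so it fires exactly once on shutdown."""
    atexit.register(run_exit_jobs)

    def handler(signum, frame):
        # Exiting via sys.exit() lets the atexit hook fire normally.
        sys.exit(128 + signum)

    signal.signal(signal.SIGINT, handler)
    signal.signal(signal.SIGTERM, handler)
```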
