
Rust implementation of ansible-core — drop-in replacement with 19x faster startup and parsing


Migrating from Python ansible-core to ansible-core-rs

ansible-core-rs is a Rust reimplementation of ansible-core. It aims to run existing playbooks, inventories, and vault files without modification while reducing startup time, memory use, and task-execution overhead. This guide describes what is compatible today, where behavior differs, and how to move an existing Ansible project onto the Rust engine.

Design stance

ansible-core-rs is a drop-in replacement for the engine, not for the module ecosystem. The execution engine, playbook loader, inventory, vault, Galaxy client, and template renderer are Rust. Modules remain Python: common modules (command, file, copy, apt, service, …) have native Rust implementations for speed, and everything else runs through the AnsiballZ Python bridge exactly as upstream Ansible does.

This means your collections, roles, and modules — including anything you install from Galaxy — continue to work. Your playbooks do not need to change.

Installation

Source build

git clone <repo>
cd ansible-core-rs
cargo build --release
./target/release/ansible --version

A Python 3.12+ interpreter must be on PATH for modules that fall back to AnsiballZ.

Nix

nix develop     # dev shell with rust, python, openssh, etc.
nix build       # builds the `ansible` binary

PyPI wrapper (planned)

A pip-installable `ansible-core-rs` package that bundles the Rust binary is planned for Phase 11; it is not yet available.

Command-line differences

Upstream ansible-core ships nine separate binaries (ansible, ansible-playbook, ansible-vault, …). ansible-core-rs ships a single ansible binary with subcommands:

Upstream             ansible-core-rs
ansible              ansible <pattern>
ansible-playbook     ansible playbook
ansible-vault        ansible vault
ansible-config       ansible config
ansible-inventory    ansible inventory
ansible-galaxy       ansible galaxy
ansible-doc          ansible doc

If your scripts call the legacy names, symlink them:

for name in ansible-playbook ansible-vault ansible-config \
            ansible-inventory ansible-galaxy ansible-doc; do
  ln -s ansible /usr/local/bin/$name
done

A shim that dispatches on argv[0] is on the roadmap; until then, symlinks are the recommended path.
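To make the roadmap item concrete, the core of such a shim is just a name-to-subcommand mapping keyed on the invoked binary name. The sketch below is hypothetical (the mapping follows the table above; `dispatch` is not a shipped function):

```python
import sys

# Hypothetical argv[0] shim: map a legacy binary name to the
# single-binary subcommand form from the table above.
LEGACY = {
    "ansible-playbook": "playbook",
    "ansible-vault": "vault",
    "ansible-config": "config",
    "ansible-inventory": "inventory",
    "ansible-galaxy": "galaxy",
    "ansible-doc": "doc",
}

def dispatch(argv0: str, args: list[str]) -> list[str]:
    """Return the argv to exec against the unified `ansible` binary."""
    name = argv0.rsplit("/", 1)[-1]
    sub = LEGACY.get(name)
    if sub is None:  # plain `ansible` (ad-hoc mode) keeps its args as-is
        return ["ansible", *args]
    return ["ansible", sub, *args]

print(dispatch("/usr/local/bin/ansible-playbook", ["site.yml", "-i", "hosts"]))
# → ['ansible', 'playbook', 'site.yml', '-i', 'hosts']
```

A real shim would `os.execvp` the result rather than print it; the symlink approach above feeds the same argv[0] information to the binary without any wrapper process.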

Common flags (-i, -e, -l, --tags, --skip-tags, --start-at-task, --check, --diff, -v, -b/--become, -u, -f/--forks, --vault-password-file) match upstream behavior.

Configuration

ansible.cfg loading uses the same precedence as upstream:

  1. ANSIBLE_CONFIG environment variable
  2. ./ansible.cfg in cwd
  3. ~/.ansible.cfg
  4. /etc/ansible/ansible.cfg

Environment variables with the ANSIBLE_ prefix override config-file settings, exactly as they do upstream.

View the effective config:

ansible config dump
ansible config list
ansible config view

Inventory

  • INI inventories: fully supported, including group children, group vars, host vars, ranges (web[01:05].example.com), and [group:vars] sections.
  • YAML inventories: fully supported.
  • Script/dynamic inventories: supported. Executable files in the inventory path are invoked with --list / --host <name> exactly like upstream.
  • Plugin inventories (e.g. aws_ec2, gcp_compute): not yet bridged through PyO3 — see the "Python plugins" section below.

ansible inventory --list and ansible inventory --graph produce structurally equivalent output to upstream. The JSON key ordering may differ since Rust's HashMap is unordered; if a consumer depends on specific ordering, pipe through jq --sort-keys.

Large inventories benefit from the interned-name representation: every host and group name is stored once as an Arc<str> and shared between the inventory map, membership lists, and reverse lookups.
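The interning idea translates directly: store each name once, hand out shared references everywhere. A Python analogy of the pattern (the `NamePool` class is illustrative; in the Rust code the shared handle is an `Arc<str>`):

```python
class NamePool:
    """Sketch of interned names: each host/group name is stored once
    and every table holds a reference to the same object, so a host
    appearing in many groups costs one allocation, not many copies."""
    def __init__(self) -> None:
        self._pool: dict[str, str] = {}

    def intern(self, name: str) -> str:
        return self._pool.setdefault(name, name)

pool = NamePool()
a = pool.intern("web01.example.com")
b = pool.intern("".join(["web01", ".example.com"]))  # built separately
assert a is b  # one shared allocation, two references
```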

Playbooks

Playbook YAML loading supports plays, tasks, blocks (with rescue / always), handlers, roles, includes, and imports. pre_tasks, post_tasks, and role dependencies resolve identically to upstream.

Variable precedence

All 22 precedence levels from upstream are implemented. Exhaustive parity testing against the Python implementation is still in progress (Phase 9); if you hit a case where a variable resolves differently, please open an issue with a minimal reproduction.
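The core rule behind the precedence model is simple: higher-precedence sources override lower ones key by key. A flattened sketch (real resolution is lazier and also handles hash-merge behavior, which this ignores):

```python
def resolve_vars(layers: list[dict]) -> dict:
    """Sketch of flat variable resolution: `layers` is ordered from
    lowest precedence (e.g. role defaults) to highest (extra vars);
    later layers win per key, as in the upstream 22-level model."""
    merged: dict = {}
    for layer in layers:
        merged.update(layer)
    return merged

result = resolve_vars([
    {"port": 80, "app": "web"},   # role defaults (lowest)
    {"port": 8080},               # inventory group_vars
    {"port": 9090},               # -e extra vars (highest)
])
assert result == {"port": 9090, "app": "web"}
```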

Tags

--tags, --skip-tags, and the special tags always, never, all, tagged, untagged behave identically. Inheritance from block → task is preserved.

Loops

loop, with_items, loop_control (including label, pause, and index_var), and register aggregation into results[] all work. until / retries / delay work.
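The until/retries/delay semantics reduce to a bounded retry loop. A sketch of the general shape (exact attempt-count bookkeeping has varied slightly between ansible-core versions, so treat the counting here as illustrative):

```python
import time

def run_until(task, until, retries: int = 3, delay: float = 5.0):
    """Sketch of until/retries/delay: run the task once, then retry
    up to `retries` more times, sleeping `delay` seconds between
    attempts, until `until(result)` is true."""
    result = task()
    attempts = 1
    while not until(result) and attempts <= retries:
        time.sleep(delay)
        result = task()
        attempts += 1
    result["attempts"] = attempts
    return result

calls = {"n": 0}
def flaky():
    calls["n"] += 1
    return {"rc": 0 if calls["n"] >= 3 else 1}

r = run_until(flaky, until=lambda res: res["rc"] == 0, retries=5, delay=0)
assert r["attempts"] == 3 and r["rc"] == 0
```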

Modules

Native Rust modules (no Python spawn)

These run entirely in-process on the control node, skipping the AnsiballZ wrapper. They exist to eliminate per-task Python startup overhead for the most common task types:

  • Core: debug, set_fact, assert, fail, meta
  • Command: command, shell, raw
  • Files: file, copy, stat, template, lineinfile, blockinfile, replace, slurp
  • Network/HTTP: uri, get_url, wait_for_connection (under the http feature)
  • Facts: setup, gather_facts, ping, package_facts
  • Package managers: apt, yum, dnf
  • Services: service, systemd, systemd_service
  • Users/groups: user, group
  • Includes: include_role, import_role, include_tasks, import_tasks

Argument names and behavior follow upstream module documentation. Where this guide is silent, assume upstream semantics.

Everything else

Any module name not in the list above is routed through the AnsiballZ fallback: the module source is packaged into a self-extracting Python script and executed on the target host exactly as upstream does. This means modules shipped in collections — community.general.*, ansible.posix.*, cloud modules, and so on — run unmodified.

The AnsiballZ packager produces wrappers that are byte-compatible with the Python implementation.
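Conceptually, an AnsiballZ wrapper is a self-extracting Python script: the module source is zipped, base64-embedded in a stub, and the stub unpacks and runs it on the target. The sketch below shows that shape only; the real wrapper also carries module arguments, interpreter discovery, and temp-file cleanup:

```python
import base64, io, subprocess, sys, tempfile, textwrap, zipfile

MODULE_SRC = 'print("pong")\n'   # stand-in for a real module's source

# Package the module into an in-memory zip, as the packager does.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("module.py", MODULE_SRC)
payload = base64.b64encode(buf.getvalue()).decode()

# A self-extracting stub: decode, unpack, execute.
stub = textwrap.dedent(f"""\
    import base64, io, os, runpy, tempfile, zipfile
    data = base64.b64decode("{payload}")
    with tempfile.TemporaryDirectory() as d:
        zipfile.ZipFile(io.BytesIO(data)).extractall(d)
        runpy.run_path(os.path.join(d, "module.py"))
""")

out = subprocess.run([sys.executable, "-c", stub],
                     capture_output=True, text=True, check=True)
print(out.stdout, end="")   # → pong
```

On a real run the stub is copied to the target and executed by the remote Python, which is why only a Python interpreter (no Ansible install) is needed on managed hosts.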

Templates

MiniJinja provides the template engine. 74 filters and 15 tests are implemented natively, covering the core Ansible filter set plus frequently used extras (regex, type tests, version, subset/superset, filesystem).
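To illustrate what "covering the core filter set" means in practice, here are simplified Python sketches of two commonly used filters, matching their documented upstream semantics (real implementations handle more options, e.g. ignorecase for regex and full version parsing with epochs and pre-releases):

```python
import operator
import re

def regex_replace(value: str, pattern: str, replacement: str = "") -> str:
    """Sketch of the regex_replace filter: substitute all matches."""
    return re.sub(pattern, replacement, value)

def version_compare(value: str, other: str, op: str = "eq") -> bool:
    """Naive dotted-numeric version test; upstream delegates to a
    proper version-parsing library."""
    a = [int(x) for x in value.split(".")]
    b = [int(x) for x in other.split(".")]
    return getattr(operator, op)(a, b)

assert regex_replace("ansible-core-rs", r"-rs$", "") == "ansible-core"
assert version_compare("2.16.3", "2.15.0", "ge")
```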

Complex templates that use Jinja2-specific features not in MiniJinja will fall back to a PyO3-hosted Jinja2 environment (planned — see Phase 7 in the migration plan). Until that fallback lands, the following edge cases may render differently:

  • Custom Jinja2 filters defined in plugins (none of the built-ins are missing, but collection-provided filters don't load yet)
  • Obscure extension tags ({% raw %} and the standard control flow tags all work; exotic third-party extensions do not)

If you hit a template that doesn't render, please report it.

Vault

ansible vault encrypt, decrypt, view, edit, rekey, and encrypt-string all work. The binary format is interoperable with upstream — files encrypted by Python ansible can be decrypted by ansible-core-rs and vice versa. AES-256-CTR and the PBKDF2 key derivation match upstream parameters.

Vault secrets use the zeroize crate to wipe key material from memory after use.
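The interoperability claim rests on both implementations producing the same on-disk envelope. Per the documented 1.1 format, a vault file is a header line followed by a hex-encoded body that itself decodes to hex-encoded salt, HMAC, and ciphertext on three lines. A parsing sketch (decryption itself, PBKDF2 derivation plus AES-256-CTR plus HMAC verification, is deliberately omitted):

```python
import binascii

def parse_vault(text: str) -> dict:
    """Sketch of the vault 1.1 envelope: header `$ANSIBLE_VAULT;1.1;AES256`,
    then a hex body decoding to hex(salt)\\nhex(hmac)\\nhex(ciphertext)."""
    header, *body = text.strip().splitlines()
    magic, version, cipher = header.split(";")
    assert magic == "$ANSIBLE_VAULT" and cipher == "AES256"
    inner = binascii.unhexlify("".join(body)).decode()
    salt_hex, hmac_hex, ct_hex = inner.splitlines()
    return {
        "version": version,
        "salt": binascii.unhexlify(salt_hex),
        "hmac": binascii.unhexlify(hmac_hex),
        "ciphertext": binascii.unhexlify(ct_hex),
    }
```

Because the envelope carries its version and cipher in the header, either implementation can refuse (rather than misread) a format it does not support.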

Galaxy

ansible galaxy collection install, build, and list work against local mirrors via the filesystem fetcher. The Galaxy v3 API fetcher is behind the network Cargo feature and has been exercised against local test doubles; it has not yet been validated against galaxy.ansible.com in CI (blocked on toolchain availability — see the migration plan).

requirements.yml batch installs, dependency resolution (depth-first with constraint tightening), and FQCN plugin resolution all match upstream.

Python plugins

This is the biggest current gap. PyO3 is wired into the build, but the plugin loader that discovers and executes Python callback, lookup, filter, and inventory plugins is partial:

  • Callback plugins: the default callback produces upstream-equivalent output (TASK banners, PLAY RECAP, color, diff rendering, verbose levels). Third-party callback plugins do not yet load.
  • Lookup plugins: not yet bridged. {{ lookup('file', ...) }} and the other built-in lookups work via native implementations, but custom lookups from collections do not yet execute.
  • Filter/test plugins: the 74 native filters cover the ansible-core built-in set; collection-provided filter plugins do not yet load.
  • Inventory plugins: dynamic inventory scripts work; the plugin interface (used by aws_ec2, etc.) does not yet load.

If your playbooks depend on collection-provided plugins of these types, you'll want to stay on Python ansible-core until Phase 6 completes. Module plugins from collections do work — the gap is only for non-module plugin types.

SSH

OpenSSH via the ssh CLI is the default and only connection plugin today. ControlMaster multiplexing is enabled by default for both ssh and scp/sftp invocations: the first task per host opens a master connection that subsequent tasks reuse via a derived ControlPath, with ControlPersist=60s keeping it warm across the play. The control socket is closed on disconnect via ssh -O exit. Override the socket location by setting ansible_ssh_common_args if your environment needs a non-default path.
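For intuition, a "derived ControlPath" just means a stable per-target socket path, so every task for the same (host, port, user) triple finds the same master socket. A purely hypothetical sketch of such a derivation, similar in spirit to OpenSSH's %C token (the actual path scheme used by ansible-core-rs is not specified here):

```python
import hashlib

def control_path(cp_dir: str, host: str, port: int, user: str) -> str:
    """Hypothetical: derive a stable, short ControlPath by hashing the
    connection triple, keeping the socket path under OS length limits."""
    digest = hashlib.sha1(f"{host}:{port}:{user}".encode()).hexdigest()[:10]
    return f"{cp_dir}/cp-{digest}"

# Same triple → same socket → subsequent tasks reuse the master.
assert control_path("/run/cp", "web01", 22, "deploy") == \
       control_path("/run/cp", "web01", 22, "deploy")
```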

sshpass is used for password authentication when ansible_ssh_pass is set; use key-based auth or the SSH agent for production.

Performance characteristics

Measured on the benchmark suite (run cargo bench -p ansible-core):

Operation            ansible-core-rs    Notes
Binary startup       ~3.3 ms            vs. ~300 ms Python import overhead
Playbook parsing     24–122 µs          Depends on playbook size
Template rendering   1.5–4.7 µs         Simple to complex expressions
Inventory parsing    ~65 µs             1000-host fixture

A full parity comparison against Python ansible-core on identical workloads is pending (requires both runtimes on the same CI host).

Debugging behavioral differences

If a playbook produces different output under ansible-core-rs:

  1. Run with -vvv on both implementations and diff the output.
  2. Check that any custom plugins used are in the "native" or "AnsiballZ" lists above — if they're plugin types not yet bridged, that's the cause.
  3. Confirm that your ansible.cfg is being read (ansible config dump shows the effective values).
  4. For variable precedence issues, dump host vars: ansible inventory --host <name>.
  5. File an issue with the playbook excerpt, relevant inventory, and both -vvv outputs.

When to hold off

Today's ansible-core-rs is best suited for:

  • Playbooks whose tasks use the native modules or common collection modules (AnsiballZ handles these)
  • Inventories from INI, YAML, or dynamic scripts
  • Default callback output
  • Vault workflows

Stay on Python ansible-core for now if your workflow relies on:

  • Custom callback plugins beyond the default
  • Custom lookup / filter / test plugins from collections
  • Plugin-style dynamic inventories (aws_ec2, gcp_compute, …)
  • ansible-test for sanity/unit/integration testing (Phase 9 WIP)

Roadmap

The major open items, in roughly priority order, are tracked in MIGRATION_PLAN.md. Highlights:

  • Phase 6 PyO3 bridge: loading non-module Python plugins
  • Phase 10.1: concurrent collection downloads (SSH multiplexing landed)
  • Phase 11: Docker image / PyPI wrapper (Nix flake builds today)

Contributions and bug reports are welcome.
