pytest-replay

Saves previous test runs and allows re-executing previous pytest runs to reproduce crashes or flaky tests.


This pytest plugin was generated with Cookiecutter along with @hackebrot’s Cookiecutter-pytest-plugin template.

Features

This plugin helps reproduce random or flaky behavior when running tests with xdist. pytest-xdist executes tests in an unpredictable order, making it hard to reproduce locally a failure seen in CI because there is no convenient way to track which tests executed in which worker.

This plugin records the node ids executed by each worker in the directory given by the --replay-record-dir=<dir> flag, and the --replay=<file> flag can be used to re-run the tests from a previous run. For example:

$ pytest -n auto --replay-record-dir=build/tests/replay

This will generate one file per worker, where each line is a JSON object containing the node id, start time, finish time, and outcome. Note that each node id usually appears twice: a line is written as soon as the test starts, so if a test suddenly crashes there is still a record that it started. After the test finishes, pytest-replay adds another JSON line with the complete information. This is also useful for analyzing concurrent tests that might have some kind of race condition and interfere with each other.

For example, worker gw1 will generate a file named .pytest-replay-gw1.txt with contents like this:

{"nodeid": "test_foo.py::test[1]", "start": 0.000}
{"nodeid": "test_foo.py::test[1]", "start": 0.000, "finish": 1.5, "outcome": "passed"}
{"nodeid": "test_foo.py::test[3]", "start": 1.5}
{"nodeid": "test_foo.py::test[3]", "start": 1.5, "finish": 2.5, "outcome": "passed"}
{"nodeid": "test_foo.py::test[5]", "start": 2.5}
{"nodeid": "test_foo.py::test[5]", "start": 2.5, "finish": 3.5, "outcome": "passed"}
{"nodeid": "test_foo.py::test[7]", "start": 3.5}
{"nodeid": "test_foo.py::test[7]", "start": 3.5, "finish": 4.5, "outcome": "passed"}
{"nodeid": "test_foo.py::test[8]", "start": 4.5}
{"nodeid": "test_foo.py::test[8]", "start": 4.5, "finish": 5.5, "outcome": "passed"}

If there is a crash or a flaky failure in the tests of worker gw1, one can take that file from the CI server and execute the tests in the same order with:

$ pytest --replay=.pytest-replay-gw1.txt

Hopefully this will make it easier to reproduce the problem and fix it.
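
Because replay files are plain JSON lines, they are also easy to inspect with a small script. As a minimal sketch (not part of the plugin, and assuming only the line format shown above), the following reads a replay file and reports tests that have a start record but no matching finish record, which usually points at the test that crashed the worker:

import json
import sys

def find_unfinished(path):
    """Return node ids that started but never finished."""
    finished = set()
    started = []
    with open(path) as f:
        for line in f:
            record = json.loads(line)
            if "finish" in record:
                finished.add(record["nodeid"])
            else:
                started.append(record["nodeid"])
    return [nodeid for nodeid in started if nodeid not in finished]

if __name__ == "__main__":
    for nodeid in find_unfinished(sys.argv[1]):
        print("started but never finished:", nodeid)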

Replaying Multiple Files in Parallel

Version added: 1.7

When you have multiple replay files from a distributed test run (such as .pytest-replay-gw0.txt and .pytest-replay-gw1.txt), you can replay them all at once in parallel, provided pytest-xdist is installed. This is useful when you want to reproduce the exact execution that occurred during a CI run with multiple workers.

Simply pass multiple replay files to the --replay option:

$ pytest --replay .pytest-replay-gw0.txt .pytest-replay-gw1.txt

pytest-replay will automatically:

  • Configure pytest-xdist with the appropriate number of workers (one per replay file)

  • Assign each replay file to a dedicated worker using xdist groups

  • Execute tests in parallel while maintaining the order within each replay file

Note: Multiple replay files require pytest-xdist to be installed. If you try to use multiple files without xdist, pytest-replay will show an error message.

Important: When using multiple replay files, you cannot manually specify xdist options like -n, --dist, --numprocesses, or --maxprocesses, as these are automatically configured based on the number of replay files provided.
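
Since --replay accepts several files, a convenient pattern is to let the shell pick up every worker file at once (this assumes a shell that expands the glob before pytest sees the arguments):

$ pytest --replay .pytest-replay-gw*.txt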

Additional metadata

Version added: 1.6

In cases where it is necessary to add new metadata to the replay file to make a test reproducible, pytest-replay provides a fixture called replay_metadata that allows new information to be added via its metadata attribute.

Example:

import pytest
import numpy as np
import random

@pytest.fixture
def rng(replay_metadata):
    # Generate a seed and record it in the replay file via the metadata attribute.
    seed = replay_metadata.metadata.setdefault("seed", random.randint(0, 100))
    return np.random.default_rng(seed=seed)

def test_random(rng):
    data = rng.standard_normal((100, 100))
    assert data.shape == (100, 100)

When used with pytest-replay, this generates a replay file similar to:

{"nodeid": "test_bar.py::test_random", "start": 0.000}
{"nodeid": "test_bar.py::test_random", "start": 0.000, "finish": 1.5, "outcome": "passed", "metadata": {"seed": 12}}

FAQ

  1. pytest has its own cache, why use a different mechanism?

The internal cache saves its data using JSON, which is not suitable in the event of a crash because the file will not be readable.

  2. Shouldn’t the ability of selecting tests from a file be part of the pytest core?

    Sure, but let’s try to use this a bit as a separate plugin before proposing its inclusion into the core.

Installation

You can install pytest-replay via pip from PyPI:

$ pip install pytest-replay

Or with conda:

$ conda install -c conda-forge pytest-replay

Contributing

Contributions are very welcome.

Tests can be run with tox if you are using a native Python installation.

To run tests with conda, first create a virtual environment and execute the tests from there (requires conda with Python 3.5+ in the root environment):

$ python -m venv .env
$ .env\scripts\activate
$ pip install -e . pytest-xdist
$ pytest tests

Releases

Follow these steps to make a new release:

  1. Create a new branch release-X.Y.Z from master;

  2. Update CHANGELOG.rst;

  3. Open a PR;

  4. After it is green and approved, push a new tag in the format X.Y.Z;

GitHub Actions will deploy to PyPI automatically.

Afterwards, update the recipe in conda-forge/pytest-replay-feedstock.

License

Distributed under the terms of the MIT license.

Issues

If you encounter any problems, please file an issue along with a detailed description.
