
A pytest plugin for load balancing test suites


pytest-job-selection


By Arvid Jakobsson.

pytest-job-selection is a pytest plugin for load balancing test suites.

In short, the plugin provides a new pytest argument --job such that running pytest --job X/Y [tests...] (where 1 <= X <= Y) groups the selected tests into Y jobs and then executes the tests of job X. By default, the plugin attempts to create jobs containing approximately the same number of test cases.

To instead obtain jobs whose run times are close to each other, a JUnit XML file can be provided through the --prev-junit-xml junit.xml argument. In this mode, the plugin balances jobs based on the previously reported run time of test cases in junit.xml. Test cases lacking an entry in junit.xml are given a default run time, and cases in junit.xml that are not selected are ignored. See JUnit XML files for more information on this feature.

This plugin enables convenient parallel execution of large pytest suites in CIs such as GitLab CI using parallel jobs. See the section GitLab CI integration for more details.

This plugin is inspired by a similar functionality in Tezt, an OCaml test framework.


Features

  • Adds a --job pytest argument that executes a subset of the selected tests.
  • Adds a --prev-junit-xml <junit.xml> pytest argument. When supplied, the plugin uses a greedy knapsack heuristic to group the selected tests into jobs of even run time, based on the timing information in junit.xml. This argument can only be given once, but JUnit XML files can trivially be merged (see the sketch after this list).
  • Adds a --jobs-dry-run pytest argument that outputs debug information on test balancing.
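
For instance, several JUnit XML reports can be merged with nothing more than the Python standard library. The following is a minimal sketch (the merge_junit.py name and merge helper are illustrative, not part of the plugin); it assumes the pytest-style layout shown later in this README, where <testcase> elements carry classname, name and time attributes:

# merge_junit.py -- minimal sketch for merging several JUnit XML reports
# into a single file usable with --prev-junit-xml (illustrative, not part
# of the plugin).
import sys
import xml.etree.ElementTree as ET

def merge(report_paths, out_path):
    merged_suite = ET.Element("testsuite", name="pytest")
    for path in report_paths:
        # <testcase> elements may sit at different depths depending on the
        # pytest version, so collect them wherever they appear.
        for case in ET.parse(path).getroot().iter("testcase"):
            merged_suite.append(case)
    merged_suite.set("tests", str(len(merged_suite)))
    root = ET.Element("testsuites")
    root.append(merged_suite)
    ET.ElementTree(root).write(out_path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    *reports, output = sys.argv[1:]
    merge(reports, output)

Running python merge_junit.py report_1.xml report_2.xml junit.xml would then produce a single junit.xml that can be passed to --prev-junit-xml.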

Requirements

pytest-job-selection requires pytest. In addition, cram is used to test the plugin itself.

Installation

You can install pytest-job-selection via pip from PyPI:

$ pip install pytest-job-selection

Usage

The following command should suffice for most use cases:

pytest --prev-junit-xml junit.xml --job X/Y [tests...]

This groups the selected tests into Y jobs, using a balancing heuristic based on the previously recorded timing information in the JUnit XML file junit.xml, and then executes all tests in job X. Jobs are 1-indexed, so job 1 is the first job and job Y is the last one. In other words, executing:

pytest --prev-junit-xml junit.xml --job 1/Y [tests...]
pytest --prev-junit-xml junit.xml --job 2/Y [tests...]
...
pytest --prev-junit-xml junit.xml --job Y/Y [tests...]

will execute the same tests as:

pytest [tests...]

The plugin can be used without passing --prev-junit-xml. In this case, jobs are instead balanced by number of tests.

JUnit XML files

See the pytest documentation for more information on JUnit XML files. In short, a JUnit XML file can be obtained by executing a test suite with the --junitxml=junit.xml argument:

pytest --junitxml=junit.xml [tests...]

If a junit.xml is not provided to the pytest-job-selection plugin using --prev-junit-xml, then the balancing heuristic assumes that all test cases have the same running time and consequently attempts to create jobs with a balanced number of test cases.

If a test case is not present in the provided junit.xml, for instance because a new test case has been added since junit.xml was generated, then a default run time of 1 minute is assumed. Conversely, test cases present in junit.xml that are not selected for execution, for instance test cases that have been removed since junit.xml was generated, have no impact on balancing.

Consequently, junit.xml does not have to be up-to-date for each run, but as the test suite evolves, balancing will gradually degrade. It is therefore good practice to update the file at regular intervals.
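
Because balancing happens per test class or module (see Limitations below), it can be useful to check what per-class timings a given junit.xml yields. The following is a small sketch using only the standard library; it mirrors, approximately, the per-class weights shown by --jobs-dry-run, and the class_timings helper is illustrative rather than the plugin's own code:

# class_timings.py -- sketch: sum the recorded run time per test class from
# a junit.xml (illustrative, not the plugin's own code).
from collections import defaultdict
import xml.etree.ElementTree as ET

def class_timings(junit_path):
    weights = defaultdict(float)
    for case in ET.parse(junit_path).getroot().iter("testcase"):
        # classname is e.g. "example.example_test.TestExampleA"
        weights[case.get("classname")] += float(case.get("time", 0.0))
    return dict(weights)

if __name__ == "__main__":
    for name, seconds in sorted(class_timings("junit.xml").items(),
                                key=lambda item: item[1], reverse=True):
        print(f"{seconds:7.2f}s  {name}")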

Dry run

You can simulate a balancing run with the --jobs-dry-run flag (in combination with --job X/Y). This will collect and group tests, and then output:

  • The list of jobs with:

    • weight: its expected run time as per the previously recorded timing information
    • #classes: the number of test classes/modules in this job
  • The full list of test classes/modules, grouped by job, together with the weight of each test class/module (the sum of the running times of all test cases in that class, as per the previously recorded timing information).

  • Job statistics, including:

    • the total number of jobs;
    • the minimum, maximum and average job weight;
    • the minimum, maximum and average number of classes/modules per job.
  • The test classes/modules of the currently selected jobs and their weight.

  • A list of orphaned test classes. These are test classes that appear in the JUnit XML file supplied with --prev-junit-xml but do not correspond to any selected test. The presence of orphans indicates that the JUnit XML file may be out-of-date, but it does not impact balancing.

A Worked Example

We will use a simple dummy test example/example_test.py:

from time import sleep

class TestExampleA:
    def test_a(self):
        sleep(1)

class TestExampleB:
    def test_b(self):
        sleep(2)

class TestExampleC:
    def test_c(self):
        sleep(1)

class TestExampleD:
    def test_d(self):
        sleep(2)

This module contains four test classes that do nothing but sleep for a given period of time. We can first run these tests in two jobs without giving the job selection plugin any previous timings on which to base balancing:

$ pytest --job 1/2 example/example_test.py -v
example/example_test.py::TestExampleA::test_a PASSED [ 50%]
example/example_test.py::TestExampleC::test_c PASSED [100%]
$ pytest --job 2/2 example/example_test.py -v
example/example_test.py::TestExampleB::test_b PASSED [ 50%]
example/example_test.py::TestExampleD::test_d PASSED [100%]

This groups TestExampleA with TestExampleC and TestExampleB with TestExampleD. Each job contains an equal number of test classes, but the jobs are unbalanced in terms of run time: the first job will run in ~2 seconds while the second will run in ~4 seconds.

Note that you can also preview the balancing using the pytest flag --collect-only (here in addition to the --quiet flag for terse output):

$ pytest --job 1/2 example/example_test.py --collect-only --quiet
(job selection: 1/2 with 4 timings from junit.xml)
example/example_test.py::TestExampleA::test_a
example/example_test.py::TestExampleB::test_b

4 tests collected in 0.01s
$ pytest --job 2/2 example/example_test.py --collect-only --quiet
(job selection: 2/2 with 4 timings from junit.xml)
example/example_test.py::TestExampleC::test_c
example/example_test.py::TestExampleD::test_d

4 tests collected in 0.01s

To balance the jobs based on the expected runtime of individual test cases, we record a junit.xml file:

$ pytest --junitxml=junit.xml example/example_test.py

We can inspect junit.xml and verify that it contains the expected timings:

<?xml version="1.0" encoding="utf-8"?>
<testsuites>
  <testsuite name="pytest" errors="0" failures="0" skipped="0" tests="4" time="6.040">
    <testcase classname="example.example_test.TestExampleA" name="test_a" time="1.002"/>
    <testcase classname="example.example_test.TestExampleB" name="test_b" time="2.002"/>
    <testcase classname="example.example_test.TestExampleC" name="test_c" time="1.003"/>
    <testcase classname="example.example_test.TestExampleD" name="test_d" time="2.004"/>
  </testsuite>
</testsuites>

And then feed the recording into the plugin:

$ pytest --prev-junit-xml junit.xml --job 1/2 example/example_test.py -v
example/example_test.py::TestExampleA::test_a PASSED [ 50%]
example/example_test.py::TestExampleD::test_d PASSED [100%]
$ pytest --prev-junit-xml junit.xml --job 2/2 example/example_test.py -v
example/example_test.py::TestExampleB::test_b PASSED [ 50%]
example/example_test.py::TestExampleC::test_c PASSED [100%]

This time, TestExampleA is grouped with TestExampleD and TestExampleB is grouped with TestExampleC, giving both jobs an expected runtime of ~3 seconds each.

Finally, we can do a dry run with --jobs-dry-run to obtain an overview of the resulting balancing:

$ pytest --prev-junit-xml junit.xml --job 2/2 --jobs-dry-run example/example_test.py
collecting ... (job selection: 2/2 with 4 timings from junit.xml)
dry run
Jobs: weight and contents
job       weight   #classes
1        0:00:03          2
2        0:00:03          2

Jobs: weight and full contents
job                                  class    weight
1        example.example_test.TestExampleD   0:00:02
1        example.example_test.TestExampleA   0:00:01
2        example.example_test.TestExampleB   0:00:02
2        example.example_test.TestExampleC   0:00:01

Jobs: statistics
jobs_total        weight: avg       min       max   #classes: avg   min   max
jobs_total=2          0:00:03   0:00:03   0:00:03             2.0     2     2
Can add 0:00:00.001000 without increasing wall-time.

Slowest classes (top 10):
weight                                   class
0:00:02      example.example_test.TestExampleD
0:00:02      example.example_test.TestExampleB
0:00:01      example.example_test.TestExampleC
0:00:01      example.example_test.TestExampleA

Would run test classes in job 2/2:
class                                   weight
example.example_test.TestExampleB      0:00:02
example.example_test.TestExampleC      0:00:01

Balancing Heuristic

The balancing heuristic is based on a greedy solution to the knapsack problem. Each test that is missing a previously recorded run time provided through --prev-junit-xml is assigned a default run time of 1 minute. If no previously recorded run times are provided, this applies to all test cases, and consequently the heuristic balances jobs based only on the number of assigned test cases (i.e., it attempts to create jobs containing approximately the same number of test cases).
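
As a rough illustration, one common greedy strategy consistent with this description is to sort classes by weight in decreasing order and always assign the next class to the currently lightest job. The sketch below follows that strategy; it is illustrative only and not the plugin's actual implementation, and the balance helper and DEFAULT_WEIGHT constant are assumptions of this sketch (the 1-minute default mirrors the behaviour described above):

# balance.py -- sketch of a greedy balancing heuristic of the kind described
# above (illustrative; not the plugin's actual source).
import heapq

DEFAULT_WEIGHT = 60.0  # seconds; used for classes without a recorded timing

def balance(class_weights, n_jobs):
    """class_weights maps class name -> seconds (or None if unknown)."""
    jobs = [[] for _ in range(n_jobs)]
    # Min-heap of (current job weight, job index): the lightest job is popped first.
    heap = [(0.0, i) for i in range(n_jobs)]
    heapq.heapify(heap)
    resolved = {name: (w if w is not None else DEFAULT_WEIGHT)
                for name, w in class_weights.items()}
    # Heaviest classes first, each placed on the currently lightest job.
    for name in sorted(resolved, key=resolved.get, reverse=True):
        total, i = heapq.heappop(heap)
        jobs[i].append(name)
        heapq.heappush(heap, (total + resolved[name], i))
    return jobs

# With the worked example above (A=1s, B=2s, C=1s, D=2s) and two jobs, this
# yields two jobs of roughly 3 seconds each; the exact grouping may differ
# from the plugin's output.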

GitLab CI integration

This plugin can conveniently be used in GitLab CI to exploit parallel jobs.

For instance, to parallelize a job like:

pytest:
  script:
    - pytest tests/

Install the plugin in the CI, make sure that a junit.xml file is available in the repository (e.g. at tests/junit.xml), and change the job to:

pytest:
  parallel: 10
  script:
    - pytest --prev-junit-xml tests/junit.xml --job ${CI_NODE_INDEX}/${CI_NODE_TOTAL}
      "--junitxml=reports/report_${CI_NODE_INDEX}_${CI_NODE_TOTAL}.xml"
      tests/
  artifacts:
    paths:
      - reports
    when: always

This will split the pytest job into 10 parallel jobs. The --junitxml argument has also been added to the pytest command so that new JUnit XML recordings are produced in the CI and then stored as artifacts using the artifacts stanza.

This last point helps when rebalancing the suite, which you can now do by downloading the recordings from the CI, merging them and committing them to tests/junit.xml. In the next section, we describe the included script glci_jobs_fetch_reports that can be used to partially automate this process.

Retrieving JUnit XML files from GitLab CI

Updating the junit.xml file used for balancing becomes a hassle as the number of jobs grows. This plugin installs a script glci_jobs_fetch_reports (located in scripts/jobs_fetch_reports.py in this repository) that can be used for this purpose. For usage information, call glci_jobs_fetch_reports --help.

Limitations

Test classes

This plugin balances tests at the granularity of modules or test classes. All test cases of the same class will always execute in the same job. Similarly, all test cases in a module that do not correspond to class methods will execute in the same job.
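
To make the granularity concrete, the sketch below shows how a pytest node id could map to a balancing unit under these rules; the balancing_unit helper is purely illustrative and not the plugin's own code:

# Sketch: map a pytest node id to its balancing unit under the granularity
# described above (illustrative only).
def balancing_unit(nodeid: str) -> str:
    parts = nodeid.split("::")
    # "pkg/test_mod.py::TestClass::test_method" -> "pkg/test_mod.py::TestClass"
    # "pkg/test_mod.py::test_function"          -> "pkg/test_mod.py"
    return "::".join(parts[:2]) if len(parts) >= 3 else parts[0]

assert balancing_unit("example/example_test.py::TestExampleA::test_a") == \
    "example/example_test.py::TestExampleA"
assert balancing_unit("example/example_test.py::test_standalone") == \
    "example/example_test.py"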

Missing timings

If timings are missing for a test case, then the balancer will silently assume its running time is 1 minute.

Empty jobs

If there are more jobs than test classes or modules to balance, then at least one job will be empty. In this case, pytest will exit with a non-zero error code.

Contributing

Contributions are very welcome. Tests can be run with tox; please ensure that coverage at least stays the same before you submit a pull request.

License

Distributed under the terms of the MIT license, "pytest-job-selection" is free and open source software.

Issues

If you encounter any problems, please file an issue along with a detailed description.
