pytest xdist plugin for distributed testing and loop-on-failing modes
xdist: pytest distributed testing plugin
The pytest-xdist plugin extends pytest with some unique test execution modes:
test run parallelization: if you have multiple CPUs or hosts, you can use them for a combined test run. This allows you to speed up development or to use the special resources of remote machines.
--looponfail: run your tests repeatedly in a subprocess. After each run pytest waits until a file in your project changes and then re-runs the previously failing tests. This is repeated until all tests pass, after which a full run is performed again.
Multi-Platform coverage: you can specify different Python interpreters or different platforms and run tests in parallel on all of them.
Before running tests remotely, pytest efficiently “rsyncs” your program source code to the remote location. All test results are reported back and displayed in your local terminal. You may specify different Python versions and interpreters.
If you would like to know how pytest-xdist works under the covers, check out OVERVIEW.
Installation
Install the plugin with:
pip install pytest-xdist
or use the package in develop/in-place mode with a checkout of the pytest-xdist repository:
pip install --editable .
Speed up test runs by sending tests to multiple CPUs
To send tests to multiple CPUs, type:
pytest -n NUM
Especially for longer-running tests or tests requiring a lot of I/O, this can lead to considerable speed-ups. This option can also be set to auto for automatic detection of the number of CPUs.
If a test crashes the interpreter, pytest-xdist will automatically restart that worker and report the failure as usual. You can use the --max-worker-restart option to limit the number of workers that can be restarted, or disable restarting altogether using --max-worker-restart=0.
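For example (the limit of two is an arbitrary choice for illustration), the following detects the number of CPUs automatically and allows at most two crashed workers to be restarted:

pytest -n auto --max-worker-restart=2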
By default, the -n option will send pending tests to any worker that is available, without any guaranteed order, but you can control this with these options:
--dist=loadscope: tests will be grouped by module for test functions and by class for test methods, then each group will be sent to an available worker, guaranteeing that all tests in a group run in the same process. This can be useful if you have expensive module-level or class-level fixtures (see the sketch after this list). Currently the groupings can’t be customized, and grouping by class takes priority over grouping by module. This feature was added in version 1.19.
--dist=loadfile: tests will be grouped by file name, and then will be sent to an available worker, guaranteeing that all tests in a group run in the same worker. This feature was added in version 1.21.
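As a rough illustration of when loadscope helps, consider a module whose tests share an expensive module-scoped fixture; the in-memory sqlite database below merely stands in for such a resource:

import sqlite3

import pytest


@pytest.fixture(scope="module")
def database():
    # Stands in for an expensive per-module resource that should be built once.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE items (name TEXT)")
    yield conn
    conn.close()


def test_insert(database):
    database.execute("INSERT INTO items VALUES ('a')")
    assert database.execute("SELECT COUNT(*) FROM items").fetchone()[0] == 1


def test_schema(database):
    # Under loadscope, this test runs in the same worker as test_insert,
    # so the fixture above is created only once in that process.
    assert database.execute("SELECT COUNT(*) FROM items").fetchone() is not None

With --dist=loadscope both tests land in the same worker and the fixture is built only once there; with the default distribution they could be sent to different workers, each building its own copy.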
Running tests in a Python subprocess
To instantiate a python3.5 subprocess and send tests to it, you may type:
pytest -d --tx popen//python=python3.5
This will start a subprocess which is run with the python3.5 Python interpreter, found in your system binary lookup path.
If you prefix the --tx option value like this:
--tx 3*popen//python=python3.5
then three subprocesses will be created and tests will be load-balanced across them.
Running tests in a boxed subprocess
This functionality has been moved to the pytest-forked plugin, but the --boxed option is still kept for backward compatibility.
Sending tests to remote SSH accounts
Suppose you have a package mypkg which contains some tests that you can successfully run locally, and you have an ssh-reachable machine myhost. You can then distribute your tests ad hoc by typing:
pytest -d --tx ssh=myhostpopen --rsyncdir mypkg mypkg
This will synchronize your mypkg package directory to a remote ssh account, then collect tests locally and send them to the remote side for execution.

You can specify multiple --rsyncdir directories to be sent to the remote side.

You can specify multiple --rsyncignore glob patterns to be ignored when files are sent to the remote side.
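For example (the host spec and glob pattern below are placeholders for illustration):

pytest -d --tx ssh=myhostpopen --rsyncdir mypkg --rsyncignore "*.log" mypkg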
There are also internal ignores: .*, *.pyc, *.pyo, *~. These cannot be overridden with the rsyncignore command-line or ini-file option(s).
Sending tests to remote Socket Servers
Download the single-module socketserver.py Python program and run it like this:
python socketserver.py
It will tell you that it starts listening on the default port. You can now, on your home machine, specify this new socket host with something like this:
pytest -d --tx socket=192.168.1.102:8888 --rsyncdir mypkg mypkg
Running tests on many platforms at once
The basic command to run tests on multiple platforms is:
pytest --dist=each --tx=spec1 --tx=spec2
If you specify a Windows host, an OSX host, and a Linux environment, this command will send each test to all platforms and report back failures from all of them at once. The specification strings use the xspec syntax.
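For example, with placeholder host names, the following would run every test both in a local python3.6 subprocess and on a remote Linux machine reached over ssh:

pytest --dist=each --tx popen//python=python3.6 --tx ssh=linuxhost//python=python3.6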
Identifying the worker process during a test
New in version 1.15.
If you need to determine the identity of a worker process in a test or fixture, you may use the worker_id fixture to do so:
import pytest


@pytest.fixture()
def user_account(worker_id):
    """Use a different account in each xdist worker."""
    return "account_%s" % worker_id
When xdist is disabled (running with -n0, for example), worker_id will return "master".
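A test consuming this fixture then sees a per-worker value such as "account_gw0", or "account_master" when xdist is disabled (the assertion below is purely illustrative):

def test_login(user_account):
    assert user_account.startswith("account_")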
Additionally, worker processes have the following environment variables defined:
PYTEST_XDIST_WORKER: the name of the worker, e.g., "gw2".
PYTEST_XDIST_WORKER_COUNT: the total number of workers in this session, e.g., "4" when -n 4 is given in the command-line.
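As a minimal sketch, a session-scoped fixture could use these variables to give each worker its own scratch directory (the naming scheme is an assumption of this example):

import os

import pytest


@pytest.fixture(scope="session")
def scratch_dir(tmp_path_factory):
    # The variable is only set in xdist worker processes, so fall back to "master".
    worker = os.environ.get("PYTEST_XDIST_WORKER", "master")
    return tmp_path_factory.mktemp("scratch_%s" % worker)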
The worker_id is also stored in the test's TestReport, under the worker_id attribute.
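For instance, a conftest.py hook could record which worker ran each test; this is a sketch, using getattr with a default in case the attribute is not present:

def pytest_runtest_logreport(report):
    if report.when == "call":
        worker = getattr(report, "worker_id", "master")
        print("%s ran on %s" % (report.nodeid, worker))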
Accessing sys.argv from the master node in workers
To access the sys.argv passed to the command-line of the master node, use request.config.workerinput["mainargv"].
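A small sketch of using this, falling back to the local sys.argv when the config has no workerinput (i.e. when not running under xdist):

import sys

import pytest


@pytest.fixture
def main_argv(request):
    # On a worker node, workerinput carries data passed down from the master.
    workerinput = getattr(request.config, "workerinput", None)
    if workerinput is not None:
        return workerinput["mainargv"]
    return sys.argv  # no xdist: this process is the master itself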
Specifying test exec environments in an ini file
You can use pytest’s ini file configuration to avoid typing common options. For example, you can make running with three subprocesses your default like this:
[pytest]
addopts = -n3
You can also add default environments like this:
[pytest]
addopts = --tx ssh=myhost//python=python3.5 --tx ssh=myhost//python=python3.6
and then just type:
pytest --dist=each
to run tests in each of the environments.
Specifying “rsync” dirs in an ini-file
In a tox.ini or setup.cfg file in your root project directory you may specify directories to include in, or to exclude from, synchronisation:
[pytest]
rsyncdirs = . mypkg helperpkg
rsyncignore = .hg
These directory specifications are relative to the directory where the configuration file was found.