py.test xdist plugin for distributed testing and loop-on-failing modes
The pytest-xdist plugin extends py.test with some unique test execution modes:
Before running tests remotely, py.test efficiently “rsyncs” your program source code to the remote place. All test results are reported back and displayed to your local terminal. You may specify different Python versions and interpreters.
Install the plugin with:
easy_install pytest-xdist # or pip install pytest-xdist
or use the package in develop/in-place mode with a checkout of the pytest-xdist repository:
python setup.py develop
To send tests to multiple CPUs, type:
py.test -n NUM
Especially for longer-running tests or tests requiring a lot of I/O this can lead to considerable speed-ups. This option can also be set to auto for automatic detection of the number of CPUs.
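The core detection behind -n auto can be pictured with the standard library. This is a rough sketch of the idea only; pytest-xdist's actual detection logic may differ:

```python
import multiprocessing

def auto_worker_count():
    # One worker per CPU core, which is what "-n auto" aims for.
    # Fall back to a single worker if the core count is unknown.
    try:
        return multiprocessing.cpu_count()
    except NotImplementedError:
        return 1
```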
If a test crashes the interpreter, pytest-xdist will automatically restart that slave and report the failure as usual. You can use the --max-slave-restart option to limit the number of slaves that can be restarted, or disable restarting altogether using --max-slave-restart=0.
To instantiate a python2.5 subprocess and send tests to it, you may type:
py.test -d --tx popen//python=python2.5
This will start a subprocess which is run with the “python2.5” Python interpreter, found in your system binary lookup path.
If you prefix the --tx option value like this:

--tx 3*popen//python=python2.5

then three subprocesses would be created and tests will be load-balanced across these three processes.
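Load balancing can be pictured as dealing pending tests out across the worker processes. The round-robin simulation below is a hypothetical sketch, not xdist's real scheduler (which reacts to workers finishing):

```python
from collections import deque

def load_balance(tests, num_workers):
    # Hypothetical sketch: hand pending tests out to workers one at a
    # time so no worker sits idle while tests remain.  xdist's real
    # scheduler is demand-driven; round-robin only approximates it.
    queues = [[] for _ in range(num_workers)]
    pending = deque(tests)
    i = 0
    while pending:
        queues[i % num_workers].append(pending.popleft())
        i += 1
    return queues
```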
If you have tests involving C or C++ libraries you might have to deal with tests crashing the process. For this case you may use the boxing options:

py.test --boxed

which will run each test in a subprocess and will report if a test crashed the process. You can also combine this option with running multiple processes to speed up the test run and use your CPU cores:
py.test -n3 --boxed
This would run 3 testing subprocesses in parallel, each of which creates a new boxed subprocess for every test.
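The boxing idea, running the test in a forked child so that a crash becomes a report instead of taking down the test run, can be sketched as follows. This is a POSIX-only illustration under assumed semantics, not xdist's implementation:

```python
import os

def run_boxed(test_fn):
    # Hypothetical helper (POSIX-only): fork, run the test in the
    # child, and translate the child's exit status into a result.
    pid = os.fork()
    if pid == 0:
        # Child: exit 0 on success, 1 on a Python-level failure.
        try:
            test_fn()
            os._exit(0)
        except Exception:
            os._exit(1)
    _, status = os.waitpid(pid, 0)
    if os.WIFSIGNALED(status):
        # The child died from a signal, e.g. a segfault in a C library.
        return "crashed (signal %d)" % os.WTERMSIG(status)
    return "passed" if os.WEXITSTATUS(status) == 0 else "failed"
```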
Suppose you have a package mypkg which contains some tests that you can successfully run locally. And you have an ssh-reachable machine myhost. Then you can ad-hoc distribute your tests by typing:
py.test -d --tx ssh=myhostpopen --rsyncdir mypkg mypkg
This will synchronize your mypkg package directory to a remote ssh account and then locally collect tests and send them to remote places for execution. You can specify multiple --rsyncdir directories to be sent to the remote side.
For py.test to collect and send tests correctly you not only need to make sure all code and test directories are rsynced, but also that any test (sub)directory has an __init__.py file, because internally py.test references tests as a fully qualified Python module path. You will otherwise get strange errors during setup of the remote side.
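The fully qualified module path is derived from the package structure, which is why a missing __init__.py silently shortens it. The derivation can be sketched like this (a hypothetical helper, not xdist's code):

```python
import os

def fully_qualified_name(path):
    # Walk upward from the test file, prepending package names for as
    # long as an __init__.py is present.  A directory without
    # __init__.py stops the walk, truncating the module path.
    module = os.path.splitext(os.path.basename(path))[0]
    parts = [module]
    d = os.path.dirname(os.path.abspath(path))
    while os.path.exists(os.path.join(d, "__init__.py")):
        parts.insert(0, os.path.basename(d))
        d = os.path.dirname(d)
    return ".".join(parts)
```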
You can specify multiple --rsyncignore glob patterns to be ignored when files are sent to the remote side.
There are also internal ignores: .*, *.pyc, *.pyo, *~. These you cannot override using the rsyncignore command-line or ini-file option.
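Both the internal and the user-supplied patterns are simple shell-style globs, so the filtering can be sketched with the standard fnmatch module. This is a hypothetical illustration, not xdist's implementation:

```python
import fnmatch

# The internal ignore patterns listed above.
DEFAULT_IGNORES = [".*", "*.pyc", "*.pyo", "*~"]

def should_sync(filename, extra_ignores=()):
    # Skip a file if any internal or user-supplied glob matches it.
    for pattern in list(DEFAULT_IGNORES) + list(extra_ignores):
        if fnmatch.fnmatch(filename, pattern):
            return False
    return True
```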
Download the single-module socketserver.py Python program and run it like this:

python socketserver.py
It will tell you that it starts listening on the default port. You can now on your home machine specify this new socket host with something like this:
py.test -d --tx socket=192.168.1.102:8888 --rsyncdir mypkg mypkg
The basic command to run tests on multiple platforms is:
py.test --dist=each --tx=spec1 --tx=spec2
If you specify a windows host, an OSX host and a Linux environment this command will send each test to all platforms - and report back failures from all platforms at once. The specification strings use the xspec syntax.
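The key difference from load balancing is that with --dist=each every test goes to every environment. Roughly, as a hypothetical sketch of the semantics:

```python
def dist_each(tests, environments):
    # Hypothetical sketch of --dist=each semantics: every environment
    # receives the full test list, unlike load balancing (--dist=load)
    # where each test runs exactly once overall.
    return {env: list(tests) for env in environments}
```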
If you need to determine the identity of a worker process in a test or fixture, you may use the worker_id fixture to do so:
@pytest.fixture()
def user_account(worker_id):
    """use a different account in each xdist worker"""
    return "account_%s" % worker_id
When xdist is disabled (running with -n0 for example), then worker_id will return "master".
Additionally, worker processes have the following environment variables defined:

PYTEST_XDIST_WORKER: the name of the worker, e.g. "gw2".
PYTEST_XDIST_WORKER_COUNT: the total number of workers in the session, e.g. "4" when -n 4 is given.
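A test or helper can read the worker name straight from the environment. A minimal sketch, assuming the PYTEST_XDIST_WORKER variable described above:

```python
import os

def current_worker():
    # Return the xdist worker name, or "master" when tests are not
    # being distributed and the variable is absent.
    return os.environ.get("PYTEST_XDIST_WORKER", "master")
```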
The worker id of a test is stored in the TestReport as well, under the worker_id attribute.
New in version 1.15.
pytest (since version 2.0) supports ini-style configuration. You can for example make running with three subprocesses your default like this:
[pytest]
addopts = -n3
You can also add default environments like this:
[pytest]
addopts = --tx ssh=myhost//python=python2.5 --tx ssh=myhost//python=python2.6
and then just type:

py.test --dist=each

to run tests in each of the environments.
In a tox.ini or setup.cfg file in your root project directory you may specify directories to include or to exclude in synchronisation:
[pytest]
rsyncdirs = . mypkg helperpkg
rsyncignore = .hg
These directory specifications are relative to the directory where the configuration file was found.
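That relative resolution can be sketched as follows (a hypothetical helper, not xdist's code):

```python
import os

def resolve_rsyncdirs(config_file, rsyncdirs):
    # Interpret each rsyncdirs/rsyncignore entry relative to the
    # directory containing the ini file.
    base = os.path.dirname(os.path.abspath(config_file))
    return [os.path.normpath(os.path.join(base, d)) for d in rsyncdirs]
```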
Please use the pytest issue tracker for reporting bugs in this plugin.