
Pytest session-scoped fixture that works with xdist


Pytest Shared Session Fixture


Session scoped fixture that is shared between all workers in a pytest-xdist run.

from pytest_shared_session_scope import shared_session_scope_json, CleanupToken, SetupToken

@shared_session_scope_json()
def my_fixture():
    data = yield
    if data is SetupToken.FIRST:
        data = expensive_calculation()
    token: CleanupToken = yield data
    if token is CleanupToken.LAST:
        clean_up(data)

It differs from normal fixtures in a few ways:

  • If it yields, it must yield twice - once to optionally calculate the value, once to yield the value to the test
  • If it yields, a SetupToken or the already calculated data is sent back at the first yield. This can be used to determine whether the worker should do the calculation or whether it has already been done.
  • If it yields, a CleanupToken is sent back at the second yield. This can be used to determine whether the worker should do any cleanup.
  • The data needs to be serializable somehow. The default implementation uses the built-in json.dumps/json.loads, but custom serialization can be used.

If the fixture "just" returns a value, it also works without any modifications.
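
A minimal return-style fixture could look like this (expensive_calculation is just a placeholder for your own setup code):

from pytest_shared_session_scope import shared_session_scope_json

@shared_session_scope_json()
def my_simple_fixture():
    # Computed once; the JSON-serializable result is shared with all other workers.
    return expensive_calculation()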

Why?

This helps avoid one of the classic pytest pitfalls: session-scoped fixtures run once in each xdist worker, not once per test session. It's a special case of the more general pitfall of assuming that if something works without xdist, it will also work with xdist.

Why Not?

The double yield makes these fixtures different from normal pytest fixtures and can be confusing. The implementation is a bit hacky - we need to modify the signature of the decorated function to pass fixture values to the actual inner fixture. I'm also not entirely confident cleanup will work correctly in all cases.

Recipes

Non JSON serializable data

The default store uses json.dumps/json.loads, which cannot handle all objects. Instead of implementing a custom store for each fixture, you can use the serialize and deserialize arguments:

from pytest_shared_session_scope import shared_session_scope_json
from datetime import datetime

def serialize(value: datetime) -> str:
    return value.isoformat()

def deserialize(value: str) -> datetime:
    return datetime.fromisoformat(value)

@shared_session_scope_json(serialize=serialize, deserialize=deserialize)
def my_fixture_return():
    return datetime.now()

You might also want to parse the stored data into something else before returning it to the test, using the parse argument. This can be useful when you want to yield/return a non-serializable object to the test, but still need to store it in a serializable format:

import json
from typing import Self

from pytest_shared_session_scope import shared_session_scope_fixture, SetupToken
from pytest_shared_session_scope.store import FileStore

def deserialize(value: str) -> dict:
    return json.loads(value)

def serialize(value: dict) -> str:
    return json.dumps(value)

class Connection:
    def __init__(self, port: int):
        self.port = port

    @classmethod
    def from_dict(cls, data: dict) -> Self:
        return cls(**data)

@shared_session_scope_fixture(
    store=FileStore(),
    parse=Connection.from_dict,
    serialize=serialize,
    deserialize=deserialize,
)
def connection():
    data = yield
    if data is SetupToken.FIRST:
        data = {"port": 123}
    yield data

def test_connection(connection):
    assert connection.port == 123
    assert isinstance(connection, Connection)

The general rules are:

  • The fixture should yield sufficient information (data) to create the object you want to use in the test
  • The parse function should take that data and from it create the object you want to use in the test
  • The serialize function should take data and return a type that can be saved to the store
  • The deserialize function should take the serialized data and return the data you want to parse

In most cases, you don't have to care about this.

Implementing and using a custom store

The default store saves data as a string on the local filesystem. If you want to use a different store, you can implement your own. It needs to follow the protocol defined in pytest_shared_session_scope.types.Store. Mainly it needs to implement three methods (see the sketch after this list):

  • read to read the data from the store
  • write to write the data to the store
  • lock to lock the store to ensure no race conditions.
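
A store satisfying this protocol might look roughly like the sketch below. The exact signatures - especially for lock - are assumptions inferred from the Polars example further down, not the library's definitive API, and an in-memory dict is only used to keep the example short (a real store must be reachable by every worker, e.g. via the filesystem):

from contextlib import contextmanager
from typing import Any

from pytest_shared_session_scope.types import StoreValueNotExists


class DictStore:
    """Illustrates the Store shape only - a plain dict is not shared between xdist workers."""

    def __init__(self):
        self._data: dict[str, str] = {}

    def read(self, identifier: str, fixture_values: dict[str, Any]) -> str:
        try:
            return self._data[identifier]
        except KeyError:
            # Signal that no worker has written a value yet.
            raise StoreValueNotExists()

    def write(self, identifier: str, data: str, fixture_values: dict[str, Any]) -> None:
        self._data[identifier] = data

    @contextmanager
    def lock(self, identifier: str):
        # A real store would take a cross-process lock here (e.g. a file lock).
        yield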

Usually you want to store the data on the local filesystem. There's a mixin for that: LocalFileStoreMixin. It has a helper method _get_path that returns a path to a file in a temporary directory, so you only need to implement the read and write methods. The store should be passed to the shared_session_scope_fixture decorator, which shared_session_scope_json is just a wrapper around. Below is an example of a store that uses Polars to read and write parquet files.

from typing import Any
from pytest_shared_session_scope import shared_session_scope_fixture, SetupToken
import polars as pl

from pytest_shared_session_scope.store import LocalFileStoreMixin
from pytest_shared_session_scope.types import StoreValueNotExists


class PolarsStore(LocalFileStoreMixin):
    def read(self, identifier: str, fixture_values: dict[str, Any]) -> pl.DataFrame:
        path = self._get_path(identifier, fixture_values["tmp_path_factory"])
        try:
            return pl.read_parquet(path)
        except FileNotFoundError:
            raise StoreValueNotExists()

    def write(self, identifier: str, data: pl.DataFrame, fixture_values: dict[str, Any]):
        path = self._get_path(identifier, fixture_values["tmp_path_factory"])
        data.write_parquet(path)


@shared_session_scope_fixture(PolarsStore())
def my_fixture():
    data = yield
    if data is SetupToken.FIRST:
        data = pl.DataFrame({"a": [1, 2, 3]})
    yield data

Attentive readers will notice that this could also be achieved with the default FileStore, or even with shared_session_scope_json, by writing clever serialization and deserialization functions. However, here it's probably simpler to just use a custom store. Implementing this store with deserialize, serialize and parse is left as an exercise for the reader.

Returning functions

It's a common pattern to return functions from fixtures - for example to register data needed for cleanup. Since functions can't be serialized and shared between workers, use two fixtures instead - a shared one to calculate the data and a normal one to use it. But remember that the second fixture runs in each worker, so it won't cover all cases.

import pytest
from pytest_shared_session_scope import shared_session_scope_json

@shared_session_scope_json()
def important_ids():
    return [1,2,3]

@pytest.fixture
def cleanup_important_ids(important_ids):
    ids_to_cleanup = []

    def use_id(id_):
        if id_ not in important_ids:
            raise ValueError(f"{id_} not in important_ids!")
        ids_to_cleanup.append(id_)

    yield use_id
    for id_ in ids_to_cleanup:
        print(f"Cleaning up {id_}")

def test_thing_with_ids(important_ids, cleanup_important_ids):
    for id_ in important_ids:
        # assert thing
        cleanup_important_ids(id_)

Using with cache

Pytest has a built-in cache that can be used to store data between runs. This can be useful to avoid recalculating data on every run:

from pytest_shared_session_scope import shared_session_scope_json, SetupToken

@shared_session_scope_json()
def my_fixture(pytestconfig):
    data = yield
    if data is SetupToken.FIRST:
        data = pytestconfig.cache.get("example/value", None)
        if data is None:
            data = {"hey": "data"}
            pytestconfig.cache.set("example/value", data)
    yield data


def test(my_fixture):
    assert my_fixture == {"hey": "data"}

How?

The decorator is a generalization of the guide in the pytest-xdist docs on making session-scoped fixtures execute only once, with the added feature of being able to run cleanup code in the last worker to finish. To summarize: the first worker to request the fixture calculates the value and then persists it in a Store; the other workers load the data from that Store. If the Store needs access to other fixtures (say, tmp_path_factory), we modify the signature of the actual wrapped fixture to include those fixtures.
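
For reference, the pattern from the pytest-xdist docs that this generalizes looks roughly like the sketch below (expensive_calculation is a placeholder for your own setup code; the decorator handles the locking, storing and cleanup tracking for you):

import json

import pytest
from filelock import FileLock

@pytest.fixture(scope="session")
def session_data(tmp_path_factory, worker_id):
    if worker_id == "master":
        # Not running under xdist: just compute the value directly.
        return expensive_calculation()

    # The parent of the per-worker base temp dir is shared by all workers.
    root_tmp_dir = tmp_path_factory.getbasetemp().parent
    fn = root_tmp_dir / "data.json"
    with FileLock(str(fn) + ".lock"):
        if fn.is_file():
            data = json.loads(fn.read_text())
        else:
            data = expensive_calculation()
            fn.write_text(json.dumps(data))
    return data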

To determine which worker is the last to finish, we keep a running count of which tests have been run in each worker (using the pytest_runtest_protocol hook and config.stash). This information is then yielded back to the worker.
