Lightweight testing helpers


coveo-testing

A set of test/pytest helpers to facilitate common routines.

Content in a nutshell:

  • Reusable pytest markers (UnitTest, Integration, Interactive)

  • Unique ID generation for tests

  • Multiline logging assertions with includes, excludes, levels and comprehensive assertion output

  • Refactorable unittest.mock.patch('this.module') module references

  • Human-readable (but still customizable) display for parametrized tests

This project is used as the test base for all other projects in this repository.

Therefore, it cannot depend on any of them.

More complex use cases may be implemented in the coveo-testing-extras project. That's also where you can depend on projects that depend on coveo-testing.

pytest markers and auto-registration

Markers are exposed as plain importable objects, which enables code completion in your IDE.

Three markers are already provided: UnitTest, Integration and Interactive.

Here's how you can create additional markers:

# /test_some_module/markers.py
import pytest

DockerTest = pytest.mark.docker_test
CloudTest = pytest.mark.cloud_test

ALL_MARKERS = [DockerTest, CloudTest]

You can then import these markers and decorate your test functions accordingly:

# /test_some_module/test_something.py
from coveo_testing.markers import UnitTest, Integration, Interactive
from test_some_module.markers import CloudTest, DockerTest

@UnitTest
def test_unit() -> None:
    ...  # designed to be fast and lightweight, most likely parametrized


@Integration
def test_integration() -> None:
    ...  # combines multiple features to achieve a test


@CloudTest
def test_in_the_cloud() -> None:
    ...  # this could be a post-deployment test, for instance.


@DockerTest
@Integration
def test_through_docker() -> None:
    ... # will run whenever docker tests or integration tests are requested


@Interactive
def test_interactive() -> None:
    ...  # these tests rely on eye validations, special developer setups, etc.
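
These markers can then be used to select which tests to run through pytest's standard -m option; for instance, with the marker names defined above:

pytest -m docker_test                    # only the tests marked with DockerTest
pytest -m "docker_test or cloud_test"    # tests carrying either marker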

Pytest will issue a warning when markers are not registered.

To register coveo-testing's markers along with your custom markers, use the provided register_markers function:

# /test_some_module/conftest.py
from _pytest.config import Config
from coveo_testing.markers import register_markers
from test_some_module.markers import ALL_MARKERS

def pytest_configure(config: Config) -> None:
    """This pytest hook is ran once, before collecting tests."""
    register_markers(config, *ALL_MARKERS)

Human-readable unique ID generation

The generated ID has this format:

friendly-name.timestamp.pid.host.executor.sequence

  • friendly-name:

    • provided by you, for your own benefit
  • timestamp:

    • format "%m%d%H%M%S" (month, day, hour, minutes, seconds)
    • computed once, when TestId is imported
  • pid:

    • the pid of the python process
  • host:

    • the network name of the machine
  • executor:

    • the content of the EXECUTOR_NUMBER environment variable
    • returns 'default' when not defined
    • historically, this variable comes from Jenkins
    • conceptually, it can be used to help distribute (and identify) tests and executors
  • sequence:

    • Thread-safe
    • Each friendly-name has an isolated sequence that starts at 0
    • Incremented on each new instance
    • Enables support for parallel parametrized tests

from coveo_testing.temporary_resource.unique_id import TestId, unique_test_id


# the friendly name is the only thing you need to specify
test_id = TestId('friendly-name')
str(test_id)
'friendly-name.0202152243.18836.WORKSTATION.default.0'


# you can pass the instance around to share the ID
str(test_id)
'friendly-name.0202152243.18836.WORKSTATION.default.0'


# create new instances to increment the sequence number
test_id = TestId('friendly-name')
str(test_id)
'friendly-name.0202152243.18836.WORKSTATION.default.1'


# use it in parallel parametrized tests
import pytest

@pytest.mark.parametrize('param', (True, False))
def test_param(param: bool, unique_test_id: TestId) -> None:
    # in this case, the friendly name is the function name and
    # the sequence will increase on each parameter
    # test_param.0202152243.18836.WORKSTATION.default.0
    # test_param.0202152243.18836.WORKSTATION.default.1
    ...

multiline logging assertions

Maybe pytest's caplog fixture is enough for your needs, or maybe you need more options. This tool matches strings using in and not in, so matching is case-sensitive.

import logging
from coveo_testing.logging import assert_logging

with assert_logging(
        logging.getLogger('logger-name'),
        present=['evidence1', 'evidence2'], 
        absent=[...], 
        level=logging.WARN):
    ...
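
For a more complete picture, here is a minimal sketch of a test built on the same call; the fetch function, the logger name and the asserted strings are assumptions made for illustration:

import logging

from coveo_testing.logging import assert_logging

logger = logging.getLogger('my-app')


def fetch(key: str) -> str:
    # hypothetical function under test: it warns when the key is not cached
    logger.warning('cache miss for %s', key)
    return f'value-for-{key}'


def test_fetch_warns_on_cache_miss() -> None:
    # strings in `present` must appear in the captured records (matched with `in`,
    # case-sensitive); strings in `absent` must not appear
    with assert_logging(
            logger,
            present=['cache miss'],
            absent=['Traceback'],
            level=logging.WARNING):
        assert fetch('user-42') == 'value-for-user-42'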

Human-readable (but still customizable) display for parametrized tests

If you're like me, you typed @pytest.mark.parametrize wrong a couple of times!

Enable IDE completion by using this one instead:

from coveo_testing.parametrize import parametrize

@parametrize('var', (True, False))
def test_var(var: bool) -> None:
    ...

It has a single difference from the pytest one: the way it formats the "parameter name" for each iteration of the test.

Pytest skips many types and simply names your test iterations "var0", "var1" and so on. With this @parametrize, the variable's content is inspected instead:

from typing import Any
from coveo_testing.parametrize import parametrize
import pytest


class StrMe:
    def __init__(self, var: Any) -> None:
        self.var = var

    def __str__(self) -> str:
        return f"Value: {self.var}"


@parametrize('var', [['list', 'display'], [StrMe('hello')]])
def test_param(var: Any) -> None:
    ...

@pytest.mark.parametrize('var', [['list', 'display'], [StrMe('hello')]])
def test_param_from_pytest(var: Any) -> None:
    ...

If you run pytest --collect-only you will obtain the following:

    <Function test_param[list-display]>
    <Function test_param[Value: hello]>
    <Function test_param_from_pytest[var0]>
    <Function test_param_from_pytest[var1]>
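
A nice side effect of readable IDs is that a single iteration can be targeted directly by its node ID, e.g. pytest "test_something.py::test_param[Value: hello]" (the file name here is an assumption; use your own test module).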

Refactorable mock targets

The ref tool has moved to its own package called coveo-ref.

Backward Compatibility

You can keep using ref from coveo-testing: the backward compatibility patch will not be deprecated.

Migration Guide

If you'd rather use the new package directly:

  • Add the coveo-ref dependency to your project
  • Replace from coveo_testing.mocks import ref with from coveo_ref import ref (see the snippet after this list)
  • Exceptions have been moved to coveo_ref.exceptions
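
In code, the migration boils down to swapping one import:

# before
from coveo_testing.mocks import ref

# after
from coveo_ref import ref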
