
pytest-robotframework

pytest-robotframework is a pytest plugin that generates robotframework reports for tests written in python and allows you to run robotframework tests using pytest

install


pytest should automatically find and activate the plugin once you install it.
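the package is available on PyPI under the same name as the project, so a standard install looks like:

```shell
pip install pytest-robotframework
```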

API documentation

click here

features

write robot tests in python

# you can use both robot and pytest features
from robot.api import logger
from pytest import mark

from pytest_robotframework import keyword

@keyword  # make this function show as a keyword in the robot log
def foo():
    ...

@mark.slow  # markers get converted to robot tags
def test_foo():
    foo()

run .robot tests

to allow for gradual adoption, the plugin can also run regular robot tests:

*** Settings ***
test setup  foo

*** Test Cases ***
bar
    [Tags]  asdf  key:value
    no operation

*** Keywords ***
foo
    log  ran setup

which is roughly equivalent to the following python code:

# conftest.py
from robot.api import logger
from pytest_robotframework import keyword

def pytest_runtest_setup():
    foo()

@keyword
def foo():
    logger.info("ran setup")
# test_foo.py
from pytest import mark

@mark.asdf
@mark.key("value")
def test_bar():
    ...

setup/teardown and other hooks

to define a function that runs for each test at setup or teardown, create a conftest.py with a pytest_runtest_setup and/or pytest_runtest_teardown function:

# ./tests/conftest.py
def pytest_runtest_setup():
    log_in()
# ./tests/test_suite.py
def test_something():
    """i am logged in now"""

these hooks appear in the log the same way that a .robot file's Setup and Teardown options in *** Settings *** would.

for more information, see writing hook functions. pretty much every pytest hook should work with this plugin but i haven't tested them all. please raise an issue if you find one that's broken.

tags/markers

pytest markers are converted to tags in the robot log:

from pytest import mark

@mark.slow
def test_blazingly_fast_sorting_algorithm():
    [1,2,3].sort()

markers like skip, skipif and parametrize also work how you'd expect:

from pytest import mark

@mark.parametrize("test_input,expected", [(1, 8), (6, 6)])
def test_eval(test_input: int, expected: int):
    assert test_input == expected
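skip and skipif behave the same as in plain pytest. a minimal sketch (the platform condition is just an example):

```python
import sys

from pytest import mark

@mark.skip(reason="not implemented yet")
def test_todo():
    ...

@mark.skipif(sys.platform == "win32", reason="needs a posix shell")
def test_posix_only():
    ...
```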


listeners and suite visitors

listeners

you can define listeners in your conftest.py and decorate them with @listener to register them as global listeners:

# conftest.py
from pytest_robotframework import listener
from robot import model, result
from robot.api.interfaces import ListenerV3
from typing_extensions import override

@listener
class Listener(ListenerV3):
    @override
    def start_test(self, data: model.TestCase, result: result.TestCase):
        ...

or if your listener takes arguments in its constructor, you can call it on the instance instead:

# conftest.py
def pytest_sessionstart():
    listener(Listener("foo"))

pre-rebot modifiers

just like listeners, you can define pre-rebot modifiers using the pre_rebot_modifier decorator:

# conftest.py

from pytest_robotframework import pre_rebot_modifier
from robot import model
from robot.api import SuiteVisitor
from robot.utils.misc import printable_name
from typing_extensions import override


@pre_rebot_modifier
class PytestNameChanger(SuiteVisitor):
    """makes pytest test names look like robot test names (eg. `test_do_thing` -> `Do Thing`)"""

    @override
    def start_test(self, test: model.TestCase):
        pytest_prefix = "test_"
        if test.name.startswith(pytest_prefix):
            test.name = printable_name(
                test.name.removeprefix(pytest_prefix), code_style=True
            )

or on an instance:

# conftest.py
def pytest_sessionstart():
    pre_rebot_modifier(PytestNameChanger())

pre-run modifiers

there is currently no decorator for pre-run modifiers, since they may interfere with the pytest plugin. if you know what you're doing and would like to use a pre-run modifier anyway, you can always define it in the robot arguments.

robot suite variables

to set suite-level robot variables, call the set_variables function at the top of the test suite:

from robot.libraries.BuiltIn import BuiltIn
from pytest_robotframework import set_variables

set_variables(
    {
        "foo": "bar",
        "baz": ["a", "b"],
    }
)

def test_variables():
    assert BuiltIn().get_variable_value("$foo") == "bar"

set_variables is equivalent to the *** Variables *** section in a .robot file. all variables are prefixed with $. @ and & are not required, since $ variables can store lists and dicts anyway.

config

since this is a pytest plugin, you should avoid using robot options that have pytest equivalents:

instead of...                                use...
robot --include tag_name                     pytest -m tag_name
robot --skip tag_name                        pytest -m "not tag_name"
robot --test "test name" path/to/test.robot  pytest path/to/test.robot::"Test Name"
robot --listener Foo                         the @listener decorator
robot --prerebotmodifier Foo                 the @pre_rebot_modifier decorator
robot --dryrun                               pytest --collect-only (not an exact equivalent; use a type checker on your python tests as a replacement for robot's dry run)
robot --exitonfailure                        pytest --maxfail=1
robot --rerunfailed                          pytest --lf

if the robot option you want to use isn't mentioned here, check the pytest command line options and ini options for a complete list of pytest settings as there are probably many missing from this list.

specifying robot options directly

there are multiple ways you can specify the robot arguments directly. however, arguments that have pytest equivalents should not be set with robot as they will probably cause the plugin to behave incorrectly.

pytest_robot_modify_args hook

you can specify a pytest_robot_modify_args hook in your conftest.py to programmatically modify the arguments

from pytest import Session

def pytest_robot_modify_args(args: list[str], collect_only: bool, session: Session) -> None:
    if not collect_only:
        args.extend(["--listener", "Foo"])

note that not all of the arguments the plugin passes to robot will appear in the args list. arguments required for the plugin to function (eg. its listeners and prerunmodifiers) cannot be viewed or modified with this hook.

--robotargs pytest argument

pytest --robotargs="-d results --listener foo.Foo"

ROBOT_OPTIONS environment variable

ROBOT_OPTIONS="-d results --listener foo.Foo"

enabling pytest assertions in the robot log

by default, only failed assertions will appear in the log. to make passed assertions show up, you'll have to add enable_assertion_pass_hook = true to your pytest ini options:

# pyproject.toml
[tool.pytest.ini_options]
enable_assertion_pass_hook = true
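or, if you use pytest.ini instead of pyproject.toml, the equivalent would be:

```ini
# pytest.ini
[pytest]
enable_assertion_pass_hook = true
```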


limitations with python tests

there are several limitations to using robotframework without its own language. this plugin includes workarounds for them:

making keywords show in the robot log

by default when writing tests in python, the only keywords that you'll see in the robot log are Setup, Run Test and Teardown. this is because robot is not capable of recognizing keywords called outside of robot code. (see this issue)

this plugin has several workarounds for the problem:

@keyword decorator

if you want a function you wrote to show up as a keyword in the log, decorate it with the pytest_robotframework.keyword decorator instead of robot.api.deco.keyword

from pytest_robotframework import keyword

@keyword
def foo():
    ...

pytest functions are patched by the plugin

most of the pytest functions are patched so that they show as keywords in the robot log

import pytest
from robot.api import logger

def test_foo():
    with pytest.raises(ZeroDivisionError):
        logger.info(1 / 0)


patching third party functions with keywordify

if you want a function from a third party module/robot library to be displayed as a keyword, you can patch it with the keywordify function:

# in your conftest.py

from pytest_robotframework import keywordify
import some_module

# patch a function from the module:
keywordify(some_module, "some_function")
# works on classes too:
keywordify(some_module.SomeClass, "some_method")

run_keyword_and_continue_on_failure doesn't continue after the failure

some robot keywords such as run_keyword_and_continue_on_failure don't work properly when called from python code.

use a try/except statement to handle expected failures instead:

try:
    some_keyword_that_fails()
except SomeException:
    ... # ignore the exception, or re-raise it later

the keyword will still show as failed in the log (as long as it's decorated with pytest_robotframework.keyword), but it won't affect the status of the test unless the exception is re-raised.

IDE integration

vscode

vscode's builtin python extension should discover both your python and robot tests by default, and show run buttons next to them.

if you use the robotframework-lsp extension, you'll see duplicated tests on .robot files because you have two extensions that can discover them. to fix this, set robot.testView.enabled to false in vscode's settings.

(note: at the time of writing, this option is not yet in the latest version of the extension, so for now you'll need to install this build)

pycharm

pycharm currently does not support pytest plugins for non-python files. see this issue

Project details

Source Distribution: pytest_robotframework-2.3.1.tar.gz (45.4 kB)

Built Distribution: pytest_robotframework-2.3.1-py3-none-any.whl (27.9 kB)
