pytest-robotframework
a pytest plugin that can run both python and robotframework tests while generating robot reports for them
install
pytest should automatically find and activate the plugin once you install it.
pdm add pytest-robotframework --dev
features
write robot tests in python
```python
# you can use both robot and pytest features
from robot.api import logger
from pytest import Cache, mark
from pytest_robotframework import keyword

@keyword  # make this function show as a keyword in the robot log
def foo():
    ...

@mark.slow  # gets converted to robot tags
def test_foo(cache: Cache):
    foo()
```
run .robot tests
to allow for gradual adoption, the plugin also runs regular robot tests as well:
```robotframework
*** Settings ***
test setup    setup

*** Test Cases ***
bar
    [Tags]    asdf    key:value
    no operation

*** Keywords ***
setup
    log    ran setup
```
which is roughly equivalent to the following python code:
```python
# conftest.py
from robot.api import logger
from pytest_robotframework import keyword

def pytest_runtest_setup():
    foo()

@keyword
def foo():
    logger.info("ran setup")
```

```python
# test_foo.py
from pytest import mark

@mark.asdf
@mark.key("value")
def test_bar():
    ...
```
setup/teardown and other hooks
to define a function that runs for each test at setup or teardown, create a conftest.py with a `pytest_runtest_setup` and/or `pytest_runtest_teardown` function:
```python
# ./tests/conftest.py
def pytest_runtest_setup():
    log_in()
```

```python
# ./tests/test_suite.py
def test_something():
    """i am logged in now"""
```
these hooks appear in the log the same way that a .robot file's `Setup` and `Teardown` options in `*** Settings ***` would.
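a teardown hook follows the same pattern. here's a minimal sketch, using `print` as a stand-in for real login/logout helpers:

```python
# ./tests/conftest.py -- sketch of pairing setup with teardown
def pytest_runtest_setup():
    print("logging in")  # stand-in for a real log_in() helper

def pytest_runtest_teardown():
    print("logging out")  # runs after every test, like robot's Teardown
```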
for more information, see writing hook functions. pretty much every pytest hook should work with this plugin but i haven't tested them all. please raise an issue if you find one that's broken.
tags/markers
pytest markers are converted to tags in the robot log:
```python
from pytest import mark

@mark.slow
def test_blazingly_fast_sorting_algorithm():
    [1, 2, 3].sort()
```
markers like `skip`, `skipif` and `parametrize` also work how you'd expect:
```python
from pytest import mark

@mark.parametrize("test_input,expected", [(1, 8), (6, 6)])
def test_eval(test_input: int, expected: int):
    assert test_input == expected
```
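as a sketch of the other markers mentioned above, `skip` and `skipif` are applied the same way (the test names and reasons here are just illustrative):

```python
import sys

from pytest import mark

@mark.skip(reason="not implemented yet")
def test_upcoming_feature():
    ...

@mark.skipif(sys.platform == "win32", reason="posix only")
def test_posix_paths():
    ...
```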
listeners
you can define listeners in your conftest.py and decorate them with `@listener` to register them as global listeners:
```python
# conftest.py
from pytest_robotframework import listener
from robot import model, result
from robot.api.interfaces import ListenerV3
from typing_extensions import override

@listener
class Listener(ListenerV3):
    @override
    def start_test(self, data: model.TestCase, result: result.TestCase):
        ...
```
robot suite variables
to set suite-level robot variables, call the `set_variables` function at the top of the test suite:
```python
from robot.libraries.BuiltIn import BuiltIn
from pytest_robotframework import set_variables

set_variables(
    {
        "foo": "bar",
        "baz": ["a", "b"],
    }
)

def test_variables():
    assert BuiltIn().get_variable_value("$foo") == "bar"
```
`set_variables` is equivalent to the `*** Variables ***` section in a .robot file. all variables are prefixed with `$`. the `@` and `&` prefixes are not required, since `$` variables can store lists and dicts anyway.
config
since this is a pytest plugin, you should avoid using robot options that have pytest equivalents:
| instead of... | use... |
|---|---|
| `robot --include tag_name` | `pytest -m tag_name` |
| `robot --skip tag_name` | `pytest -m "not tag_name"` |
| `robot --test "test name" path/to/test.robot` | `pytest path/to/test.robot::"Test Name"` |
| `robot --listener Foo` | `@listener` decorator |
| `robot --dryrun` | `pytest --collect-only` (not exactly the same. you should use a type checker on your python tests as a replacement for robot dryrun) |
| `robot --exitonfailure` | `pytest --maxfail=1` |
| `robot --rerunfailed` | `pytest --lf` |
if the robot option you want to use isn't mentioned here, check the pytest command line options and ini options for a complete list of pytest settings as there are probably many missing from this list.
specifying robot options directly
there are multiple ways you can specify the robot arguments directly. however, arguments that have pytest equivalents should not be set with robot as they will probably cause the plugin to behave incorrectly.
`pytest_robot_modify_args` hook
you can specify a `pytest_robot_modify_args` hook in your conftest.py to programmatically modify the arguments:

```python
from pytest import Session

def pytest_robot_modify_args(args: list[str], collect_only: bool, session: Session) -> None:
    if not collect_only:
        args.extend(["--listener", "Foo"])
```
note that not all arguments that the plugin passes to robot will be present in the `args` list. arguments required for the plugin to function (eg. the plugin's listeners and prerunmodifiers) cannot be viewed or modified with this hook.
`--robotargs` pytest argument

```shell
pytest --robotargs="-d results --listener foo.Foo"
```
`ROBOT_OPTIONS` environment variable

```shell
ROBOT_OPTIONS="-d results --listener foo.Foo"
```
enabling pytest assertions in the robot log
by default, only failed assertions will appear in the log. to make passed assertions show up, you'll have to add `enable_assertion_pass_hook = true` to your pytest ini options:
```toml
# pyproject.toml
[tool.pytest.ini_options]
enable_assertion_pass_hook = true
```
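with the hook enabled, a passing assertion like the one in this sketch would show up in the robot log instead of only appearing when it fails:

```python
def test_sorting():
    result = sorted([3, 1, 2])
    # with enable_assertion_pass_hook = true, this passing assertion
    # is logged rather than silently succeeding
    assert result == [1, 2, 3]
```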
limitations with python tests
there are several limitations to using robotframework without its own language. this plugin includes workarounds for them:
making keywords show in the robot log
by default when writing tests in python, the only keywords that you'll see in the robot log are `Setup`, `Run Test` and `Teardown`. this is because robot is not capable of recognizing keywords called outside of robot code (see this issue).
this plugin has several workarounds for the problem:
`@keyword` decorator
if you want a function you wrote to show up as a keyword in the log, decorate it with `pytest_robotframework.keyword` instead of `robot.api.deco.keyword`:

```python
from pytest_robotframework import keyword

@keyword
def foo():
    ...
```
pytest functions are patched by the plugin
most of the pytest functions are patched so that they show as keywords in the robot log:

```python
import pytest
from robot.api import logger

def test_foo():
    with pytest.raises(ZeroDivisionError):
        logger.info(1 / 0)
```
patching third party functions with keywordify
if you want a function from a third party module/robot library to be displayed as a keyword, you can patch it with the `keywordify` function:
```python
# in your conftest.py
from pytest_robotframework import keywordify
import some_module

# patch a function from the module:
keywordify(some_module, "some_function")
# works on classes too:
keywordify(some_module.SomeClass, "some_method")
```
`run_keyword_and_continue_on_failure` doesn't continue after the failure
some robot keywords such as `run_keyword_and_continue_on_failure` don't work properly when called from python code. use a `try`/`except` statement to handle expected failures instead:
```python
try:
    some_keyword_that_fails()
except SomeException:
    ...  # ignore the exception, or re-raise it later
```
the keyword will still show as failed in the log (as long as it's decorated with `pytest_robotframework.keyword`), but it won't affect the status of the test unless the exception is re-raised.