
Python test selection/execution/reporting tool

Project description

1. Installation

  • From GitHub

    • Clone this repo:
      git clone https://github.com/pjn2work/TestiPy.git
      
    • Change any settings you'd like inside default_config.py (this step is optional)
    • Install:
      # Windows
      install.bat
      
      # Linux or Mac
      ./install.sh
      
  • From PyPi

    pip install testipy
    

2. Running TestiPy

  • If you only cloned the repo and didn't install it

    # go to the folder where you cloned this repo
    cd /where_you_have_cloned_this_repo
    
    # run demo tests indicating where the tests are and using the web reporter
    python testipy/run.py -tf testipy/tests -r web
    
  • If you installed it

    # run demo tests indicating where the tests are and using the web reporter
    testipy -tf /where_you_have_cloned_this_repo/testipy/tests -r web
    

2.1. Test selection options:

Select tests by the tags coded into suites and tests (@TAG, @NAME, @LEVEL, @FEATURES)

Options (excludes take precedence over includes; a combined example follows this list):

  -ip         Include Package (ex: -ip qa.regression -ep qa.regression.dev), can be several
  -ep         Exclude Package
  -is         Include Suite (ex: -is suiteCertificates -is MICROSERVICES -es DEV), can be several
  -es         Exclude Suite
  -it         Include Test (ex: -it REST), can be several
  -et         Exclude Test
  -sb         filename of the StoryBoard to run (ex: -sb storyboard_QA_rest.json -sb "/qa/tests/sb/sb01.json"), can be several
  -ilv        Include tests of level y (ex: -ilv 5 -ilv 8)
  -elv        Exclude tests of level y
  -alv        Include tests above level y (ex: -alv 5)
  -blv        Include tests below level y
  -if         Include tests by @FEATURES tag (ex: -if 850222)
  -ef         Exclude tests by @FEATURES tag
  -itn        Include tests by @TN tag (beginsWith) (ex: -itn 1.3.1.10)
  -etn        Exclude tests by @TN tag (beginsWith)
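
For example, several of these options can be combined in one invocation (the package, suite, and tag names below are purely illustrative):

  # run REST-tagged tests of level 3 and above from the qa.regression package,
  # skipping everything in the DEV suite
  testipy -tf /qa/tests_scripts -r web -ip qa.regression -es DEV -it REST -alv 3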

2.2. Select Reporters

  • options:

    • -reporter or -r add Reporter (ex: -reporter echo -reporter log -reporter web)

      • echo: shows test execution on stdout, and errors in stderr
      • excel: creates an Excel file with test execution summary and test execution details
      • log: writes test execution to a .log file named after the project name; errors are shown in stderr
      • portalio: sends test results to a ReportPortal server (web REST reporter/DB)
      • slack: test results are sent to a Slack channel
      • web: test results can be viewed in real time in a browser; use -r-web-port PORT to specify a port other than the default 9204
      • xml: test results are saved to a report.xml file
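
    • For example, several reporters can be combined and the web reporter port overridden (the port number below is illustrative):

      # write a .log file, build an Excel summary, and serve live results on port 8080
      testipy -tf /qa/tests_scripts -r log -r excel -r web -r-web-port 8080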

2.3. Run:

  • options:

    • -rid RunID (ex: -rid 17; if not passed, the current hour and minute will be used, ex: 2359)
    • -pn ProjectName (ex: -pn jules)
    • -env EnvironmentName to test (ex: -env dev)
    • -rf ResultsFolder (ex: -rf "/qa/test_results/"), where the test results will be stored
    • -tf TestsFolder (ex: -tf "/qa/tests_scripts/jules/"), full path to where the tests are
    • -repeat Run the exact same pipeline that many times (ex: -repeat 3)
    • -st Suite Threads = 1..8 (ex: -st 4, meaning 4 suites can run in parallel)
  • flags

    • --dryrun All tests are collected and iterated but not actually executed (all of them end as SKIPPED)
    • --debugcode Disables the try/except on tests so errors are shown
    • --debug-testipy will show the stacktrace for testipy classes
    • --1 Override each test's ncycle definition (the number of times a test runs)
    • --prof Create file .prof with profiling data
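
  • For example, a dry run that collects the tests without executing them, repeating the pipeline 3 times with 4 suites in parallel (paths are illustrative):

    # every test ends as SKIPPED; 3 pipeline repetitions, 4 suites in parallel
    testipy -tf /qa/tests_scripts -r echo --dryrun -repeat 3 -st 4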

3. Example of usage:

  • Example:

    python3 run.py -env dev -reporter log -reporter web -rid 1 -tf "/home/testipy/my_test_scripts" -et NO_RUN -it DEV
    
  • Storyboard:

    • If a storyboard is passed, tests run in the order defined in the JSON file
    • If no storyboard is passed, tests run ordered (DESC) by package name, then by the @PRIO defined on the suite, then by the @PRIO defined on the test itself
  • Results Folder:

    • A folder will be created under the path given with -rf, named projectName_currentDate_RID (ex: testipy_20201231_00525)
    • Under that folder, subfolders package_name/suite_name can be created, containing the test results produced by each reporter
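    • Based on the bullets above, the layout could look roughly like this (project jules, run on 2020-12-31 with RID 00017; exact file names depend on the reporters used):

      /qa/test_results/jules_20201231_00017/
          jules.log                 # log reporter
          report.xml                # xml reporter
          package_name/
              suite_name/
                  ...               # per-test artifacts created by each reporter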
  • Tests not ended:

    • If a test finishes without being formally ended (via testFailed, testSkipped or testPassed), the executor will mark it as passed

4. Suite Example

from typing import Dict

from testipy.helpers.handle_assertions import ExpectedError
from testipy.reporter import ReportManager
# SuiteDetails (used in the test signatures below) is also provided by TestiPy;
# import it from the module your installed version exposes it in.

from pet_store_toolbox import Toolbox


_new_pet = {
    "id": 1,
    "name": "Sissi",
    "category": {
        "id": 1,
        "name": "Dogs"
    },
    "photoUrls": [""],
    "tags": [
        {
            "id": 0,
            "name": "Schnauzer"
        },
        {
            "id": 0,
            "name": "mini"
        }
    ],
    "status": "available"
}


class SuitePetStore:
    """
    @LEVEL 1
    @TAG PETSTORE
    @PRIO 2
    """

    def __init__(self):
        self.toolbox = Toolbox()

    # Create a new pet
    def test_create_pet_valid(self, sd: SuiteDetails, rm: ReportManager, ncycles=1, param=None):
        """
        @LEVEL 3
        @PRIO 5
        """
        current_test = rm.startTest(sd)

        data = {
            "control": {"expected_status_code": 200},
            "param": _new_pet,
            "expected_response": _new_pet
        }

        try:
            self.toolbox.post_pet(rm, current_test, data, "create_pet")
        except Exception as ex:
            rm.testFailed(current_test, reason_of_state=str(ex), exc_value=ex)
        else:
            rm.testPassed(current_test, reason_of_state="pet created")

    # Get the pet created before
    def test_get_pet_valid(self, sd: SuiteDetails, rm: ReportManager, ncycles=1, param=None):
        """
        @LEVEL 3
        @PRIO 10
        @ON_SUCCESS 5
        """
        current_test = rm.startTest(sd)

        data = {
            "control": {"expected_status_code": 200},
            "param": _new_pet["id"],
            "expected_response": _new_pet
        }

        try:
            self.toolbox.get_pet(rm, current_test, data, "get_pet")
        except Exception as ex:
            rm.testFailed(current_test, reason_of_state=str(ex), exc_value=ex)
        else:
            rm.testPassed(current_test, reason_of_state="pet fetched")
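
The suite imports a pet_store_toolbox helper that is not shown here. As a rough, hypothetical sketch (not part of TestiPy), such a Toolbox could wrap HTTP calls to a Petstore-style REST API and check the expected status code and body carried in the data dict:

import requests


class Toolbox:
    """Hypothetical helper for the suite above (illustrative only)."""

    BASE_URL = "https://petstore.example.com/v2"  # illustrative endpoint

    def post_pet(self, rm, current_test, data: dict, description: str):
        # rm and current_test are accepted to match the suite's calls;
        # a real toolbox would also log each step through the ReportManager
        response = requests.post(f"{self.BASE_URL}/pet", json=data["param"])
        self._check(data, response)

    def get_pet(self, rm, current_test, data: dict, description: str):
        # fetch a pet by id and verify the response
        response = requests.get(f"{self.BASE_URL}/pet/{data['param']}")
        self._check(data, response)

    @staticmethod
    def _check(data: dict, response):
        # compare against the "control" and "expected_response" entries of the data dict
        assert response.status_code == data["control"]["expected_status_code"], \
            f"expected {data['control']['expected_status_code']}, got {response.status_code}"
        if data.get("expected_response") is not None:
            assert response.json() == data["expected_response"]

With the suite saved under your tests folder, it could then be selected by its @TAG and run, for example (the tests folder path is illustrative):

testipy -tf /qa/tests_scripts/petstore -is PETSTORE -r web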
