Python test selection/execution/reporting tool

Project description

1. Installation

  • From GitHub

    • Clone this repo:
      git clone https://github.com/pjn2work/TestiPy.git
      
    • Change any settings you'd like inside default_config.py (this step is optional)
    • Install:
      # Windows
      install.bat
      
      # Linux or Mac
      ./install.sh
      
  • From PyPI

    pip install testipy
    

2. Running TestiPy

  • If you didn't install it and only cloned the repo

    # go to the folder where you cloned this repo
    cd /where_you_have_cloned_this_repo
    
    # run demo tests indicating where the tests are and using the web reporter
    python testipy/run.py -tf testipy/tests -r web
    
  • If you installed it

    # run demo tests indicating where the tests are and using the web reporter
    testipy -tf /where_you_have_cloned_this_repo/testipy/tests -r web
    

2.1. Test selection options:

Select tests by the tags coded in suite and test docstrings (@TAG, @NAME, @LEVEL, @FEATURES)

options (excludes take precedence over includes):

  -ip         Include Package (ex: -ip qa.regression -ep qa.regression.dev), can be several
  -ep         Exclude Package
  -is         Include Suite (ex: -is suiteCertificates -is MICROSERVICES -es DEV), can be several
  -es         Exclude Suite
  -it         Include Test (ex: -it REST), can be several
  -et         Exclude Test
  -sb         filename of the StoryBoard to run (ex: -sb storyboard_QA_rest.json -sb "/qa/tests/sb/sb01.json"), can be several
  -ilv        Include tests of level y (ex: -ilv 5 -ilv 8)
  -elv        Exclude tests of level y
  -alv        Include tests above level y (ex: -alv 5)
  -blv        Include tests below level y
  -if         Include tests by @FEATURES tag (ex: -if 850222)
  -ef         Exclude tests by @FEATURES tag
  -itn        Include tests by @TN tag (beginsWith) (ex: -itn 1.3.1.10)
  -etn        Exclude tests by @TN tag (beginsWith)
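
For example, a single run that includes the qa.regression package, keeps only the MICROSERVICES suites, and excludes any test tagged NO_RUN could combine these options as follows (the folder path, package, suite and tag names are placeholders):

    testipy -tf /qa/tests_scripts -r log -ip qa.regression -is MICROSERVICES -et NO_RUN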

2.2. Select Reporters

  • options:

    • -reporter or -r Add a reporter (can be passed several times; ex: -reporter echo -reporter log -reporter web)

      • echo: shows test execution on stdout; errors go to stderr
      • excel: creates an Excel file with a test execution summary and test execution details
      • log: writes test execution to a .log file named after the project; errors go to stderr
      • portalio: sends test results to ReportPortal (reportportal.io) via its REST API
      • slack: test results are sent to a Slack channel
      • web: test results can be watched in real time in a browser
      • xml: test results are saved to a report.xml file
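
For example, to get a log file, an Excel summary and a report.xml from the same run (the tests folder path is a placeholder):

    testipy -tf /qa/tests_scripts -r log -r excel -r xml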

2.3. Run:

  • options:

    • -rid RunID (ex: -rid 17; if not passed, the current hour and minute will be used, ex: 2359)
    • -pn ProjectName (ex: -pn jules)
    • -env EnvironmentName to test (ex: -env dev)
    • -rf ResultsFolder (ex: -rf "/qa/test_results/"), where the test results will be stored
    • -tf TestsFolder (ex: -tf "/qa/tests_scripts/jules/"), full path to where the tests are
    • -repeat Run the same pipeline that many times (ex: -repeat 3)
    • -st Suite Threads = 1..8 (ex: -st 4, meaning 4 suites can run in parallel)
  • flags

    • --dryrun All tests go through the runner but are not really executed (all of them will end as SKIPPED)
    • --debugcode Disables the try/except around tests so errors are shown
    • --debug-testipy Shows the stacktrace for TestiPy classes
    • --1 Overrides the test-defined number of runs (ncycles)
    • --prof Create file .prof with profiling data
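
For example, a dry run of the whole project with 4 suite threads, repeated 3 times (project name, environment and path are placeholders):

    testipy -pn jules -env dev -tf /qa/tests_scripts -r echo -st 4 -repeat 3 --dryrun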

3. Example of usage:

  • Command line:

    python3 run.py -env dev -reporter log -reporter web -rid 1 -tf "/home/testipy/my_test_scripts" -et NO_RUN -it DEV
    
  • Storyboard:

    • If a storyboard is passed, tests will run in the order defined in the JSON file
    • If no storyboard is passed, tests will run ordered (DESC) by package name, then by @PRIO defined on the suite, then by @PRIO defined on the test itself
  • Results Folder:

    • A folder will be created under the path given by the -rf option, named projectName_currentDate_RID (ex: testipy_20201231_00525)
    • Under that folder, subfolders can be created as package_name/suite_name, containing the test results created by each reporter (see the sketch after this list)
  • Tests not ended:

    • If a test ends without being formally ended (by testFailed, testSkipped or testPassed), the executor will mark it as passed
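
As a sketch of the results layout described above (the base folder comes from the -rf option, and the exact files depend on the reporters chosen):

    /qa/test_results/testipy_20201231_00525/
        package_name/
            suite_name/
                ... results written by each reporter ...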

4. Suite Example

from typing import Dict

from testipy.helpers.handle_assertions import ExpectedError
from testipy.models import SuiteDetails  # import path assumed; SuiteDetails is used in the test signatures below
from testipy.reporter import ReportManager

from pet_store_toolbox import Toolbox


_new_pet = {
    "id": 1,
    "name": "Sissi",
    "category": {
        "id": 1,
        "name": "Dogs"
    },
    "photoUrls": [""],
    "tags": [
        {
            "id": 0,
            "name": "Schnauzer"
        },
        {
            "id": 0,
            "name": "mini"
        }
    ],
    "status": "available"
}


class SuitePetStore:
    """
    @LEVEL 1
    @TAG PETSTORE
    @PRIO 2
    """

    def __init__(self):
        self.toolbox = Toolbox()

    # Create a new pet
    def test_create_pet_valid(self, sd: SuiteDetails, rm: ReportManager, ncycles=1, param=None):
        """
        @LEVEL 3
        @PRIO 5
        """
        current_test = rm.startTest(sd)

        data = {
            "control": {"expected_status_code": 200},
            "param": _new_pet,
            "expected_response": _new_pet
        }

        try:
            self.toolbox.post_pet(rm, current_test, data, "create_pet")
        except Exception as ex:
            rm.testFailed(current_test, reason_of_state=str(ex), exc_value=ex)
        else:
            rm.testPassed(current_test, reason_of_state="pet created")

    # Get the pet created before
    def test_get_pet_valid(self, sd: SuiteDetails, rm: ReportManager, ncycles=1, param=None):
        """
        @LEVEL 3
        @PRIO 10
        @ON_SUCCESS 5
        """
        current_test = rm.startTest(sd)

        data = {
            "control": {"expected_status_code": 200},
            "param": _new_pet["id"],
            "expected_response": _new_pet
        }

        try:
            self.toolbox.get_pet(rm, current_test, data, "get_pet")
        except Exception as ex:
            rm.testFailed(current_test, reason_of_state=str(ex), exc_value=ex)
        else:
            rm.testPassed(current_test, reason_of_state="pet fetched")
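
The suite above imports a pet_store_toolbox module that is not shown on this page. Purely as an illustration of what such a helper might look like, here is a minimal sketch built on the public Swagger Pet Store demo API and the requests library; the base URL, the status-code check and the response comparison are assumptions, while only the post_pet/get_pet names and the data layout come from the suite above. A failed check raises, which the suite's try/except turns into rm.testFailed.

import requests


class Toolbox:
    """Hypothetical Pet Store helper; the real pet_store_toolbox used by the demo tests may differ."""

    BASE_URL = "https://petstore.swagger.io/v2"  # assumed demo endpoint

    def post_pet(self, rm, current_test, data: dict, step_name: str) -> dict:
        # rm and current_test are accepted to match the call sites above;
        # a real toolbox would use them for step reporting.
        response = requests.post(f"{self.BASE_URL}/pet", json=data["param"], timeout=10)
        self._check(response, data, step_name)
        return response.json()

    def get_pet(self, rm, current_test, data: dict, step_name: str) -> dict:
        # Fetch the pet by id (data["param"]) and run the same checks.
        response = requests.get(f"{self.BASE_URL}/pet/{data['param']}", timeout=10)
        self._check(response, data, step_name)
        return response.json()

    @staticmethod
    def _check(response, data: dict, step_name: str) -> None:
        # Compare the HTTP status and, when given, the response body against the expectations.
        expected_status = data["control"]["expected_status_code"]
        if response.status_code != expected_status:
            raise AssertionError(
                f"{step_name}: expected HTTP {expected_status}, got {response.status_code}"
            )
        expected_response = data.get("expected_response")
        if expected_response is not None and response.json() != expected_response:
            raise AssertionError(f"{step_name}: response body does not match the expected response")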

Download files

Download the file for your platform.

Source Distribution

testipy-0.10.1.tar.gz (142.3 kB)


Built Distribution

TestiPy-0.10.1-py3-none-any.whl (164.5 kB)


File details

Details for the file testipy-0.10.1.tar.gz.

File metadata

  • Download URL: testipy-0.10.1.tar.gz
  • Size: 142.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for testipy-0.10.1.tar.gz
SHA256: 04d4b69bbb543200156ab8bbe1e3bed36cb7951015374bea17cfcf059a09d235
MD5: c8baea7292b0b7e5bbd6c2cc2b6d1769
BLAKE2b-256: 10f51cdf6be6a20c50b5a63913ccb27eecbbe4c8d8672f0444a4c813621c165a


File details

Details for the file TestiPy-0.10.1-py3-none-any.whl.

File metadata

  • Download URL: TestiPy-0.10.1-py3-none-any.whl
  • Size: 164.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for TestiPy-0.10.1-py3-none-any.whl
SHA256: 966525558b1aa873189d5db1d3f579aad052a0d102f9c9cec9dacf02b190e05e
MD5: 129ce4f78d5223cc530754d707fab556
BLAKE2b-256: 795b6d3e458ffa3cf80183c8f54143e588cd63be00e29c9edbadb26e38aae155

