Teko CLI Tools

This is a Python package that contains classes and CLI tools used during the development process at Teko Vietnam (https://teko.vn).

List of tools:

  • Teko Jira Tools
  • Teko pytest test report fixture (coming soon)
  • Teko CodeSignal Tournament Result Fetching (coming soon)
  • More to come...

1 Installation

To use these tools, Python 3.6+ must be installed.

Install or upgrade to the latest version:

$ pip install --upgrade teko-cli

After a successful installation, you should be able to use the teko command in your console terminal:

$ teko

You should see a help message like this:

Usage: teko [OPTIONS] COMMAND [ARGS]...

Options:
  --install-completion [bash|zsh|fish|powershell|pwsh]
                                  Install completion for the specified shell.
  --show-completion [bash|zsh|fish|powershell|pwsh]
                                  Show completion for the specified shell, to
                                  copy it or customize the installation.

  --help                          Show this message and exit.

Commands:
  cs
  jira

Using Teko OAS tool

OpenAPI Specification (OAS) tool:

  • OpenAPI spec parser and validator
  • OpenAPI spec comparator

2 Using Teko Jira tool

2.1 Configure Jira credential

This tool uses username/password authentication with the Jira server and Confluence, and works with one Jira project at a time. Authentication uses the environment variables below. You can set these variables in your environment, or save them to a .env file in the working directory:

Sample content of .env file:

JIRA_SERVER=jira.teko.vn
JIRA_PROJECT_KEY=<project-key>
JIRA_USERNAME=<jira-username>
JIRA_PASSWORD=<jira-password>
CONFLUENCE_USERNAME=<confluence-username>
CONFLUENCE_PASSWORD=<confluence-password>
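
If you prefer not to keep a .env file, the same variables can be exported directly in your shell before running the tool (a minimal sketch; the values are placeholders):

$ export JIRA_SERVER=jira.teko.vn
$ export JIRA_PROJECT_KEY=<project-key>
$ export JIRA_USERNAME=<jira-username>
$ export JIRA_PASSWORD=<jira-password>
$ export CONFLUENCE_USERNAME=<confluence-username>
$ export CONFLUENCE_PASSWORD=<confluence-password>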

2.2 Submit (create) a list of testcases to a Jira Project

To push test cases to a Jira project, prepare a .yaml or .json file containing the list of test cases to be created in Jira, then use the following command:

$ teko jira create-tests {testcase_file}

To see the help message, use the --help option: teko jira create-tests --help:

Usage: teko jira create-tests [OPTIONS] FILE



Arguments:
  FILE  The name of a testcase definition file  [required]

Options:
  --help  Show this message and exit.

Note: This tool uses the test case name as the identifier, so existing tests with the same name will be updated with the latest information.

Sample testcase file

  • .yaml file
- name: Testcase 01
  issueLinks: [TESTING-4]
  objective: Test 01 Objective
  precondition: Test 01 Precondition
  testScript:
    type: PLAIN_TEXT
    text: >
      - init x and y <br/>
      - call func add(x, y)
  labels: [ABC, XYZ]
- name: Testcase 02
  issueLinks: [TESTING-4, TESTING-5]
  objective: Test 02 Objective
  precondition: Test 02 Precondition
  priority: Normal
  status: Draft
  testScript:
    type: STEP_BY_STEP
    steps:
    - description: <i>Step 1</i>
      testData: (x,y) = (1,3)
      expectedResult: <strong>4</strong>
    - description: Step 2
      testData: <code>(x,y) = (4,5)</code>
      expectedResult: 9
  • Equivalent .json file:
[
  {
    "name": "Testcase 01",
    "issueLinks": [
      "TESTING-4"
    ],
    "objective": "Test 01 Objective",
    "precondition": "Test 01 Precondition",
    "testScript": {
      "type": "PLAIN_TEXT",
      "text": "- init x and y <br/> - call func add(x, y)\n"
    },
    "labels": ["ABC", "XYZ"]
  },
  {
    "name": "Testcase 02",
    "issueLinks": [
      "TESTING-4",
      "TESTING-5"
    ],
    "objective": "Test 02 Objective",
    "precondition": "Test 02 Precondition",
    "priority": "Normal",
    "status": "Draft",
    "testScript": {
      "type": "STEP_BY_STEP",
      "steps": [
        {
          "description": "<i>Step 1</i>",
          "testData": "(x,y) = (1,3)",
          "expectedResult": "<strong>4</strong>"
        },
        {
          "description": "Step 2",
          "testData": "<code>(x,y) = (4,5)</code>",
          "expectedResult": "9"
        }
      ]
    }
  }
]
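
With either file saved locally (the filename below is just an example), the test cases can then be pushed with:

$ teko jira create-tests testcases.yaml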

2.3 Create test cycle (testrun result) in a Jira Project

To create a test run report (cycle) in a Jira project, prepare a .yaml or .json file containing the list of test cases and their results, then use the following command:

$ teko jira create-cycle {testcase_file}

Sample testrun (cycle) file

  • .yaml file
- name: Testcase 01
  testrun_folder: /Sprint 1
  testrun_status: Pass
  testrun_environment: Dev
  testrun_comment: The test has passed successfully
  testrun_duration: 300000
  testrun_date: "2020-12-31T23:59:59Z"  
- name: Testcase 02
  testrun_folder: /Sprint 2
  testrun_status: Fail
  testrun_environment: test1
  testrun_comment: The test has failed on some automation tool procedure
  testrun_duration: 30000
  testrun_date: "2020-12-31T23:59:59Z" 
  • Equivalent .json file
[
  {
    "name": "Testcase 01",
    "testrun_folder": "/Sprint 1",
    "testrun_status": "Pass",
    "testrun_environment": "Dev",
    "testrun_comment": "The test has passed successfully",
    "testrun_duration": 300000,
    "testrun_date": "2020-12-31T23:59:59Z"
  },
  {
    "name": "Testcase 02",
    "testrun_folder": "/Sprint 2",
    "testrun_status": "Fail",
    "testrun_environment": "test1",
    "testrun_comment": "The test has failed on some automation tool procedure",
    "testrun_duration": 30000,
    "testrun_date": "2020-12-31T23:59:59Z"
  }
]

Note:

  • Test run results are grouped into multiple cycles based on their common testrun_folder.
  • testrun_duration is measured in milliseconds.
  • testrun_environment must be one of the environments defined by users in the project. These can be configured under Configuration on the Tests board in Jira.

2.4 Sample combined testcase file with test results

Test cases and test runs share the same structure, so you can combine a test's definition and its result into a single file and use that one file for both operations: creating tests and creating cycles. Such a file can be generated automatically from docstrings and/or annotations/decorators in automated test code.

  • .yaml file
- name: Testcase 01
  issueLinks: [TESTING-4]
  objective: Test 01 Objective
  precondition: Test 01 Precondition
  testScript:
    type: PLAIN_TEXT
    text: >
      - init x and y <br/>
      - call func add(x, y)
  labels: [ABC, XYZ]
  testrun_status: Pass
  testrun_environment: Dev
  testrun_comment: The test has passed successfully
  testrun_duration: 300000
  testrun_date: "2020-12-31T23:59:59Z" 
- name: Testcase 02
  issueLinks: [TESTING-4, TESTING-5]
  objective: Test 02 Objective
  precondition: Test 02 Precondition
  priority: Normal
  status: Draft
  testScript:
    type: STEP_BY_STEP
    steps:
    - description: <i>Step 1</i>
      testData: (x,y) = (1,3)
      expectedResult: <strong>4</strong>
    - description: Step 2
      testData: <code>(x,y) = (4,5)</code>
      expectedResult: 9
  testrun_status: Fail
  testrun_environment: test1
  testrun_comment: The test has failed on some automation tool procedure
  testrun_duration: 30000
  testrun_date: "2020-12-31T23:59:59Z"
  • Equivalent .json file
[
  {
    "name": "Testcase 01",
    "issueLinks": [
      "TESTING-4"
    ],
    "objective": "Test 01 Objective",
    "precondition": "Test 01 Precondition",
    "testScript": {
      "type": "PLAIN_TEXT",
      "text": "- init x and y <br/> - call func add(x, y)\n"
    },
    "labels": ["ABC", "XYZ"],
    "testrun_status": "Pass",
    "testrun_environment": "Dev",
    "testrun_comment": "The test has passed successfully",
    "testrun_duration": 300000,
    "testrun_date": "2020-12-31T23:59:59Z"
  },
  {
    "name": "Testcase 02",
    "issueLinks": [
      "TESTING-4",
      "TESTING-5"
    ],
    "objective": "Test 02 Objective",
    "precondition": "Test 02 Precondition",
    "priority": "Normal",
    "status": "Draft",
    "testScript": {
      "type": "STEP_BY_STEP",
      "steps": [
        {
          "description": "<i>Step 1</i>",
          "testData": "(x,y) = (1,3)",
          "expectedResult": "<strong>4</strong>"
        },
        {
          "description": "Step 2",
          "testData": "<code>(x,y) = (4,5)</code>",
          "expectedResult": "9"
        }
      ]
    },
    "testrun_status": "Fail",
    "testrun_environment": "test1",
    "testrun_comment": "The test has failed on some automation tool procedure",
    "testrun_duration": 30000,
    "testrun_date": "2020-12-31T23:59:59Z"
  }
]
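
Because the combined file carries both the test definitions and the run results, the same file can be passed to both commands (the filename is just an example):

$ teko jira create-tests combined_tests.yaml
$ teko jira create-cycle combined_tests.yaml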

3 Using the test export plugin

3.1 Configure

  • Add pytest_plugins = ["teko.utils.jira_export_test"] to your test configuration, for example in conftest.py (a minimal sketch follows this list).
  • To use the decorator: from teko.utils.jira_export_test.wraper import jira_test
  • To use the STEP_BY_STEP test script type: from teko.models.jira_export_test.test_step import TestStep
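
A minimal conftest.py that enables the plugin could look like this (only the pytest_plugins line is required by the description above; the comment is explanatory only):

# conftest.py
# Register the Teko export plugin so that test case and test cycle
# artifacts are written when the pytest session runs.
pytest_plugins = ["teko.utils.jira_export_test"]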

Environment variables

  • $JIRA_TEST_CASE_ARTIFACT: path of the generated test case file; defaults to test_case.json
  • $JIRA_TEST_CYCLE_ARTIFACT: path of the generated test cycle file; defaults to test_cycle.json
  • Optional: $ENV or $ENVIRONMENT sets the test cycle environment. By default, the environment is detected from $CI_COMMIT_BRANCH
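
For example, a CI job might set the artifact paths, run the test suite, and then push the generated files using the Jira commands from section 2 (a sketch; the explicit paths and the plain pytest invocation are assumptions, not prescribed by the package):

$ export JIRA_TEST_CASE_ARTIFACT=test_case.json
$ export JIRA_TEST_CYCLE_ARTIFACT=test_cycle.json
$ export ENVIRONMENT=Dev
$ pytest
$ teko jira create-tests test_case.json
$ teko jira create-cycle test_cycle.json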

3.2 Write test function

  • An example is available in /sample/jira_export_test/ of the repository; a rough sketch is shown below.
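
As a rough illustration only: the imports below match the paths listed in 3.1, but the decorator arguments and TestStep fields are assumptions modelled on the test case fields from section 2, not the package's documented signature. See /sample/jira_export_test/ for the authoritative usage.

# test_add.py -- illustrative sketch only; the @jira_test keyword
# arguments and TestStep fields below are assumed, not confirmed.
from teko.utils.jira_export_test.wraper import jira_test
from teko.models.jira_export_test.test_step import TestStep


def add(x, y):
    return x + y


@jira_test(
    name="Testcase 02",                  # assumed parameter name
    issueLinks=["TESTING-4"],            # assumed parameter name
    objective="Test 02 Objective",       # assumed parameter name
    steps=[                              # assumed parameter name
        TestStep(description="Step 1",
                 testData="(x,y) = (1,3)",
                 expectedResult="4"),
        TestStep(description="Step 2",
                 testData="(x,y) = (4,5)",
                 expectedResult="9"),
    ],
)
def test_add():
    assert add(1, 3) == 4
    assert add(4, 5) == 9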

Change log

[0.1.0] - 2020-12-08

Fixed

  • Initial release

[0.1.1] - 2020-12-11

Fixed

  • First MVP
