pytestifypro
pytestifypro is a Python testing framework that enhances pytest by offering utility functions and streamlined configurations. It simplifies writing, executing, and managing tests, making it easier to achieve robust and reliable testing outcomes.
Features
- Enhanced Logging Utilities: Comprehensive logging functions for various levels, including info, warning, error, and critical.
- Flexible URL Formatting: Utility to dynamically format URLs with parameters, combining a base URL and endpoint.
- Schema Validation: Built-in support for JSON schema validation to ensure data integrity and compliance.
- Advanced Comparison: Recursive JSON comparison to identify discrepancies between expected and actual data.
- Priority Management: Assign priorities to JSON fields or paths, with the ability to customize priorities via YAML configuration.
- Difference Reporting: Enhanced reporting of differences in JSON comparisons, with prioritized messages.
- Configuration Management: Centralized configuration using YAML files for base URLs, endpoints, and schemas, streamlining test setups.
- Robust Test Utilities: Functions for managing retries, logging response times, and handling HTTP operations (GET, POST, PUT, DELETE).
- Docker Integration: Build and run tests within Docker containers.
- Session Management: Check Docker status and manage Docker sessions.
- Allure Reporting:
  - Feature Management: Add and manage features, stories, descriptions, and severity levels in Allure reports.
  - Dynamic Reporting: Generate and serve interactive Allure reports to visualize test results.
  - Attachment and Step Reporting: Attach files and add steps to enhance report details.
Installation
To use pytestifypro, you need to have Python 3.12 or higher installed. You can set up the environment and install dependencies using Poetry.
- Clone the Repository:

```shell
git clone <REPOSITORY_URL>
cd <REPOSITORY_NAME>
```

- Install Dependencies:

```shell
curl -sSL https://install.python-poetry.org | python3 -
poetry install
```

- Activate the Virtual Environment:

```shell
poetry shell
```
Docker Requirements
Docker is required to run the tests using the provided Docker image. Make sure Docker is installed and running on your machine. If Docker is not installed, follow the installation guide for your operating system.
Starting Docker
If Docker is not running, you can start it by following these steps:
- For macOS: Open the Docker Desktop application.
- For Windows: Open Docker Desktop from the Start menu.
- For Linux: Run `sudo systemctl start docker` in the terminal.
Troubleshooting
If you encounter errors related to Docker not running, make sure Docker is properly installed and the Docker daemon is active.
Checking Docker Status
Before running Docker commands, you can check if Docker is running by executing the following Python script:
```shell
python scripts/check_docker.py
```
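The repository's `scripts/check_docker.py` is the source of truth; as a rough sketch of what such a check involves, the Docker daemon can be probed with a `docker info` subprocess call (the function name below is illustrative, not the script's actual API):

```python
# Illustrative sketch of a Docker availability check; the actual
# scripts/check_docker.py in the repository may differ.
import shutil
import subprocess


def is_docker_running() -> bool:
    """Return True if the Docker CLI exists and the daemon answers `docker info`."""
    if shutil.which("docker") is None:
        return False  # Docker CLI is not installed at all
    result = subprocess.run(
        ["docker", "info"],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    # `docker info` exits non-zero when the daemon is not reachable
    return result.returncode == 0


if __name__ == "__main__":
    print("Docker is running" if is_docker_running() else "Docker is not running")
```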
Usage
1. Writing Tests:
You can create test files under the src/pytestifypro/tests directory. Here’s a basic example:
```python
# src/pytestifypro/tests/sample_test.py
from pytestifypro.utils.utils import log_info, format_url


def test_format_url():
    base_url = "http://example.com"
    endpoint = "api/test"
    expected = "http://example.com/api/test"
    assert format_url(base_url, endpoint) == expected


def test_log_info(caplog):
    log_info("Test message")
    assert "INFO: Test message" in caplog.text
```
2. Running Tests:
- Use Poetry to run the tests:

```shell
poetry run pytest
```

- Or run pytest directly with a coverage report:

```shell
pytest --cov=src/pytestifypro --cov-report=term-missing
```
Running Tests with Docker
You can run the tests in a Docker container by building and running the Docker image.
Build the Docker Image

```shell
docker build -t pytestifypro:latest .
```

Run Tests Using Docker

```shell
docker run --rm pytestifypro:latest
```
Running Tests Without Docker
If you do not want to use Docker, you can run the tests directly using Poetry. Follow these steps:
- Install Dependencies:

```shell
poetry install
```

- Run Tests:

```shell
poetry run pytest
```
CI/CD Pipeline Setup
This project uses Jenkins for continuous integration and continuous deployment (CI/CD). The pipeline is configured to automatically build and test the code upon each push to the main branch.
Jenkins Implementation for pytestifypro Framework
Overview
This README provides an overview of the Jenkins setup for the pytestifypro framework. Jenkins is configured to build Docker images, run tests, and handle the deployment pipeline efficiently.
Jenkins Setup
Prerequisites
- Jenkins installed and running (preferably using Docker).
- Docker installed on the Jenkins server.
- Git repository containing the pytestifypro framework.
Jenkinsfile
The Jenkinsfile is located in the root of the repository and defines the pipeline for building, testing, and deploying the application. It includes:
- Pipeline Definition: Specifies stages for building Docker images, running tests, and handling post-build actions.
- Build Stage: Builds Docker images for the pytestifypro application and the WireMock service.
- Test Stage: Executes tests inside the Docker containers and generates reports.
- Post-Build Actions: Archives test results and reports for review.
Key Stages in Jenkinsfile
- Build Stage
```groovy
stage('Build') {
    steps {
        script {
            docker.build('pytestifypro-image', '-f Dockerfile .')
        }
    }
}
```
- Test Stage
```groovy
stage('Test') {
    steps {
        script {
            docker.image('pytestifypro-image').inside {
                sh 'pytest --alluredir=allure-results'
                allure([
                    results: [[path: 'allure-results']]
                ])
            }
        }
    }
}
```
- Post-Build Actions
```groovy
stage('Post-Build') {
    steps {
        archiveArtifacts artifacts: '**/allure-results/**', allowEmptyArchive: true
    }
}
```
Docker Integration
The docker-compose.yml file is used to define and run multi-container Docker applications. It includes:
- WireMock Service: Mock server for testing APIs.
- pytestifypro Service: Main application service for running tests.
Webhooks
- Webhooks should be set up in your GitHub repository to trigger the Jenkins pipeline on code changes. Ensure the webhook points to the public Jenkins URL.
Troubleshooting
- Port Conflicts: Ensure no other services are using the ports specified in docker-compose.yml.
- Dependency Issues: Verify Dockerfile dependencies and ensure they are correctly installed.
- Test Failures: Review test logs for detailed error messages and adjust test configurations as needed.
Configuration
pytest.ini Configuration
The pytest.ini file is located in the root directory and is used to configure pytest options:
```ini
[pytest]
addopts = --maxfail=5 --disable-warnings -q
testpaths =
    src/pytestifypro/tests
```
Environment Configuration
pytestifypro supports flexible environment configurations using YAML files. This allows you to define different settings for various environments, such as development, staging, and production. Each environment can have its own base URL, WireMock URL, endpoints, and mock data.
Example Configuration (src/pytestifypro/config/config.yaml):
```yaml
environments:
  dev:
    base_url: "https://dev.api.example.com"
    wiremock_base_url: "http://localhost:8080"
    endpoints:
      upi_payment_status: "/upi/payment/status"
    upi_payment:
      success:
        payload:
          transactionId: "1234567890"
        response:
          status: "SUCCESS"
          transactionId: "1234567890"
          amount: "100.00"
          currency: "INR"
          message: "Payment successful"
          upiId: "user@upi"
      failure:
        payload:
          transactionId: "1234567891"
        response:
          status: "FAILURE"
          transactionId: "1234567891"
          amount: "100.00"
          currency: "INR"
          message: "Payment failed"
          errorCode: "INSUFFICIENT_FUNDS"
      pending:
        payload:
          transactionId: "1234567892"
        response:
          status: "PENDING"
          transactionId: "1234567892"
          amount: "100.00"
          currency: "INR"
          message: "Payment is pending"
          upiId: "user@upi"
  staging:
    base_url: "https://staging.api.example.com"
    wiremock_base_url: "http://localhost:8080"
    endpoints:
      upi_payment_status: "/upi/payment/status"
    upi_payment:
      success:
        payload:
          transactionId: "1234567890"
        response:
          status: "SUCCESS"
          transactionId: "1234567890"
          amount: "100.00"
          currency: "INR"
          message: "Payment successful"
          upiId: "user@upi"
      failure:
        payload:
          transactionId: "1234567891"
        response:
          status: "FAILURE"
          transactionId: "1234567891"
          amount: "100.00"
          currency: "INR"
          message: "Payment failed"
          errorCode: "INSUFFICIENT_FUNDS"
      pending:
        payload:
          transactionId: "1234567892"
        response:
          status: "PENDING"
          transactionId: "1234567892"
          amount: "100.00"
          currency: "INR"
          message: "Payment is pending"
          upiId: "user@upi"
  prod:
    base_url: "https://jsonplaceholder.typicode.com"
    wiremock_base_url: "http://localhost:8080"
    endpoints:
      upi_payment_status: "/upi/payment/status"
    upi_payment:
      success:
        payload:
          transactionId: "1234567890"
        response:
          status: "SUCCESS"
          transactionId: "1234567890"
          amount: "100.00"
          currency: "INR"
          message: "Payment successful"
          upiId: "user@upi"
      failure:
        payload:
          transactionId: "1234567891"
        response:
          status: "FAILURE"
          transactionId: "1234567891"
          amount: "100.00"
          currency: "INR"
          message: "Payment failed"
          errorCode: "INSUFFICIENT_FUNDS"
      pending:
        payload:
          transactionId: "1234567892"
        response:
          status: "PENDING"
          transactionId: "1234567892"
          amount: "100.00"
          currency: "INR"
          message: "Payment is pending"
          upiId: "user@upi"
```
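For illustration, a small helper (not part of the published pytestifypro API; the function name and error handling are assumptions) can load this file and select one environment's settings:

```python
# Illustrative loader for the environment configuration above;
# pytestifypro's own configuration loading may differ. Requires PyYAML.
import yaml


def load_environment(path: str, env: str) -> dict:
    """Return the settings block for one environment, e.g. 'dev'."""
    with open(path, "r") as f:
        config = yaml.safe_load(f)
    try:
        return config["environments"][env]
    except KeyError:
        raise KeyError(f"Environment '{env}' not found in {path}")


# Example (paths assume the layout shown above):
# cfg = load_environment("src/pytestifypro/config/config.yaml", "dev")
# status_url = cfg["base_url"] + cfg["endpoints"]["upi_payment_status"]
```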
Schema Configuration
pytestifypro supports JSON schema validation to ensure data integrity and compliance. You can define schemas for different API responses and validate against these schemas during tests.
Example Schema Configuration (src/pytestifypro/config/schema_config.yaml):
```yaml
schemas:
  upi_payment_response:
    type: object
    properties:
      status:
        type: string
      transactionId:
        type: string
      amount:
        type: string
      currency:
        type: string
      message:
        type: string
      upiId:
        type: string
    required:
      - status
      - transactionId
      - amount
      - currency
      - message
```
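As an illustration of what this schema enforces, here is a standalone check with the third-party `jsonschema` package; how pytestifypro wires up its own `validate_json_schema` utility internally may differ:

```python
# Standalone illustration of validating a UPI payment response against
# the schema above; pytestifypro's internals may differ.
from jsonschema import validate, ValidationError

# The same schema as in schema_config.yaml, inlined as a Python dict.
upi_payment_response_schema = {
    "type": "object",
    "properties": {
        "status": {"type": "string"},
        "transactionId": {"type": "string"},
        "amount": {"type": "string"},
        "currency": {"type": "string"},
        "message": {"type": "string"},
        "upiId": {"type": "string"},
    },
    "required": ["status", "transactionId", "amount", "currency", "message"],
}

response = {
    "status": "SUCCESS",
    "transactionId": "1234567890",
    "amount": "100.00",
    "currency": "INR",
    "message": "Payment successful",
    "upiId": "user@upi",
}

try:
    validate(instance=response, schema=upi_payment_response_schema)
    print("Response matches the schema")
except ValidationError as e:
    print(f"Schema violation: {e.message}")
```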
Priority and Difference Management
pytestifypro now supports priority-based difference management for JSON comparisons:
Priority Manager
Configure priorities for different JSON paths via YAML files (e.g., src/pytestifypro/config/priority_map.yaml).
Example Priority Map (src/pytestifypro/config/priority_map.yaml):
```yaml
priority_map:
  upi_payment_status:
    ".status": "P1"
    ".transactionId": "P1"
    ".amount": "P2"
    ".currency": "P3"
    ".message": "P2"
```
Difference Reporter
Customizable reporters that output discrepancies with assigned priorities.
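A minimal sketch of how a priority map might be applied when reporting differences. The function name and the shape of the `differences` input are illustrative; pytestifypro's actual PriorityManager/reporter classes may differ:

```python
# Illustrative priority-aware difference reporting; pytestifypro's
# built-in reporter may differ.

def prioritize_differences(differences: dict, priority_map: dict) -> list:
    """Sort field-level differences by configured priority.

    `differences` maps a JSON path (e.g. ".status") to an
    (expected, actual) pair; unmapped paths default to "P3".
    """
    entries = []
    for path, (expected, actual) in differences.items():
        priority = priority_map.get(path, "P3")
        entries.append(
            (priority, f"[{priority}] {path}: expected {expected!r}, got {actual!r}")
        )
    # "P1" < "P2" < "P3" lexicographically, so highest priority sorts first
    return [message for _, message in sorted(entries)]


priority_map = {".status": "P1", ".amount": "P2", ".currency": "P3"}
diffs = {
    ".currency": ("INR", "USD"),
    ".status": ("SUCCESS", "FAILURE"),
}
for line in prioritize_differences(diffs, priority_map):
    print(line)  # P1 differences are printed before P3 ones
```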
How to Use Configuration Files
- Define Your Configuration: Place your YAML configuration files in the src/pytestifypro/config/ directory.
- Load Configurations: pytestifypro automatically loads and applies the configurations during test execution based on the environment specified.
Allure Reporting Integration
pytestifypro supports Allure reporting to enhance the visibility and management of test results. This section provides information on how to set up and use Allure reporting within your testing framework.
Setup
- Install Dependencies: Ensure the `allure-pytest` package is installed. Add it to your `pyproject.toml` file or install it directly:

```shell
poetry add allure-pytest
```

- Update conftest.py: Add a fixture to manage Allure reporting details. This fixture allows you to set features, stories, descriptions, and severity levels for your tests:
```python
# src/pytestifypro/tests/conftest.py
import pytest
from pytestifypro.utils.allure_reporter import AllureReporter


@pytest.fixture
def test_setup():
    def _setup(feature, story, description, severity):
        AllureReporter.add_feature(feature)
        AllureReporter.add_story(story)
        AllureReporter.add_description(description)
        AllureReporter.add_severity(severity)
    return _setup
```
- Update allure_reporter.py: Implement Allure reporting functions for features, stories, descriptions, severity, attachments, and steps:
```python
# src/pytestifypro/utils/allure_reporter.py
import allure


class AllureReporter:
    @staticmethod
    def add_feature(feature_name: str):
        allure.dynamic.feature(feature_name)

    @staticmethod
    def add_story(story_name: str):
        allure.dynamic.story(story_name)

    @staticmethod
    def add_description(description: str):
        allure.dynamic.description(description)

    @staticmethod
    def add_severity(severity_level: str):
        allure.dynamic.severity(severity_level)

    @staticmethod
    def attach_file(name: str, content: bytes, attachment_type=allure.attachment_type.TEXT):
        allure.attach(name=name, body=content, attachment_type=attachment_type)

    @staticmethod
    def add_step(step_name: str):
        with allure.step(step_name):
            pass
```
Running Tests with Allure
- Run Tests: Execute your tests and generate Allure results:

```shell
poetry run pytest --alluredir=allure-results
```

- Generate and View Allure Report: Use the following command to generate and serve the Allure report:

```shell
allure serve allure-results
```
Contribution Guidelines
To contribute to the development of pytestifypro, follow these steps:
- Create a New Branch:

```shell
git checkout -b feature/my-feature
```

- Make Your Changes: Edit code and write tests as needed.
- Commit Your Changes:

```shell
git add .
git commit -m "Add new feature or fix bug"
```

- Push Your Changes:

```shell
git push origin feature/my-feature
```

- Create a Pull Request: Open a pull request on the repository to merge your changes.
WireMock Integration
Setting Up WireMock
To test your APIs using WireMock:
- Directory Structure:
  - Place your WireMock mappings in the wiremock/mappings directory.
  - Place your response files in the wiremock/__files directory.
- Running WireMock:
  - WireMock can be started and stopped automatically using pytest fixtures; see conftest.py for details.
  - Multiple Markers Handling: You can specify markers in your tests to indicate whether to use mock endpoints or real endpoints; conftest.py handles these markers and sets up the correct environment.
- Writing Tests:
  - Mock Endpoint Test: Use the @pytest.mark.mock decorator to indicate that a test should use mock endpoints.
  - Real Endpoint Test: Use the @pytest.mark.real decorator for tests that should interact with real endpoints.
Example Test
Mock Endpoint Example

```python
import pytest
import requests


@pytest.mark.mock
def test_sample_response_with_mock():
    response = requests.get("http://localhost:8080/api/test")
    assert response.status_code == 200
    assert response.json() == {"message": "Sample response from WireMock"}
```

Real Endpoint Example

```python
import pytest
import requests


@pytest.mark.real
def test_sample_response_with_real():
    response = requests.get("http://api.realendpoint.com/test")
    assert response.status_code == 200
    assert response.json() == {"message": "Sample response from Real Endpoint"}
```
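The repository's conftest.py is authoritative for how markers are resolved; as an illustrative sketch, a fixture can inspect the test's markers and route mock-marked tests to WireMock (the URLs and names below are placeholders):

```python
# Illustrative marker-based URL routing; the real pytestifypro
# conftest.py may differ.
import pytest

MOCK_BASE_URL = "http://localhost:8080"    # WireMock
REAL_BASE_URL = "https://api.example.com"  # placeholder real endpoint


def select_base_url(marker_names) -> str:
    """Pick the WireMock URL when the 'mock' marker is present."""
    return MOCK_BASE_URL if "mock" in marker_names else REAL_BASE_URL


@pytest.fixture
def base_url(request):
    """Resolve the base URL from the requesting test's own markers."""
    markers = {m.name for m in request.node.iter_markers()}
    return select_base_url(markers)
```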
Summary
- Automate Setup: Use pytest fixtures to manage the WireMock lifecycle.
- Handle Multiple Environments: Use pytest markers to switch between mock and real endpoints seamlessly.
- Enhance Coverage: Add more tests and mappings.
- CI/CD Integration: Configure your CI/CD pipeline to handle WireMock.
- Update Documentation: Provide clear instructions on using WireMock.
API Documentation
APIClient Class
- get(url, headers=None, params=None): Sends a GET request.
  - Parameters:
    - url: The URL to send the request to.
    - headers: Optional headers to include in the request.
    - params: Optional query parameters to include in the request.
  - Returns: Response object.
- post(url, headers=None, data=None, json=None): Sends a POST request.
  - Parameters:
    - url: The URL to send the request to.
    - headers: Optional headers.
    - data: Optional data to send in the request body.
    - json: Optional JSON to send in the request body.
  - Returns: Response object.
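To make the interface concrete, here is a minimal sketch of a class with the documented signatures, implemented directly on top of `requests`. This is not pytestifypro's actual implementation (which adds retries and response-time logging); only the method signatures come from the documentation above:

```python
# Minimal sketch of the documented APIClient interface, backed by
# requests; pytestifypro's real class may differ (retries, logging, etc.).
import requests


class APIClient:
    def get(self, url, headers=None, params=None):
        """Send a GET request and return the Response object."""
        return requests.get(url, headers=headers, params=params)

    def post(self, url, headers=None, data=None, json=None):
        """Send a POST request and return the Response object."""
        return requests.post(url, headers=headers, data=data, json=json)
```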
Utility Functions
- format_url(base_url, endpoint): Combines a base URL and endpoint.
- validate_json_schema(response_json, schema): Validates JSON response against a schema.
- compare_json(expected, actual, path=""): Compares two JSON objects recursively.
- load_schema(file_path='src/pytestifypro/config/schema_config.yaml'): Loads JSON schema from a YAML file.
- assert_no_differences(differences: list[str]): Asserts that there are no differences in the JSON comparison and logs differences if found.
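As an illustration of what the recursive comparison does, here is a standalone sketch with the same `compare_json(expected, actual, path="")` signature. The return format (a list of human-readable difference strings) is an assumption; pytestifypro's actual implementation may differ, and list handling is omitted for brevity:

```python
# Standalone sketch of recursive JSON comparison in the spirit of
# compare_json; not pytestifypro's actual implementation.

def compare_json(expected, actual, path=""):
    """Return a list of human-readable differences between two JSON values."""
    differences = []
    if isinstance(expected, dict) and isinstance(actual, dict):
        for key in expected.keys() | actual.keys():
            child = f"{path}.{key}"
            if key not in actual:
                differences.append(f"{child}: missing in actual")
            elif key not in expected:
                differences.append(f"{child}: unexpected key in actual")
            else:
                differences.extend(compare_json(expected[key], actual[key], child))
    elif expected != actual:
        differences.append(f"{path or '.'}: expected {expected!r}, got {actual!r}")
    return differences


diffs = compare_json(
    {"status": "SUCCESS", "amount": "100.00"},
    {"status": "FAILURE", "amount": "100.00"},
)
# diffs contains one entry, for the mismatched .status field
```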
License
This project is licensed under the MIT License - see the LICENSE file for details.
Contact
For any questions or issues, please contact qabyjavedansari@gmail.com or connect with me on LinkedIn at www.linkedin.com/in/qaleaderjavedansari.
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file pytestifypro-0.1.1.tar.gz.
File metadata
- Download URL: pytestifypro-0.1.1.tar.gz
- Upload date:
- Size: 21.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `2a875776b32edbdc7c25ac33fb83fa568c77f35748b085f58e9408e9ba02fc8e` |
| MD5 | `df1a1b326b946670a7c80a351f743526` |
| BLAKE2b-256 | `a0a9ee621580d7bcc0b35bf7647263b218d1d090210aeb4ccf7b718d8736d55a` |
File details
Details for the file pytestifypro-0.1.1-py3-none-any.whl.
File metadata
- Download URL: pytestifypro-0.1.1-py3-none-any.whl
- Upload date:
- Size: 22.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `8dbc408812e69968f4df2c768ee824a1c6c5d9f5451c0e64de20a9554174a85f` |
| MD5 | `a1ad67c10294b72b6769296f1f6f728c` |
| BLAKE2b-256 | `ff6cc2ece2d33041727736c9d3b599aaf58cd4c691982809e560d8cbeb6861d1` |