
Project description


Microservices automated test tool

Support your team with a good Catcher!


What is catcher?

Catcher is a flexible end-to-end test tool that can be used for automated testing of microservices or data pipelines. It helps you check a single service or the interaction of a whole system, from the front-end to the back-end. With Catcher you can easily mock external services your system relies on. Catcher is not limited to HTTP: it can also check services such as Kafka, Postgres, Couchbase, MongoDB, Elasticsearch, S3, email and others.
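
A test is just a plain YAML file with variables and steps. A minimal sketch (it only uses the variables block, the built-in random template function and the short form of the check step, all of which also appear in the full example further down):

variables:
    user_email: '{{ random("email") }}' # template functions work inside variables too
steps:
    - check: '{{ user_email | length > 0 }}' # short-form check: the step passes if the expression is true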

Quickstart and documentation

  1. Check how to write a test.

  2. Get to know how to install and run Catcher (a quick pip-based sketch follows below).

  3. List all steps and select those you need.

  4. Learn more about variables and resources.

  5. Read how to trace and debug your tests using reports.

For more information check readthedocs.
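
You can also install Catcher locally with pip and run it directly. A rough sketch (assuming the inventory and test file layout used in the examples further down; some steps may require extra libraries, see the installation docs):

pip install catcher                                 # install the catcher CLI from PyPI
catcher -i inventories/local.yml tests/my_test.yml  # run a single test against the local inventory
catcher -i inventories/local.yml tests              # or run the whole tests folder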

Very quick start

You can run Catcher in Docker with all libraries, drivers and steps already installed and configured. This lets you try Catcher without installing anything locally.

Just run the minimal command:

docker run -v $(pwd)/inventory:/opt/catcher/inventory \
           -v $(pwd)/tests:/opt/catcher/tests \
           catcher -i inventory/my_inventory.yml tests

This tells Catcher to run everything within your local tests folder using inventory/my_inventory.yml. For more information please check the run instructions.
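
The volume mounts above assume a local layout roughly like this (the file names simply follow the examples in this README):

.
├── inventory/
│   └── my_inventory.yml    # mounted to /opt/catcher/inventory
└── tests/
    └── my_test.yml         # mounted to /opt/catcher/tests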

What does it look like?

Imagine you have a user_service which saves users in Postgres and posts them to a Kafka topic, where they are consumed by another service that sends them emails. You describe your environments in inventory files.

local.yml:

kafka_server: '127.0.0.1:9092'
postgres: 'test:test@localhost:5433/test'
user_service: 'http://127.0.0.1:9090'
email_config:
    host: '127.0.0.1:12345'
    user: 'local_user'
    pass: 'local_pass'

develop.yml:

kafka_server: 'kafka.dev.mycompany.com:9092'
postgres: 'dev:dev@postgres.dev.mycompany.com:5433/test'
user_service: 'http://user_service.dev.mycompany.com:9090'
email_config:
    host: 'imap.google.com'
    user: 'my_user@google.com'
    pass: 'my_pass'

You write a test:

variables: # here you specify test-local variables
    users:
        - email: 'test_user@my_company.com'
          name: '{{ random("name") }}' # templates fully supported
        - email: '{{ random("email") }}'
          name: '{{ random("name") }}'
steps: # here you write the steps which Catcher executes one by one until one of them fails
    - http:
        post:
            url: '{{ user_service }}/sign_up' # user_service value is taken from active inventory which you specify at runtime
            body: '{{ users[0] }}' # send first user from variables as a POST body
            headers: {Content-Type: 'application/json'}
            response_code: 2xx # will accept 200-299 codes
        name: 'Register {{ users[0].email }} as a new user' # name your step properly (Optional)
        register: {user_id: '{{ OUTPUT.id }}'}  # register new variable user_id as id param from json response
    - postgres: # check if user was saved in the database
        request:
            conf: '{{ postgres }}'
            sql: 'select * from users where user_id = {{ user_id }}'  # user_id from previous step will be used in this template
        register: {email_in_db: '{{ OUTPUT.email }}'}  # load full user data and register only email as a new variable
    - check: # compare email from the database with real user email
        equals: {the: '{{ users[0].email }}', is: '{{ email_in_db }}'}  # checks the equality of two strings. Templates supported.
    - kafka:
        consume:  # check if user_service pushed newly created user to kafka
            server: '{{ kafka_server }}' # kafka_server value is taken from active inventory
            topic: 'new_users'
            where: # filter all messages except messages for our user
                equals: {the: '{{ MESSAGE.user_id }}', is: '{{ user_id }}'}
    - email: # check if email was sent for this user
          receive:
              config: '{{ email_config }}' # email_config comes from the inventory
              filter: {unread: true, subject: 'Welcome {{ users[0].name }}'} # select all unread and filter by subject
              ack: true  # mark as read
              limit: 1
          register: {messages: '{{ OUTPUT }}'}  # register all messages found (0 or 1)
    - check: '{{ messages | length > 0 }}' # short form of check - we should have more than 0 messages to pass this step
finally:
    - postgres: # delete the user from the database to clean up after the test finishes (whether it passed or not)
        request:
            conf: '{{ postgres }}'
            sql: 'delete from users where user_id = {{ user_id }}'

For the local environment run it as:

catcher -i inventories/local.yml tests/my_test.yml

For dev:

catcher -i inventories/develop.yml tests/my_test.yml

See microservices for a more complex example.

Customization

Catcher can be easily customized to serve your needs.

  1. You can write your own functions and filters and use them in your steps’ templates.

  2. You can create your own modules (in Python, Java, Kotlin, JS, jar-files or any executable).

  3. You can write your own steps in Catcher itself and include them from other tests (see the sketch below).
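
As a rough sketch of point 3 (the include/run form below follows the includes feature from the Catcher docs, so check the includes documentation for the exact syntax; sign_up.yml is a made-up file name):

include:
    file: sign_up.yml    # a reusable test with its own steps
    as: sign_up          # register it under an alias instead of running it on include
steps:
    - run:
        include: sign_up # execute the included test's steps at this point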

Why catcher?

  • don’t repeat test code: write one test and call its steps from another;

  • compute and override variables to check your data and compose new flexible requests;

  • write a test for development once, then switch the inventory to run it against stage/uat/prod with no changes (see the sketch after this list);

  • test your data pipelines with the Airflow step;

  • test your front-end <-> back-end integration with the Selenium step;

  • test all your microservices with ease;

  • modular architecture;

  • bulk-prepare and bulk-check data for your tests with the prepare-expect step;

  • automate your testing!
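
For example, pointing the same test at another environment is just a matter of adding one more inventory file; uat.yml below is a made-up name following the local.yml/develop.yml pattern above:

uat.yml:

kafka_server: 'kafka.uat.mycompany.com:9092'
postgres: 'uat:uat@postgres.uat.mycompany.com:5433/test'
user_service: 'http://user_service.uat.mycompany.com:9090'
email_config:
    host: 'imap.google.com'
    user: 'uat_user@google.com'
    pass: 'uat_pass'

Then run the unchanged test against it:

catcher -i inventories/uat.yml tests/my_test.yml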

Changelog is here.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

catcher-1.36.2.tar.gz (127.0 kB)

Built Distribution

catcher-1.36.2-py3-none-any.whl (150.1 kB)

File details

Details for the file catcher-1.36.2.tar.gz.

File metadata

  • Download URL: catcher-1.36.2.tar.gz
  • Upload date:
  • Size: 127.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for catcher-1.36.2.tar.gz

  • SHA256: f54b1ad7170fb714a172a93a6c8c69867af25771d34430b75ddf7f1b5590685e
  • MD5: 7cfb9b58c94eee67c29621c8936cb761
  • BLAKE2b-256: b129b0ff610ad4339cca5830913381c0bf3f13adccd71495a213da5d8140b157

See more details on using hashes here.
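
If you want to check a downloaded file against the SHA256 digest above, one simple way (assuming a Unix shell with sha256sum available):

sha256sum catcher-1.36.2.tar.gz
# the printed digest should match
# f54b1ad7170fb714a172a93a6c8c69867af25771d34430b75ddf7f1b5590685e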

File details

Details for the file catcher-1.36.2-py3-none-any.whl.

File metadata

  • Download URL: catcher-1.36.2-py3-none-any.whl
  • Upload date:
  • Size: 150.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for catcher-1.36.2-py3-none-any.whl

  • SHA256: 73582a0b91697d6b61368d844ca717536c63a7496e288f2df8e2b7aa646d7d98
  • MD5: 2bfbb2c09eba924112e683e4088edaa7
  • BLAKE2b-256: 943835a718665b1577d6f5dcd8f3223a5dbc96fde7bcf12401f5d0fe18ba5ec4

See more details on using hashes here.
