A Kafka mock library that is designed to be used in integration tests for applications using librdkafka.

Project description

Embedded Kafka (Kafka Simulator) for Python


Embedded Kafka is a mocking library for the confluent_kafka library used for Apache Kafka. Its goal is to ease the effort of writing integration tests that utilize Producer and/or Consumer instances. Of course, you can always spin up your own Kafka cluster just for testing purposes, but that is not always the best solution.

With kafka_mocha you no longer need to have a Kafka cluster running to test your Kafka-related code. Instead, you can use KProducer and KConsumer (by simply decorating your code with @mock_producer/@mock_consumer) and check the behavior of your code - or even inspect the produced and consumed messages in the browser!

Inspiration for this project comes from the moto library, which provides a similar feature for AWS SDK.


Project Overview

The main component of this project is a process called KafkaSimulator, which simulates the behavior of an actual Kafka cluster as far as the implementation allows. The current version includes a KProducer class that acts as a mock for the Producer from the confluent_kafka package. A KConsumer class is still under development.

Table of Contents

Installation

Official Release
pip install kafka_mocha

or using your favorite package manager, e.g. poetry:

poetry add kafka_mocha

Prerelease or Development Version

From GitHub (development version):

pip install git+https://github.com/Effiware/kafka-mocha@develop

or as a published (prerelease) version:

poetry add kafka_mocha --allow-prereleases

Usage

Starting Kafka Simulator

Kafka Simulator is automatically run whenever an instance of either KProducer or KConsumer is created (e.g. via mock_producer or mock_consumer), so there is no need to start it manually.

With default logging settings, custom start-up messages may be visible:

INFO     kafka_simulator > Kafka Simulator initialized
INFO     ticking_thread  > Buffer for KProducer(4368687344): ticking initialized
INFO     buffer_handler  > Buffer for KProducer(4368687344) has been primed, size: 300, timeout: 2
INFO     kafka_simulator > Kafka Simulator initialized
INFO     kafka_simulator > Handle producers has been primed
INFO     kafka_simulator > Kafka Simulator initialized
INFO     ticking_thread  > Buffer for KProducer(4368687344): ticking started

Additionally, all messages produced by KProducer instances are stored in the KafkaSimulator instance. They can be written to an HTML or CSV file by passing the output parameter; see KProducer and outputs for more details.

KProducer

To use the KProducer class in your tests, import the mock_producer decorator from the kafka_mocha package:

import confluent_kafka

from kafka_mocha import mock_producer


@mock_producer()
def handle_produce():
    """Most basic usage of the KProducer class. For more go to `examples` directory."""
    producer = confluent_kafka.Producer({"bootstrap.servers": "localhost:9092"})
    producer.produce("test-topic", "some value".encode(), "key".encode())
    producer.flush()

The KProducer class replicates the interface and behavior of the Producer class from the confluent_kafka library.

Parameters for mock_producer

| No | Parameter name | Parameter type | Comment |
|----|----------------|----------------|---------|
| 1 | loglevel | Literal | See available levels in the logging library |
| 2 | output | dict | Dictionary with output configuration |
| 3 | output.format | Literal | html, csv or int; output format of emitted messages |
| 4 | output.name | str | Name of the output file (HTML only), e.g. kafka-dump.html |
| 5 | output.include_internal_topics | bool | Flag to include internal topics in the output |
| 6 | output.include_markers | bool | Flag to include transaction markers in the output |

KConsumer

The KConsumer class is still under development. It will replicate the interface and behavior of the Consumer class from the confluent_kafka library.

Parameters for mock_consumer

| No | Parameter name | Parameter type | Comment |
|----|----------------|----------------|---------|
| 1 | loglevel | Literal | See available levels in the logging library |
| 2 | | | |
| 3 | | | |

Contributing

We welcome contributions! Before posting your first PR, please see our contributing guidelines for more details.

Also, bear in mind that this project uses Poetry for dependency management. If you are not familiar with it, please first read the Poetry documentation and:

  1. Set up the Poetry environment (recommended)
  2. Don't overwrite the pyproject.toml file manually (Poetry will do it for you)
  3. Don't recreate the poetry.lock (unless you know what you are doing)
Cloning the repository
git clone git@github.com:Effiware/kafka-mocha.git
cd kafka-mocha

Installing dependencies

Default (and recommended) way:

poetry install --with test

Alternative way (using pip):

poetry export -f requirements.txt --output requirements.txt
pip install -r requirements.txt

Running tests

Currently, the test configuration is set up to run with pytest and is kept in the pytest.ini file. You can run the tests with:

poetry run pytest

License

This project is licensed under the MIT License. See the LICENSE file for details.

Project details


Download files

Download the file for your platform.

Source Distribution

kafka_mocha-0.1.0a10.tar.gz (46.5 kB)

Uploaded Source

Built Distribution


kafka_mocha-0.1.0a10-py3-none-any.whl (47.2 kB)

Uploaded Python 3

File details

Details for the file kafka_mocha-0.1.0a10.tar.gz.

File metadata

  • Download URL: kafka_mocha-0.1.0a10.tar.gz
  • Upload date:
  • Size: 46.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for kafka_mocha-0.1.0a10.tar.gz

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | b1e08f9162454c78c313a9807802f1175cf67c9fbef85069d2fdc6a87172edda |
| MD5 | 7f840b528d98ce0dd5ae588cb4bbdb15 |
| BLAKE2b-256 | 270cd8dabd754b65beacd4eaa8727e494730540ff8dd69c1e42042c97e659619 |


Provenance

The following attestation bundles were made for kafka_mocha-0.1.0a10.tar.gz:

Publisher: pypi-publish.yml on Effiware/kafka-mocha

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file kafka_mocha-0.1.0a10-py3-none-any.whl.

File metadata

  • Download URL: kafka_mocha-0.1.0a10-py3-none-any.whl
  • Upload date:
  • Size: 47.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for kafka_mocha-0.1.0a10-py3-none-any.whl

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | 9a212f942107242bceea3a56a34027a878f16d1dc472857b032b282ad9738e1d |
| MD5 | 058efd1ef54ab75a3bd8f9eecd07ddc0 |
| BLAKE2b-256 | f46fcfd5cf35f33a467de54aa8da98e44ded09516b7b77f43d0d7fc5bbcb8c80 |


Provenance

The following attestation bundles were made for kafka_mocha-0.1.0a10-py3-none-any.whl:

Publisher: pypi-publish.yml on Effiware/kafka-mocha

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
