A Kafka mock library designed for use in integration tests of applications built on librdkafka.

# Embedded Kafka (Kafka Simulator) for Python
Embedded Kafka is a mocking library for the confluent_kafka client for Apache Kafka. Its goal is to ease the effort of writing integration tests that use Producer and/or Consumer instances. Of course, you can always spin up your own Kafka cluster just for testing purposes, but that is not always the best solution.

With kafka_mocha you no longer need a running Kafka cluster to test your Kafka-related code. Instead, you can use KProducer and KConsumer (by simply decorating your code with @mock_producer/@mock_consumer) and check the behavior of your code, or even inspect the produced and consumed messages in the browser!

Inspiration for this project comes from the moto library, which provides a similar feature for the AWS SDK.
## Project Overview

The main component of this project is a process called KafkaSimulator, which simulates the behavior of an actual Kafka cluster within the bounds of its implementation limitations. The current version includes a KProducer class that acts as a mock for the Producer from the confluent_kafka package. A KConsumer class is still under development.
## Table of Contents

- Installation
- Usage
- Contributing
- License
## Installation

### Official Release

```shell
pip install kafka_mocha
```

or using your favorite package manager, e.g. Poetry:

```shell
poetry add kafka_mocha
```

### Prerelease or Development Version

From GitHub (development version):

```shell
pip install git+https://github.com/Effiware/kafka-mocha@develop
```

or as a published (prerelease) version:

```shell
poetry add kafka_mocha --allow-prereleases
```
## Usage

### Starting Kafka Simulator

Kafka Simulator is started automatically whenever an instance of either KProducer or KConsumer is created (e.g. via mock_producer, mock_consumer), so there is no need to start it manually.

With default logging settings, custom start-up messages may be visible:
```text
INFO kafka_simulator > Kafka Simulator initialized
INFO ticking_thread > Buffer for KProducer(4368687344): ticking initialized
INFO buffer_handler > Buffer for KProducer(4368687344) has been primed, size: 300, timeout: 2
INFO kafka_simulator > Kafka Simulator initialized
INFO kafka_simulator > Handle producers has been primed
INFO kafka_simulator > Kafka Simulator initialized
INFO ticking_thread > Buffer for KProducer(4368687344): ticking started
```
Additionally, all messages produced by KProducer instances are stored in the KafkaSimulator instance. The messages can be dumped to either an HTML or a CSV file by passing the output parameter; see KProducer and outputs for more details.
### KProducer

To use the KProducer class in your tests, import the mock_producer decorator from the kafka_mocha package:
```python
import confluent_kafka

from kafka_mocha import mock_producer


@mock_producer()
def handle_produce():
    """Most basic usage of the KProducer class. For more go to `examples` directory."""
    producer = confluent_kafka.Producer({"bootstrap.servers": "localhost:9092"})
    producer.produce("test-topic", "some value".encode(), "key".encode())
    producer.flush()
```
The KProducer class replicates the interface and behavior of the Producer class from the confluent_kafka library.
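Because the interface matches, code under test can keep the usual confluent_kafka idioms such as delivery-report callbacks. The sketch below is plain Python and only illustrates the `(err, msg)` callback signature that confluent_kafka uses; the name `on_delivery` is illustrative, not part of this library:

```python
def on_delivery(err, msg):
    """Delivery-report callback: confluent_kafka invokes it with
    (KafkaError or None, Message) once per produced message."""
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}]")
```

You would pass it as `producer.produce(..., on_delivery=on_delivery)` and serve callbacks with `producer.poll(0)` or `producer.flush()`, exactly as against a real cluster.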
#### Parameters for mock_producer
| No | Parameter name | Parameter type | Comment |
|---|---|---|---|
| 1 | loglevel | Literal | See available levels in logging library |
| 2 | output | dict | Dictionary with output configuration |
| 3 | output.format | Literal | html, csv or int - output format of messages emitted |
| 4 | output.name | str | Name of the output file (only for HTML), e.g. kafka-dump.html |
| 5 | output.include_internal_topics | bool | Flag to include internal topics in the output |
| 6 | output.include_markers | bool | Flag to include transaction markers in the output |
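Putting the table together, a plain-Python sketch of an output configuration dict (the file name and flag values here are illustrative examples, not defaults from the library):

```python
# Hypothetical output configuration assembled from the parameters above.
output_config = {
    "format": "html",                   # emit the captured messages as an HTML report
    "name": "kafka-dump.html",          # output file name (used for HTML)
    "include_internal_topics": False,   # leave internal topics out of the dump
    "include_markers": False,           # leave transaction markers out of the dump
}

# It would then be passed as a decorator argument, e.g.:
# @mock_producer(loglevel="INFO", output=output_config)
```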
### KConsumer

The KConsumer class is still under development. It will replicate the interface and behavior of the Consumer class from the confluent_kafka library.
#### Parameters for mock_consumer

| No | Parameter name | Parameter type | Comment |
|---|---|---|---|
| 1 | loglevel | Literal | See available levels in logging library |
## Contributing

We welcome contributions! Before posting your first PR, please see our contributing guidelines for more details.

Also, bear in mind that this project uses Poetry for dependency management. If you are not familiar with it, please first read the Poetry documentation and:

- Set up the Poetry environment (recommended)
- Don't overwrite the pyproject.toml file manually (Poetry will do it for you)
- Don't recreate the poetry.lock file (unless you know what you are doing)
### Cloning the repository

```shell
git clone git@github.com:Effiware/kafka-mocha.git
cd kafka-mocha
```
### Installing dependencies

Default (and recommended) way:

```shell
poetry install --with test
```

Standard way:

```shell
poetry export -f requirements.txt --output requirements.txt
pip install -r requirements.txt
```
### Running tests

Currently, the test configuration is set up to run with pytest and is kept in the pytest.ini file. You can run the tests with:

```shell
poetry run pytest
```
## License

This project is licensed under the MIT License. See the LICENSE file for details.