
Kafka resource for Tamarco microservice framework.

Project description



It runs a confluent-kafka client in a thread.

This repository is a plugin for Tamarco; for more information, see the Tamarco main repository.


This resource depends on the following configuration:

                bootstrap_servers: kafka:9092

The bootstrap servers are the addresses of the members of a Kafka cluster, separated by commas.
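For example, a cluster with three brokers could be configured like this (the broker hostnames, and any keys surrounding the setting in your Tamarco settings file, are illustrative assumptions):

```yaml
# Illustrative only: hostnames and surrounding structure are assumptions.
kafka:
  bootstrap_servers: kafka-1:9092,kafka-2:9092,kafka-3:9092
```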

Inputs and outputs

The inputs and outputs need to be declared in the resource.


The input can be used with two different patterns: as a decorator or as an async stream.

This resource only supports balanced consumer groups with auto commit.
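In confluent-kafka terms, "balanced consumer groups with auto commit" corresponds to consumer settings along these lines. This is a sketch of what the resource presumably builds internally, not its actual code; `group.id` and `` are the standard confluent-kafka option names:

```python
# Sketch of the consumer settings implied by "balanced consumer groups
# with auto commit". Illustrative only; the resource's real internals
# may differ.
consumer_settings = {
    "bootstrap.servers": "kafka:9092",  # from the resource configuration
    "group.id": "input_example",        # consumers sharing a group.id balance partitions
    "": True,        # offsets are committed automatically
}

# With confluent-kafka installed, these settings would typically be used as:
#   from confluent_kafka import Consumer
#   consumer = Consumer(consumer_settings)
#   consumer.subscribe(["metrics"])
```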

Async stream

This usage case uses the input as an asynchronous iterator to consume the metric stream.

class MyMicroservice(Microservice):
    name = "input_example"

    metrics_input = KafkaInput(topic='metrics', codec=JsonCodec)
    kafka = KafkaResource(inputs=[metrics_input])

    async def metrics_consumer(self):
        async for metric in self.metrics_input:
  f'Consumed message from metrics topic: {metric}')


Decorator

This usage case declares a function as the handler of the messages; the resource automatically spawns a coroutine to consume each message.

class MyMicroservice(Microservice):
    name = "input_example"

    kafka = KafkaResource()

    @KafkaInput(resource=kafka, topic='metrics', codec=JsonCodec)
    async def metrics_handler(self, message):f'Consumed message from metrics topic: {message}')


Output

The output is a Kafka producer that is very simple to use.

class MyMicroservice(Microservice):
    name = "output_example"
    metrics_output = KafkaOutput(topic='metrics', codec=JsonCodec)
    kafka = KafkaResource(outputs=[metrics_output])

    @task_timer(interval=1000, autostart=True)
    async def metrics_producer(self):
        metrics_message = {'metrics': {'cat': 'MEOW'}}
        await self.metrics_output.push(metrics_message)f'Produced message {metrics_message} to metrics topic')
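Both examples pass `JsonCodec`, whose job is plain serialization. A JSON codec equivalent to what `JsonCodec` presumably does can be sketched with the standard library (the `encode`/`decode` method names and the bytes-in/bytes-out contract are assumptions):

```python
import json

# Minimal sketch of a JSON codec. The interface is an assumption about
# what JsonCodec does, not the tamarco_kafka implementation.
class SketchJsonCodec:
    @staticmethod
    def encode(message: dict) -> bytes:
        # Serialize the message to the bytes written to the Kafka topic.
        return json.dumps(message).encode("utf-8")

    @staticmethod
    def decode(payload: bytes) -> dict:
        # Deserialize the bytes read from the Kafka topic.
        return json.loads(payload.decode("utf-8"))

payload = SketchJsonCodec.encode({"metrics": {"cat": "MEOW"}})
```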

How to run the examples

To run them, install the requirements, start the docker-compose environment, and execute an example:

pip install -r examples/requirements.txt
docker-compose up -d
python examples/

Source Distribution

tamarco-kafka-0.1.0.tar.gz (15.5 kB)
