FastKafka
Effortless Kafka integration for your web services
FastKafka is a powerful and easy-to-use Python library for building asynchronous web services that interact with Kafka topics. Built on top of FastAPI, Starlette, Pydantic, AIOKafka and AsyncAPI, FastKafka simplifies the process of writing producers and consumers for Kafka topics, handling all the parsing, networking, task scheduling and data generation automatically. With FastKafka, you can quickly prototype and develop high-performance Kafka-based services with minimal code, making it an ideal choice for developers looking to streamline their workflow and accelerate their projects.
Install
FastKafka works on macOS, Linux, and most Unix-style operating systems.
You can install it with pip
as usual:
pip install fastkafka
Writing server code
Here is an example Python script using FastKafka that takes data from a Kafka topic, makes a prediction using a predictive model, and outputs the prediction to another Kafka topic.
Messages
FastKafka uses Pydantic to parse JSON-encoded input data into Python objects, making it easy to work with structured data in your Kafka-based applications. Pydantic's BaseModel class lets you define messages using a declarative syntax, specifying the fields and types of each message.
This example defines two message classes for use in a FastKafka application:
- The InputData class is used to represent input data for a predictive model. It has three fields: user_id, feature_1, and feature_2. The user_id field is of type NonNegativeInt, a subclass of int that allows only non-negative integers. The feature_1 and feature_2 fields are lists of floating-point numbers and integers, respectively.
- The Prediction class is used to represent the output of the predictive model. It has two fields: user_id and score. The score field is a floating-point number representing the prediction made by the model, such as the probability of churn in the next 28 days.
These message classes will be used to parse and validate incoming data in Kafka consumers and producers.
from typing import List
from pydantic import BaseModel, Field, NonNegativeInt
class InputData(BaseModel):
user_id: NonNegativeInt = Field(..., example=202020, description="ID of a user")
feature_1: List[float] = Field(
...,
example=[1.2, 2.3, 4.5, 6.7, 0.1],
description="input feature 1",
)
feature_2: List[int] = Field(
...,
example=[2, 4, 3, 1, 0],
description="input feature 2",
)
class Prediction(BaseModel):
user_id: NonNegativeInt = Field(..., example=202020, description="ID of a user")
score: float = Field(
...,
example=0.4321,
description="Prediction score (e.g. the probability of churn in the next 28 days)",
ge=0.0,
le=1.0,
)
These message classes will be used to parse and validate incoming data in a Kafka consumer and to produce a JSON-encoded message in a producer. Using Pydantic’s BaseModel in combination with FastKafka makes it easy to work with structured data in your Kafka-based applications.
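You do not need to call these parsing and encoding functions yourself; FastKafka does this for you. The following minimal sketch (not part of the example service) only illustrates what happens under the hood, assuming the InputData and Prediction classes defined above and the Pydantic v1 API used throughout this example:

```python
# Illustrative sketch only: FastKafka performs these steps automatically.
raw = b'{"user_id": 202020, "feature_1": [1.2, 2.3], "feature_2": [2, 4]}'

# A consumed message is parsed and validated before reaching the consumer function:
input_data = InputData.parse_raw(raw)

# The value returned by a producer function is JSON-encoded before being sent:
prediction = Prediction(user_id=input_data.user_id, score=0.4321)
payload = prediction.json().encode("utf-8")
```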
Application
This example shows how to initialize a FastKafka application.
It starts by defining a dictionary called kafka_brokers, which contains two entries, "localhost" and "production", specifying the local development and production Kafka brokers. Each entry specifies the URL, port, and other details of a Kafka broker. This dictionary is used only for generating the documentation and is not checked by the actual server.
Next, an object of the FastAPI class is created. Its role is to serve the documentation and to start and shut down FastKafka.

Finally, an object of the FastKafka class is initialized with the minimum set of arguments:
- app: a FastAPI application used for serving the documentation and starting/shutting down the service
- kafka_brokers: a dictionary used for generating the documentation
- bootstrap_servers: a host[:port] string or a list of host[:port] strings that a consumer or producer should contact to bootstrap initial cluster metadata
from os import environ
from fastapi import FastAPI
from fastkafka.application import FastKafka
kafka_brokers = {
"localhost": {
"url": "localhost",
"description": "local development kafka broker",
"port": 9092,
},
"production": {
"url": "kafka.airt.ai",
"description": "production kafka broker",
"port": 9092,
"protocol": "kafka-secure",
"security": {"type": "plain"},
},
}
app = FastAPI(
title="FastKafka Example",
contact={"name": "airt.ai", "url": "https://airt.ai", "email": "info@airt.ai"},
version="0.0.1",
description="A simple example on how to use FastKafka",
)
bootstrap_servers = f"{environ['KAFKA_HOSTNAME']}:{environ['KAFKA_PORT']}"
kafka_app = FastKafka(
app,
kafka_brokers=kafka_brokers,
bootstrap_servers=bootstrap_servers,
)
Function decorators
FastKafka provides convenient function decorators @kafka_app.consumes
and @kafka_app.produces
to allow you to delegate the actual process of
-
consuming and producing data to Kafka, and
-
decoding and encoding JSON encode messages
from user defined functions to the framework. The FastKafka framework delegates these jobs to AIOKafka and Pydantic libraries.
These decorators make it easy to specify the processing logic for your Kafka consumers and producers, allowing you to focus on the core business logic of your application without worrying about the underlying Kafka integration.
The following example shows how to use the @kafka_app.consumes and @kafka_app.produces decorators in a FastKafka application:
- The @kafka_app.consumes decorator is applied to the on_input_data function, which specifies that this function should be called whenever a message is received on the "input_data" Kafka topic. The on_input_data function takes a single argument, which is expected to be an instance of the InputData message class. Specifying the type of that argument instructs Pydantic to use InputData.parse_raw() on the consumed message before passing it to the user-defined function on_input_data.
- The @kafka_app.produces decorator is applied to the to_predictions function, which specifies that this function should produce a message to the "predictions" Kafka topic whenever it is called. The to_predictions function takes two arguments: user_id and score. It creates a new Prediction message with these values and returns it. The framework will call Prediction.json().encode("utf-8") on the returned value and produce the result to the specified topic.
@kafka_app.consumes(topic="input_data", auto_offset_reset="latest", group_id="my_group")
async def on_input_data(msg: InputData):
global model
score = await model.predict(feature_1=msg.feature_1, feature_2=msg.feature_2)
await to_predictions(user_id=msg.user_id, score=score)
@kafka_app.produces(topic="predictions")
async def to_predictions(user_id: int, score: float) -> Prediction:
prediction = Prediction(user_id=user_id, score=score)
return prediction
# This is a mock for testing; it should be replaced with the real model
class Model:
    async def predict(self, feature_1: List[float], feature_2: List[int]) -> float:
        return 0.87
model = Model()
Running the service
The service can be started using any ASGI web server such as Uvicorn in the same way as starting the FastAPI application.
Starting the service using Uvicorn
This example shows how to start the FastKafka service using Uvicorn. We will concatenate the code snippets from above and save them in a file named "server.py".

The file "server.py" contains the following code:
```python
from typing import List
from pydantic import BaseModel, Field, NonNegativeInt
class InputData(BaseModel):
user_id: NonNegativeInt = Field(..., example=202020, description="ID of a user")
feature_1: List[float] = Field(
...,
example=[1.2, 2.3, 4.5, 6.7, 0.1],
description="input feature 1",
)
feature_2: List[int] = Field(
...,
example=[2, 4, 3, 1, 0],
description="input feature 2",
)
class Prediction(BaseModel):
user_id: NonNegativeInt = Field(..., example=202020, description="ID of a user")
score: float = Field(
...,
example=0.4321,
description="Prediction score (e.g. the probability of churn in the next 28 days)",
ge=0.0,
le=1.0,
)
from os import environ
from fastapi import FastAPI
from fastkafka.application import FastKafka
kafka_brokers = {
"localhost": {
"url": "localhost",
"description": "local development kafka broker",
"port": 9092,
},
"production": {
"url": "kafka.airt.ai",
"description": "production kafka broker",
"port": 9092,
"protocol": "kafka-secure",
"security": {"type": "plain"},
},
}
app = FastAPI(
title="FastKafka Example",
contact={"name": "airt.ai", "url": "https://airt.ai", "email": "info@airt.ai"},
version="0.0.1",
description="A simple example on how to use FastKafka",
)
bootstrap_servers = f"{environ['KAFKA_HOSTNAME']}:{environ['KAFKA_PORT']}"
kafka_app = FastKafka(
app,
kafka_brokers=kafka_brokers,
bootstrap_servers=bootstrap_servers,
)
@kafka_app.consumes(topic="input_data", auto_offset_reset="latest", group_id="my_group")
async def on_input_data(msg: InputData):
global model
score = await model.predict(feature_1=msg.feature_1, feature_2=msg.feature_2)
await to_predictions(user_id=msg.user_id, score=score)
@kafka_app.produces(topic="predictions")
async def to_predictions(user_id: int, score: float) -> Prediction:
prediction = Prediction(user_id=user_id, score=score)
return prediction
# This is a mock for testing; it should be replaced with the real model
class Model:
    async def predict(self, feature_1: List[float], feature_2: List[int]) -> float:
        return 0.87
model = Model()
```
We start by generating the documentation for the FastKafka service by running the following command:
fastkafka generate-docs server:kafka_app
Next, we start the Uvicorn server by running:
uvicorn server:app
The server:app notation tells Uvicorn to import the symbol app from the module server, typically defined in a file named "server.py".
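The log output captured below shows a parent process and two server processes, which suggests the example service was started with multiple Uvicorn workers. Assuming the standard Uvicorn command-line options, such a run could be started with:

uvicorn server:app --workers 2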
[INFO] fastkafka.testing: Generating docs for: server:kafka_app
[INFO] fastkafka._components.asyncapi: Old async specifications at '/tmp/tmp0ns3f279/asyncapi/spec/asyncapi.yml' does not exist.
[INFO] fastkafka._components.asyncapi: New async specifications generated at: '/tmp/tmp0ns3f279/asyncapi/spec/asyncapi.yml'
[INFO] fastkafka._components.asyncapi: Async docs generated at 'asyncapi/docs'
[INFO] fastkafka._components.asyncapi: Output of '$ npx -y -p @asyncapi/generator ag asyncapi/spec/asyncapi.yml @asyncapi/html-template -o asyncapi/docs --force-write'
npm WARN deprecated har-validator@5.1.5: this library is no longer supported
npm WARN deprecated uuid@3.4.0: Please upgrade to version 7 or higher. Older versions may use Math.random() in certain circumstances, which is known to be problematic. See https://v8.dev/blog/math-random for details.
npm WARN deprecated request@2.88.2: request has been deprecated, see https://github.com/request/request/issues/3142
npm WARN deprecated readdir-scoped-modules@1.1.0: This functionality has been moved to @npmcli/fs
npm WARN deprecated @npmcli/move-file@1.1.2: This functionality has been moved to @npmcli/fs
npm WARN deprecated mkdirp@0.3.5: Legacy versions of mkdirp are no longer supported. Please update to mkdirp 1.x. (Note that the API surface has changed to use Promises in 1.x.)
npm WARN deprecated mkdirp@0.3.5: Legacy versions of mkdirp are no longer supported. Please update to mkdirp 1.x. (Note that the API surface has changed to use Promises in 1.x.)
Done! ✨
Check out your shiny new generated files at /tmp/tmp0ns3f279/asyncapi/docs.
npm notice
npm notice New minor version of npm available! 9.2.0 -> 9.3.1
npm notice Changelog: <https://github.com/npm/cli/releases/tag/v9.3.1>
npm notice Run `npm install -g npm@9.3.1` to update!
npm notice
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO: Started parent process [2367]
INFO: Started server process [2369]
INFO: Waiting for application startup.
INFO: Started server process [2370]
INFO: Waiting for application startup.
[INFO] fastkafka._components.asyncapi: Keeping the old async specifications at: '/tmp/tmp0ns3f279/asyncapi/spec/asyncapi.yml'
[INFO] fastkafka._components.asyncapi: Skipping generating async documentation in '/tmp/tmp0ns3f279/asyncapi/docs'
[INFO] fastkafka.application: _create_producer() : created producer using the config: '{'bootstrap_servers': 'davor-fastkafka-kafka-1:9092'}'
[INFO] fastkafka._components.asyncapi: Keeping the old async specifications at: '/tmp/tmp0ns3f279/asyncapi/spec/asyncapi.yml'
[INFO] fastkafka._components.asyncapi: Skipping generating async documentation in '/tmp/tmp0ns3f279/asyncapi/docs'
[INFO] fastkafka.application: _create_producer() : created producer using the config: '{'bootstrap_servers': 'davor-fastkafka-kafka-1:9092'}'
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'bootstrap_servers': 'davor-fastkafka-kafka-1:9092', 'auto_offset_reset': 'latest', 'max_poll_records': 100, 'group_id': 'my_group'}
INFO: Application startup complete.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.
[INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'input_data'})
[INFO] aiokafka.consumer.consumer: Subscribed to topic(s): {'input_data'}
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'bootstrap_servers': 'davor-fastkafka-kafka-1:9092', 'auto_offset_reset': 'latest', 'max_poll_records': 100, 'group_id': 'my_group'}
INFO: Application startup complete.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.
[INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'input_data'})
[INFO] aiokafka.consumer.consumer: Subscribed to topic(s): {'input_data'}
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.
[ERROR] aiokafka.consumer.group_coordinator: Group Coordinator Request failed: [Error 15] CoordinatorNotAvailableError
[ERROR] aiokafka.cluster: Topic input_data not found in cluster metadata
[ERROR] aiokafka.consumer.group_coordinator: Group Coordinator Request failed: [Error 15] CoordinatorNotAvailableError
[WARNING] aiokafka.cluster: Topic input_data is not available during auto-create initialization
[ERROR] aiokafka.consumer.group_coordinator: Group Coordinator Request failed: [Error 15] CoordinatorNotAvailableError
[WARNING] aiokafka.cluster: Topic input_data is not available during auto-create initialization
[ERROR] aiokafka.consumer.group_coordinator: Group Coordinator Request failed: [Error 15] CoordinatorNotAvailableError
[WARNING] aiokafka.cluster: Topic input_data is not available during auto-create initialization
[INFO] aiokafka.consumer.group_coordinator: Discovered coordinator 1003 for group my_group
[INFO] aiokafka.consumer.group_coordinator: Revoking previously assigned partitions set() for group my_group
[INFO] aiokafka.consumer.group_coordinator: (Re-)joining group my_group
[INFO] aiokafka.consumer.group_coordinator: Discovered coordinator 1003 for group my_group
[INFO] aiokafka.consumer.group_coordinator: Revoking previously assigned partitions set() for group my_group
[INFO] aiokafka.consumer.group_coordinator: (Re-)joining group my_group
[INFO] aiokafka.consumer.group_coordinator: Joined group 'my_group' (generation 1) with member_id aiokafka-0.8.0-b6b4b108-bd0d-4759-bac6-3115811deff9
[INFO] aiokafka.consumer.group_coordinator: Elected group leader -- performing partition assignments using roundrobin
[WARNING] kafka.coordinator.assignors.roundrobin: No partition metadata for topic input_data
[INFO] aiokafka.consumer.group_coordinator: (Re-)joining group my_group
[INFO] aiokafka.consumer.group_coordinator: Joined group 'my_group' (generation 2) with member_id aiokafka-0.8.0-b6b4b108-bd0d-4759-bac6-3115811deff9
[INFO] aiokafka.consumer.group_coordinator: Elected group leader -- performing partition assignments using roundrobin
[WARNING] kafka.coordinator.assignors.roundrobin: No partition metadata for topic input_data
[INFO] aiokafka.consumer.group_coordinator: Joined group 'my_group' (generation 2) with member_id aiokafka-0.8.0-4172ff84-e85a-4f92-ac86-763696c34e7c
[INFO] aiokafka.consumer.group_coordinator: Successfully synced group my_group with generation 2
[INFO] aiokafka.consumer.group_coordinator: Setting newly assigned partitions set() for group my_group
[INFO] aiokafka.consumer.group_coordinator: Successfully synced group my_group with generation 2
[INFO] aiokafka.consumer.group_coordinator: Setting newly assigned partitions set() for group my_group
INFO: Shutting down
INFO: Waiting for application shutdown.
[INFO] aiokafka.consumer.group_coordinator: LeaveGroup request succeeded
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.
INFO: Application shutdown complete.
INFO: Finished server process [2369]
INFO: Shutting down
INFO: Waiting for application shutdown.
[INFO] aiokafka.consumer.group_coordinator: LeaveGroup request succeeded
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.
INFO: Application shutdown complete.
INFO: Finished server process [2370]
INFO: Stopping parent process [2367]
When the service is started, several log messages are printed to the console, including information about the application startup, AsyncAPI specification generation, and consumer loop status.
During the lifetime of the service, incoming messages will be processed
by the FastKafka application and appropriate actions will be taken based
on the defined Kafka consumers and producers. For example, if a message
is received on the “input_data” Kafka topic, the on_input_data
function will be called to process the message, and if the
to_predictions
function is called, it will produce a message to the
“predictions” Kafka topic. The service will continue to run until it is
shut down, at which point the application shutdown process will be
initiated and the service will stop.
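As a quick sanity check, you can also send a message to the “input_data” topic yourself. The following is a minimal sketch (not part of the original example) that uses AIOKafka directly, the same library FastKafka uses under the hood; it assumes the "server.py" module from above and a broker reachable at KAFKA_HOSTNAME:KAFKA_PORT:

```python
# Minimal sketch: produce one test message to the "input_data" topic with AIOKafka.
# Assumes the InputData class from "server.py" and a broker at KAFKA_HOSTNAME:KAFKA_PORT.
import asyncio
from os import environ

from aiokafka import AIOKafkaProducer

from server import InputData


async def send_test_message() -> None:
    producer = AIOKafkaProducer(
        bootstrap_servers=f"{environ['KAFKA_HOSTNAME']}:{environ['KAFKA_PORT']}"
    )
    await producer.start()
    try:
        msg = InputData(user_id=1, feature_1=[0.1, 0.2], feature_2=[1, 2])
        # The running service will parse this JSON back into an InputData object
        # and produce a corresponding Prediction to the "predictions" topic.
        await producer.send_and_wait("input_data", msg.json().encode("utf-8"))
    finally:
        await producer.stop()


asyncio.run(send_test_message())
```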
Checking out the documentation
FastKafka automatically generates documentation from the functions decorated with @kafka_app.consumes and @kafka_app.produces using AsyncAPI, and then uses FastAPI to serve it. When the service is started with Uvicorn as in the example above, you will see the following lines logging the generation of the docs:
Done! ✨
Check out your shiny new generated files at //tmp/tmp********/asyncapi/docs.
and the following line for the web server that serves them:
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
Clicking on the link above redirects to the following URL:
http://127.0.0.1:8000/asyncapi/index.html
and opens the web page with the documentation.
The first section of the documentation relates to servers and is generated from the kafka_brokers parameter passed to the constructor of the FastKafka object:
{'localhost': {'description': 'local development kafka broker',
'port': 9092,
'url': 'localhost'},
'production': {'description': 'production kafka broker',
'port': 9092,
'protocol': 'kafka-secure',
'security': {'type': 'plain'},
'url': 'kafka.airt.ai'}}
The generated documentation is as follows:
Next, you can see the documentation generated from the @kafka_app.consumes decorator when used on the function on_input_data with a single parameter of type InputData:
class InputData(BaseModel):
user_id: NonNegativeInt = Field(..., example=202020, description="ID of a user")
feature_1: List[float] = Field(
...,
example=[1.2, 2.3, 4.5, 6.7, 0.1],
description="input feature 1",
)
feature_2: List[int] = Field(
...,
example=[2, 4, 3, 1, 0],
description="input feature 2",
)
@kafka_app.consumes(topic="input_data", auto_offset_reset="latest", group_id="my_group")
async def on_input_data(msg: InputData):
global model
score = await model.predict(feature_1=msg.feature_1, feature_2=msg.feature_2)
await to_predictions(user_id=msg.user_id, score=score)
The resulting documentation is generated as follows:
Testing the service
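The snippet below tests the service end to end: it creates the "input_data" and "predictions" topics if they are missing, starts the service with Uvicorn (via the run_on_uvicorn helper and the script variable, which are assumed to be defined earlier), generates 100,000 input messages, produces them to the "input_data" topic, and consumes the resulting predictions from the "predictions" topic.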
from os import environ
import anyio
import asyncer
from tqdm.notebook import tqdm, trange
from fastkafka.helpers import (
consumes_messages,
create_missing_topics,
produce_messages,
wait_for_get_url,
)
bootstrap_servers = f"{environ['KAFKA_HOSTNAME']}:{environ['KAFKA_PORT']}"
create_missing_topics(
["input_data", "predictions"],
bootstrap_servers=bootstrap_servers,
)
[INFO] fastkafka.helpers: create_missing_topics(['input_data', 'predictions']): new_topics = [NewTopic(topic=predictions,num_partitions=3)]
msgs = [
dict(user_id=i, feature_1=[(i / 1_000) ** 2], feature_2=[i % 177])
for i in trange(100_000, desc="generating messages")
]
async with asyncer.create_task_group() as tg:
tg.soonify(run_on_uvicorn)(script, cancel_after=45, workers=4)
await wait_for_get_url(
"http://127.0.0.1:8000", timeout=30, desc="waiting for uvicorn"
)
tg.soonify(consumes_messages)(
msgs_count=len(msgs), topic="predictions", bootstrap_servers=bootstrap_servers
)
await anyio.sleep(2)
tg.soonify(produce_messages)(
msgs=msgs, topic="input_data", bootstrap_servers=bootstrap_servers
)
generating messages: 0%| | 0/100000 [00:00<?, ?it/s]
waiting for uvicorn: 0%| | 0/30 [00:00<?, ?it/s]
[INFO] fastkafka.testing: Generating docs for: server:kafka_app
[INFO] fastkafka._components.asyncapi: Old async specifications at '/tmp/tmpt3t46s4d/asyncapi/spec/asyncapi.yml' does not exist.
[INFO] fastkafka._components.asyncapi: New async specifications generated at: '/tmp/tmpt3t46s4d/asyncapi/spec/asyncapi.yml'
[INFO] fastkafka._components.asyncapi: Async docs generated at 'asyncapi/docs'
[INFO] fastkafka._components.asyncapi: Output of '$ npx -y -p @asyncapi/generator ag asyncapi/spec/asyncapi.yml @asyncapi/html-template -o asyncapi/docs --force-write'
Done! ✨
Check out your shiny new generated files at /tmp/tmpt3t46s4d/asyncapi/docs.
[INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'predictions'})
[INFO] aiokafka.consumer.group_coordinator: Metadata for topic has changed from {} to {'predictions': 3}.
consuming from 'predictions': 0%| | 0/100000 [00:00<?, ?it/s]
producing to 'input_data': 0%| | 0/100000 [00:00<?, ?it/s]
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO: Started parent process [2407]
INFO: Started server process [2410]
INFO: Waiting for application startup.
INFO: Started server process [2412]
INFO: Waiting for application startup.
INFO: Started server process [2409]
INFO: Waiting for application startup.
INFO: Started server process [2411]
INFO: Waiting for application startup.
[INFO] fastkafka._components.asyncapi: Keeping the old async specifications at: '/tmp/tmpt3t46s4d/asyncapi/spec/asyncapi.yml'
[INFO] fastkafka._components.asyncapi: Skipping generating async documentation in '/tmp/tmpt3t46s4d/asyncapi/docs'
[INFO] fastkafka.application: _create_producer() : created producer using the config: '{'bootstrap_servers': 'davor-fastkafka-kafka-1:9092'}'
[INFO] fastkafka._components.asyncapi: Keeping the old async specifications at: '/tmp/tmpt3t46s4d/asyncapi/spec/asyncapi.yml'
[INFO] fastkafka._components.asyncapi: Skipping generating async documentation in '/tmp/tmpt3t46s4d/asyncapi/docs'
[INFO] fastkafka.application: _create_producer() : created producer using the config: '{'bootstrap_servers': 'davor-fastkafka-kafka-1:9092'}'
[INFO] fastkafka._components.asyncapi: Keeping the old async specifications at: '/tmp/tmpt3t46s4d/asyncapi/spec/asyncapi.yml'
[INFO] fastkafka._components.asyncapi: Skipping generating async documentation in '/tmp/tmpt3t46s4d/asyncapi/docs'
[INFO] fastkafka.application: _create_producer() : created producer using the config: '{'bootstrap_servers': 'davor-fastkafka-kafka-1:9092'}'
[INFO] fastkafka._components.asyncapi: Keeping the old async specifications at: '/tmp/tmpt3t46s4d/asyncapi/spec/asyncapi.yml'
[INFO] fastkafka._components.asyncapi: Skipping generating async documentation in '/tmp/tmpt3t46s4d/asyncapi/docs'
[INFO] fastkafka.application: _create_producer() : created producer using the config: '{'bootstrap_servers': 'davor-fastkafka-kafka-1:9092'}'
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'bootstrap_servers': 'davor-fastkafka-kafka-1:9092', 'auto_offset_reset': 'latest', 'max_poll_records': 100, 'group_id': 'my_group'}
INFO: Application startup complete.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'bootstrap_servers': 'davor-fastkafka-kafka-1:9092', 'auto_offset_reset': 'latest', 'max_poll_records': 100, 'group_id': 'my_group'}
INFO: Application startup complete.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.
[INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'input_data'})
[INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'input_data'})
[INFO] aiokafka.consumer.consumer: Subscribed to topic(s): {'input_data'}
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.
[INFO] aiokafka.consumer.consumer: Subscribed to topic(s): {'input_data'}
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'bootstrap_servers': 'davor-fastkafka-kafka-1:9092', 'auto_offset_reset': 'latest', 'max_poll_records': 100, 'group_id': 'my_group'}
INFO: Application startup complete.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'bootstrap_servers': 'davor-fastkafka-kafka-1:9092', 'auto_offset_reset': 'latest', 'max_poll_records': 100, 'group_id': 'my_group'}
INFO: Application startup complete.
[INFO] aiokafka.consumer.group_coordinator: Discovered coordinator 1003 for group my_group
[INFO] aiokafka.consumer.group_coordinator: Revoking previously assigned partitions set() for group my_group
[INFO] aiokafka.consumer.group_coordinator: (Re-)joining group my_group
[INFO] aiokafka.consumer.group_coordinator: Joined group 'my_group' (generation 4) with member_id aiokafka-0.8.0-f6481c0e-b301-4ed6-9532-ee2e34977057
[INFO] aiokafka.consumer.group_coordinator: Elected group leader -- performing partition assignments using roundrobin
[INFO] aiokafka.consumer.group_coordinator: Discovered coordinator 1003 for group my_group
[INFO] aiokafka.consumer.group_coordinator: Revoking previously assigned partitions set() for group my_group
[INFO] aiokafka.consumer.group_coordinator: (Re-)joining group my_group
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.
[INFO] aiokafka.consumer.group_coordinator: Successfully synced group my_group with generation 4
[INFO] aiokafka.consumer.group_coordinator: Setting newly assigned partitions {TopicPartition(topic='input_data', partition=0)} for group my_group
[INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'input_data'})
[INFO] aiokafka.consumer.consumer: Subscribed to topic(s): {'input_data'}
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.
[INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'input_data'})
[INFO] aiokafka.consumer.consumer: Subscribed to topic(s): {'input_data'}
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.
[INFO] aiokafka.consumer.group_coordinator: Discovered coordinator 1003 for group my_group
[INFO] aiokafka.consumer.group_coordinator: Revoking previously assigned partitions set() for group my_group
[INFO] aiokafka.consumer.group_coordinator: (Re-)joining group my_group
[INFO] aiokafka.consumer.group_coordinator: Discovered coordinator 1003 for group my_group
[INFO] aiokafka.consumer.group_coordinator: Revoking previously assigned partitions set() for group my_group
[INFO] aiokafka.consumer.group_coordinator: (Re-)joining group my_group
INFO: 127.0.0.1:47648 - "GET / HTTP/1.1" 307 Temporary Redirect
INFO: 127.0.0.1:47648 - "GET /asyncapi HTTP/1.1" 307 Temporary Redirect
INFO: 127.0.0.1:47648 - "GET /index.html HTTP/1.1" 200 OK
[WARNING] aiokafka.consumer.group_coordinator: Heartbeat failed for group my_group because it is rebalancing
[INFO] aiokafka.consumer.group_coordinator: Revoking previously assigned partitions frozenset({TopicPartition(topic='input_data', partition=0)}) for group my_group
[INFO] aiokafka.consumer.group_coordinator: (Re-)joining group my_group
[INFO] aiokafka.consumer.group_coordinator: Joined group 'my_group' (generation 5) with member_id aiokafka-0.8.0-f6481c0e-b301-4ed6-9532-ee2e34977057
[INFO] aiokafka.consumer.group_coordinator: Elected group leader -- performing partition assignments using roundrobin
[INFO] aiokafka.consumer.group_coordinator: Joined group 'my_group' (generation 5) with member_id aiokafka-0.8.0-97685270-e1f6-4247-91e0-6b00c42919f8
[INFO] aiokafka.consumer.group_coordinator: Joined group 'my_group' (generation 5) with member_id aiokafka-0.8.0-53ab1c22-d188-4aca-bfeb-dad1574be34f
[INFO] aiokafka.consumer.group_coordinator: Joined group 'my_group' (generation 5) with member_id aiokafka-0.8.0-176f0cae-7bad-4620-9955-b182b1f0804f
[INFO] aiokafka.consumer.group_coordinator: Successfully synced group my_group with generation 5
[INFO] aiokafka.consumer.group_coordinator: Setting newly assigned partitions set() for group my_group
[INFO] aiokafka.consumer.group_coordinator: Successfully synced group my_group with generation 5
[INFO] aiokafka.consumer.group_coordinator: Setting newly assigned partitions set() for group my_group
[INFO] aiokafka.consumer.group_coordinator: Successfully synced group my_group with generation 5
[INFO] aiokafka.consumer.group_coordinator: Successfully synced group my_group with generation 5
[INFO] aiokafka.consumer.group_coordinator: Setting newly assigned partitions set() for group my_group
[INFO] aiokafka.consumer.group_coordinator: Setting newly assigned partitions {TopicPartition(topic='input_data', partition=0)} for group my_group
INFO: Shutting down
INFO: Waiting for application shutdown.
[INFO] aiokafka.consumer.group_coordinator: LeaveGroup request succeeded
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.
INFO: Application shutdown complete.
INFO: Finished server process [2409]
INFO: Shutting down
[WARNING] aiokafka.consumer.group_coordinator: Heartbeat failed for group my_group because it is rebalancing
[INFO] aiokafka.consumer.group_coordinator: Revoking previously assigned partitions frozenset({TopicPartition(topic='input_data', partition=0)}) for group my_group
[INFO] aiokafka.consumer.group_coordinator: (Re-)joining group my_group
[WARNING] aiokafka.consumer.group_coordinator: Heartbeat failed for group my_group because it is rebalancing
[INFO] aiokafka.consumer.group_coordinator: Revoking previously assigned partitions frozenset() for group my_group
[INFO] aiokafka.consumer.group_coordinator: (Re-)joining group my_group
[WARNING] aiokafka.consumer.group_coordinator: Heartbeat failed for group my_group because it is rebalancing
[INFO] aiokafka.consumer.group_coordinator: Revoking previously assigned partitions frozenset() for group my_group
[INFO] aiokafka.consumer.group_coordinator: (Re-)joining group my_group
[INFO] aiokafka.consumer.group_coordinator: Joined group 'my_group' (generation 6) with member_id aiokafka-0.8.0-f6481c0e-b301-4ed6-9532-ee2e34977057
[INFO] aiokafka.consumer.group_coordinator: Joined group 'my_group' (generation 6) with member_id aiokafka-0.8.0-97685270-e1f6-4247-91e0-6b00c42919f8
[INFO] aiokafka.consumer.group_coordinator: Elected group leader -- performing partition assignments using roundrobin
[INFO] aiokafka.consumer.group_coordinator: Joined group 'my_group' (generation 6) with member_id aiokafka-0.8.0-176f0cae-7bad-4620-9955-b182b1f0804f
[INFO] aiokafka.consumer.group_coordinator: Successfully synced group my_group with generation 6
[INFO] aiokafka.consumer.group_coordinator: Successfully synced group my_group with generation 6
[INFO] aiokafka.consumer.group_coordinator: Successfully synced group my_group with generation 6
[INFO] aiokafka.consumer.group_coordinator: Setting newly assigned partitions set() for group my_group
[INFO] aiokafka.consumer.group_coordinator: Setting newly assigned partitions {TopicPartition(topic='input_data', partition=0)} for group my_group
[INFO] aiokafka.consumer.group_coordinator: Setting newly assigned partitions set() for group my_group
INFO: Waiting for application shutdown.
[INFO] aiokafka.consumer.group_coordinator: LeaveGroup request succeeded
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.
INFO: Application shutdown complete.
INFO: Finished server process [2410]
INFO: Shutting down
INFO: Waiting for application shutdown.
[INFO] aiokafka.consumer.group_coordinator: LeaveGroup request succeeded
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.
INFO: Application shutdown complete.
INFO: Finished server process [2411]
INFO: Shutting down
INFO: Waiting for application shutdown.
[INFO] aiokafka.consumer.group_coordinator: LeaveGroup request succeeded
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.
[INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.
INFO: Application shutdown complete.
INFO: Finished server process [2412]
INFO: Stopping parent process [2407]