
Fabric Python Message Bus Library


Message Bus Schema and Message Definition

A basic framework providing the Fabric Message Bus schema and messages for inter-actor communication.

Overview

Communication among the various actors in the Fabric Control and Measurement Framework is implemented using Apache Kafka.

Apache Kafka is a distributed system designed for streaming data. It is built to be fault-tolerant, high-throughput, and horizontally scalable, and it supports geographically distributed data streams and stream-processing applications.

Kafka enables an event-driven implementation of the various actors/services; each event is both a fact and a trigger. Following the Single Writer Principle, each Fabric actor produces to exactly one topic and subscribes to the topics of the other actors for communication. Messages are exchanged over Kafka using the Apache Avro data serialization system.

Requirements

  • Python 3.7+
  • confluent-kafka
  • confluent-kafka[avro]

Installation

$ pip3 install .

Usage

This package implements producer/consumer APIs for pushing messages to and reading messages from Kafka with Avro serialization.

Message and Schema

Users are expected to inherit from the IMessage class (message.py), define their own members, and override the to_dict() function. They must also define the corresponding Avro schema for the derived class; this new schema is then used by the producers and consumers.

An example schema for the basic IMessage class is available in schema/message.avsc.
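The pattern above can be sketched as follows. Note that the IMessage stand-in, the StatusMessage class, its fields, and the schema are all illustrative; the real base class lives in message.py and the real schema in schema/message.avsc.

```python
# Illustrative stand-in for the IMessage base class defined in message.py;
# the real class may expose additional members and helpers.
class IMessage:
    def to_dict(self) -> dict:
        raise NotImplementedError

# Hypothetical derived message carrying a status update between actors.
class StatusMessage(IMessage):
    def __init__(self, actor: str, status: str):
        self.actor = actor
        self.status = status

    def to_dict(self) -> dict:
        # Field names must match the Avro schema below.
        return {"actor": self.actor, "status": self.status}

# Corresponding Avro record schema for the derived class, analogous
# to schema/message.avsc; it would be passed to producers/consumers.
STATUS_SCHEMA = """
{
  "type": "record",
  "name": "StatusMessage",
  "fields": [
    {"name": "actor", "type": "string"},
    {"name": "status", "type": "string"}
  ]
}
"""
```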

Producers

The AvroProducerApi class implements the base functionality of an Avro Kafka producer. Users are expected to inherit from this class and override the delivery_report method to handle message delivery for asynchronous produce calls.

An example of its usage is available at the end of producer.py.
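A minimal sketch of the producer override pattern. The base class below is a stand-in for AvroProducerApi (the real one, in producer.py, wraps confluent-kafka and produces asynchronously), and the produce method name is an assumption; only delivery_report comes from the description above.

```python
# Illustrative stand-in for AvroProducerApi (producer.py). The real class
# serializes messages via Avro and produces to Kafka asynchronously; here
# the delivery callback is invoked directly just to show the control flow.
class AvroProducerApi:
    def produce(self, topic: str, message: dict):
        # Real implementation: enqueue for async delivery, then Kafka
        # invokes delivery_report once per message. Method name assumed.
        self.delivery_report(None, message)

    def delivery_report(self, err, msg):
        raise NotImplementedError

class MyProducer(AvroProducerApi):
    def __init__(self):
        self.delivered = []

    def delivery_report(self, err, msg):
        # Override point: called once per message on success or failure.
        if err is not None:
            print(f"Delivery failed: {err}")
        else:
            self.delivered.append(msg)
```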

Consumers

The AvroConsumerApi class implements the base functionality of an Avro Kafka consumer. Users are expected to inherit from this class and override the process_message method to handle each incoming message.

An example of its usage is available at the end of consumer.py.
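The consumer side mirrors the producer pattern. The base class below is a stand-in for AvroConsumerApi (the real one, in consumer.py, polls Kafka and deserializes each Avro message), and the dispatch method name is an assumption; only process_message comes from the description above.

```python
# Illustrative stand-in for AvroConsumerApi (consumer.py). The real class
# runs a poll loop, deserializes each Avro message, and hands the decoded
# payload to the user-supplied handler.
class AvroConsumerApi:
    def dispatch(self, message: dict):
        # Stand-in for one iteration of the consume loop (name assumed).
        self.process_message(message)

    def process_message(self, message: dict):
        raise NotImplementedError

class MyConsumer(AvroConsumerApi):
    def __init__(self):
        self.seen = []

    def process_message(self, message: dict):
        # Override point: application logic for each incoming message.
        self.seen.append(message)
```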

Admin API

The AdminApi class supports basic admin functions such as creating and deleting topics and partitions.
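A sketch of how such an admin helper might be used. This is a self-contained stand-in, not the real AdminApi: the actual class delegates to confluent-kafka against a live cluster, and every method name and parameter below is an assumption.

```python
# Illustrative stand-in for AdminApi. The real class issues requests to a
# Kafka cluster; this in-memory version only demonstrates the intended
# create/delete/list usage pattern (all names here are assumptions).
class AdminApi:
    def __init__(self):
        self._topics = {}

    def create_topic(self, name: str, num_partitions: int = 1):
        # Real implementation would submit a create-topics request.
        self._topics[name] = num_partitions

    def delete_topic(self, name: str):
        # Real implementation would submit a delete-topics request.
        self._topics.pop(name, None)

    def list_topics(self) -> list:
        return sorted(self._topics)
```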

How to bring up a test Kafka cluster

Generate Credentials

You must generate CA certificates (or use your own if you already have them) and then generate a keystore and truststore for the brokers and clients.

cd $(pwd)/secrets
./create-certs.sh
(Type yes for all "Trust this certificate? [no]:" prompts.)
cd -

Set the environment variable for the secrets directory. This is used in later commands. Make sure that you are in the MessageBus directory.

export KAFKA_SSL_SECRETS_DIR=$(pwd)/secrets

Bring up the containers

You can use the docker-compose.yaml file to bring up a simple Kafka cluster containing

  • broker
  • zookeeper
  • schema registry

Use the command below to bring up the cluster:

docker-compose up -d

This should bring up the following containers:

docker ps
CONTAINER ID        IMAGE                                    COMMAND                  CREATED             STATUS              PORTS                                                                                        NAMES
189ba0e70b97        confluentinc/cp-schema-registry:latest   "/etc/confluent/dock…"   58 seconds ago      Up 58 seconds       0.0.0.0:8081->8081/tcp                                                                       schemaregistry
49616f1c9b0a        confluentinc/cp-kafka:latest             "/etc/confluent/dock…"   59 seconds ago      Up 58 seconds       0.0.0.0:9092->9092/tcp, 0.0.0.0:19092->19092/tcp                                             broker1
c9d19c82558d        confluentinc/cp-zookeeper:latest         "/etc/confluent/dock…"   59 seconds ago      Up 59 seconds       2888/tcp, 0.0.0.0:2181->2181/tcp, 3888/tcp                                                   zookeeper


