primed-avro

version number: 0.0.3.8
author: Matthijs van der Kroon

Overview

A python package that provides:

  • A basic Confluent Schema Registry client
  • Confluent compatible Avro encoding and decoding
  • High level KafkaConsumer that decodes Avro messages on the fly

WARNING: Python 2.7 is not supported

Installation / Usage

To install, use pip:

pip install primed_avro

Or clone the repo:

git clone https://gitlab.com/primedio/primed-avro
cd primed-avro
python setup.py install

Example Confluent Schema Registry client

from primed_avro.registry import ConfluentSchemaRegistryClient

csr = ConfluentSchemaRegistryClient(url="your_registry_url")
schemaMeta = csr.get_schema(subject="mytopic")
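For context, a hedged sketch of the payload shape involved: the Confluent Schema Registry serves the latest schema for a subject over REST at `GET {registry_url}/subjects/{subject}/versions/latest`, returning JSON in which the Avro schema itself is an embedded JSON string. This is not primed_avro's code, and the sample values (subject, id, schema) are invented for illustration:

```python
import json

# Hypothetical example of a registry response for a subject's latest version.
sample_response = json.dumps({
    "subject": "mytopic-value",
    "version": 1,
    "id": 42,
    "schema": json.dumps({
        "type": "record",
        "name": "User",
        "fields": [{"name": "name", "type": "string"}],
    }),
})

meta = json.loads(sample_response)
# The "schema" field is itself a JSON-encoded Avro schema and needs a second parse.
schema = json.loads(meta["schema"])
print(meta["id"], schema["name"])  # 42 User
```

A client like `ConfluentSchemaRegistryClient` presumably wraps exactly this lookup and caches the result, which is why `schemaMeta` exposes both an `id` and a `schema`.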

Example Avro en/decoding

from primed_avro.encoder import Encoder
from primed_avro.decoder import Decoder

# schemaMeta comes from the registry client above;
# record is a dict matching the Avro schema
encoder = Encoder(schema=schemaMeta.schema).get()
bytesvalue = encoder.encode(schemaMeta.id, record)

decoder = Decoder(schema=schemaMeta.schema).get()
record = decoder.decode(bytesvalue)
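"Confluent compatible" encoding refers to the Confluent wire format: one magic byte (0x00), the schema ID as a 4-byte big-endian integer, then the Avro-serialized payload. That is why `encode` takes `schemaMeta.id` alongside the record. A minimal, self-contained sketch of just the framing (the Avro payload here is an opaque placeholder, not real Avro data):

```python
import struct

def frame(schema_id: int, avro_payload: bytes) -> bytes:
    # Magic byte 0, then the schema ID big-endian, then the Avro bytes.
    return struct.pack(">bI", 0, schema_id) + avro_payload

def unframe(message: bytes):
    # Split the 5-byte header back off and recover the schema ID.
    magic, schema_id = struct.unpack(">bI", message[:5])
    assert magic == 0, "not a Confluent-framed message"
    return schema_id, message[5:]

framed = frame(42, b"\x08demo")
schema_id, payload = unframe(framed)
print(schema_id, payload)  # 42 b'\x08demo'
```

The schema ID in the header is what lets any consumer fetch the right schema from the registry before decoding the payload.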

Example high-level AvroConsumer

from primed_avro.consumer import AvroConsumer

c = AvroConsumer(
    topic="mytopic",
    bootstrap_servers="localhost:9092",
    registry_url="http://localhost:8081"
)

for msg in c.consume():
    print(type(msg), msg)
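Decoding "on the fly" presumably means that for each raw Kafka message the consumer reads the schema ID out of the 5-byte header, picks a decoder for that schema (fetching it from the registry on a cache miss), and applies it before yielding the record. The sketch below illustrates that dispatch with a stand-in decoder dict; it is not primed_avro's actual internals:

```python
import struct

# Hypothetical decoder cache keyed by schema ID; in a real consumer a cache
# miss would trigger a registry lookup. The decoder here is a stand-in
# callable, not a real Avro decoder.
decoder_cache = {
    42: lambda payload: {"name": payload.decode("utf-8")},
}

def decode_message(raw: bytes) -> dict:
    # Strip the Confluent framing (magic byte + 4-byte schema ID).
    magic, schema_id = struct.unpack(">bI", raw[:5])
    decoder = decoder_cache[schema_id]
    return decoder(raw[5:])

raw = struct.pack(">bI", 0, 42) + b"alice"
print(decode_message(raw))  # {'name': 'alice'}
```

Caching decoders per schema ID matters because a topic can carry multiple schema versions, and re-fetching the schema for every message would hammer the registry.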

Contributing

TBD
