
Microsoft Azure Schema Registry Avro Serializer Client Library for Python

Project description

Azure Schema Registry Avro Serializer client library for Python

Azure Schema Registry is a schema repository service hosted by Azure Event Hubs, providing schema storage, versioning, and management. This package provides an Avro serializer capable of serializing and deserializing payloads containing Schema Registry schema identifiers and Avro-encoded data.

Source code | Package (PyPI) | API reference documentation | Samples | Changelog

Disclaimer

Azure SDK Python packages' support for Python 2.7 ends on 01 January 2022. For more information and questions, please refer to https://github.com/Azure/azure-sdk-for-python/issues/20691

Getting started

Install the package

Install the Azure Schema Registry Avro Serializer client library and Azure Identity client library for Python with pip:

pip install azure-schemaregistry-avroserializer azure-identity

Prerequisites

To use this package, you must have:

  • An Azure subscription.
  • Python 2.7 or Python 3.6 or later (see the Python 2.7 disclaimer above).
  • An Azure Schema Registry, created within an Azure Event Hubs namespace.

Authenticate the client

Interaction with the Schema Registry Avro Serializer starts with an instance of the AvroSerializer class, which takes the schema group name and a SchemaRegistryClient instance. The SchemaRegistryClient constructor takes the Event Hubs fully qualified namespace and an Azure Active Directory credential:

  • The fully qualified namespace of the Schema Registry instance should follow the format: <yournamespace>.servicebus.windows.net.

  • An AAD credential that implements the TokenCredential protocol should be passed to the constructor. There are implementations of the TokenCredential protocol available in the azure-identity package. To use the credential types provided by azure-identity, please install the Azure Identity client library for Python with pip:

pip install azure-identity

  • Additionally, to use the async API supported on Python 3.6+, you must first install an async transport, such as aiohttp:

pip install aiohttp

Create AvroSerializer using the azure-schemaregistry library:

from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.serializer.avroserializer import AvroSerializer
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
# Namespace should be similar to: '<your-eventhub-namespace>.servicebus.windows.net'
fully_qualified_namespace = '<< FULLY QUALIFIED NAMESPACE OF THE SCHEMA REGISTRY >>'
group_name = '<< GROUP NAME OF THE SCHEMA >>'
schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, credential)
serializer = AvroSerializer(client=schema_registry_client, group_name=group_name)
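
If you are using the async API (see the aiohttp note above), the same pattern applies with the async clients. A minimal sketch, assuming the aio module paths listed in the release history below and that the async serializer mirrors the synchronous interface:

import asyncio

from azure.identity.aio import DefaultAzureCredential
from azure.schemaregistry.aio import SchemaRegistryClient
from azure.schemaregistry.serializer.avroserializer.aio import AvroSerializer

async def main():
    credential = DefaultAzureCredential()
    fully_qualified_namespace = '<< FULLY QUALIFIED NAMESPACE OF THE SCHEMA REGISTRY >>'
    group_name = '<< GROUP NAME OF THE SCHEMA >>'

    schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, credential)
    serializer = AvroSerializer(client=schema_registry_client, group_name=group_name)

    # Assumption: the async clients support "async with" for cleanup, as their sync
    # counterparts support "with"; serialize/deserialize calls are awaited, e.g.
    # encoded_bytes = await serializer.serialize(dict_data, schema=definition)
    async with credential, schema_registry_client, serializer:
        pass

asyncio.run(main())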

Key concepts

AvroSerializer

Provides an API to serialize to and deserialize from Avro binary encoding, prefixed with a header that carries the schema ID. Uses SchemaRegistryClient to get schema IDs from schema content, or vice versa.

Message format

The same format is used by schema registry serializers across Azure SDK languages.

Messages are encoded as follows:

  • 4 bytes: Format Indicator

    • Currently always zero, to indicate the format below.
  • 32 bytes: Schema ID

    • UTF-8 hexadecimal representation of GUID.
    • 32 hex digits, no hyphens.
    • Same format and byte order as string from Schema Registry service.
  • Remaining bytes: Avro payload (in general, format-specific payload)

    • Avro Binary Encoding
    • NOT the Avro Object Container File format, which embeds the schema and so defeats the purpose of this serializer: moving the schema out of the message payload and into the schema registry.
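
For illustration, a message in this format can be split into its three parts with plain byte slicing. The split_payload helper below is hypothetical, not part of the library, and only mirrors the layout described above:

def split_payload(encoded_bytes):
    """Split a serialized message into format indicator, schema ID, and Avro body.

    Hypothetical helper for illustration only; AvroSerializer handles this internally.
    """
    format_indicator = encoded_bytes[0:4]              # 4 bytes, currently all zeros
    schema_id = encoded_bytes[4:36].decode("utf-8")    # 32 hex digits of the schema GUID, no hyphens
    avro_body = encoded_bytes[36:]                     # Avro Binary Encoding, schema not included
    return format_indicator, schema_id, avro_body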

Examples

The following sections provide several code snippets covering some of the most common Schema Registry tasks, including serialization, deserialization, and Event Hubs send/receive integration.

Serialization

Use the AvroSerializer.serialize method to serialize dict data with the given Avro schema. The method uses a schema previously registered to the Schema Registry service and keeps the schema cached for future serialization. You can also skip pre-registering the schema and have serialize register it automatically by instantiating the AvroSerializer with the keyword argument auto_register_schemas=True.

import os
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.serializer.avroserializer import AvroSerializer
from azure.identity import DefaultAzureCredential

token_credential = DefaultAzureCredential()
fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
group_name = "<your-group-name>"
name = "example.avro.User"
format = "Avro"

definition = """
{"namespace": "example.avro",
 "type": "record",
 "name": "User",
 "fields": [
     {"name": "name", "type": "string"},
     {"name": "favorite_number",  "type": ["int", "null"]},
     {"name": "favorite_color", "type": ["string", "null"]}
 ]
}"""

schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential)
schema_registry_client.register_schema(group_name, name, definition, format)
serializer = AvroSerializer(client=schema_registry_client, group_name=group_name)

with serializer:
    dict_data = {"name": "Ben", "favorite_number": 7, "favorite_color": "red"}
    encoded_bytes = serializer.serialize(dict_data, schema=definition)

Deserialization

Use the AvroSerializer.deserialize method to deserialize raw bytes into dict data. The method automatically retrieves the schema from the Schema Registry service and keeps it cached for future deserialization.

import os
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.serializer.avroserializer import AvroSerializer
from azure.identity import DefaultAzureCredential

token_credential = DefaultAzureCredential()
fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
group_name = "<your-group-name>"

schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential)
serializer = AvroSerializer(client=schema_registry_client, group_name=group_name)

with serializer:
    encoded_bytes = b'<data_encoded_by_azure_schema_registry_avro_serializer>'
    decoded_data = serializer.deserialize(encoded_bytes)

Event Hubs Sending Integration

Integration with Event Hubs to send serialized Avro dict data as the body of an EventData message.

import os
from azure.eventhub import EventHubProducerClient, EventData
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.serializer.avroserializer import AvroSerializer
from azure.identity import DefaultAzureCredential

token_credential = DefaultAzureCredential()
fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
group_name = "<your-group-name>"
eventhub_connection_str = os.environ['EVENT_HUB_CONN_STR']
eventhub_name = os.environ['EVENT_HUB_NAME']

definition = """
{"namespace": "example.avro",
 "type": "record",
 "name": "User",
 "fields": [
     {"name": "name", "type": "string"},
     {"name": "favorite_number",  "type": ["int", "null"]},
     {"name": "favorite_color", "type": ["string", "null"]}
 ]
}"""

schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential)
avro_serializer = AvroSerializer(client=schema_registry_client, group_name=group_name, auto_register_schemas=True)

eventhub_producer = EventHubProducerClient.from_connection_string(
    conn_str=eventhub_connection_str,
    eventhub_name=eventhub_name
)

with eventhub_producer, avro_serializer:
    event_data_batch = eventhub_producer.create_batch()
    dict_data = {"name": "Bob", "favorite_number": 7, "favorite_color": "red"}
    payload_bytes = avro_serializer.serialize(dict_data, schema=definition)
    event_data_batch.add(EventData(body=payload_bytes))
    eventhub_producer.send_batch(event_data_batch)

Event Hubs Receiving Integration

Integration with Event Hubs to receive EventData and deserialize the raw bytes into Avro dict data.

import os
from azure.eventhub import EventHubConsumerClient
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.serializer.avroserializer import AvroSerializer
from azure.identity import DefaultAzureCredential

token_credential = DefaultAzureCredential()
fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE']
group_name = "<your-group-name>"
eventhub_connection_str = os.environ['EVENT_HUB_CONN_STR']
eventhub_name = os.environ['EVENT_HUB_NAME']

schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential)
avro_serializer = AvroSerializer(client=schema_registry_client, group_name=group_name)

eventhub_consumer = EventHubConsumerClient.from_connection_string(
    conn_str=eventhub_connection_str,
    consumer_group='$Default',
    eventhub_name=eventhub_name,
)

def on_event(partition_context, event):
    bytes_payload = b"".join(b for b in event.body)
    deserialized_data = avro_serializer.deserialize(bytes_payload)

with eventhub_consumer, avro_serializer:
    eventhub_consumer.receive(on_event=on_event, starting_position="-1")

Troubleshooting

General

Azure Schema Registry Avro Serializer raises exceptions defined in Azure Core.
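
For example, service errors surface as azure.core exceptions, and, as of 1.0.0b4, schema parsing and serialization errors are raised from the serializer's exceptions module (see the release history below). A hedged sketch, reusing serializer, dict_data, and definition from the serialization example above:

from azure.core.exceptions import HttpResponseError
from azure.schemaregistry.serializer.avroserializer.exceptions import (
    SchemaParseError,
    SchemaSerializationError,
)

try:
    encoded_bytes = serializer.serialize(dict_data, schema=definition)
except (SchemaParseError, SchemaSerializationError) as error:
    # Raised when the Avro schema cannot be parsed or the value cannot be encoded.
    print("Serialization failed:", error)
except HttpResponseError as error:
    # Raised by the underlying Schema Registry service call (e.g. auth or missing schema).
    print("Service call failed:", error)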

Logging

This library uses the standard logging library for logging. Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level.

Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the logging_enable argument:

import sys
import logging
from azure.schemaregistry import SchemaRegistryClient
from azure.schemaregistry.serializer.avroserializer import AvroSerializer
from azure.identity import DefaultAzureCredential

# Create a logger for the SDK
logger = logging.getLogger('azure.schemaregistry')
logger.setLevel(logging.DEBUG)

# Configure a console output
handler = logging.StreamHandler(stream=sys.stdout)
logger.addHandler(handler)

credential = DefaultAzureCredential()
schema_registry_client = SchemaRegistryClient("<your-fully_qualified_namespace>", credential, logging_enable=True)
# This client will log detailed information about its HTTP sessions, at DEBUG level
serializer = AvroSerializer(client=schema_registry_client, group_name="<your-group-name>")

Similarly, logging_enable can enable detailed logging for a single operation, even when it isn't enabled for the client:

serializer.serialize(dict_data, schema=schema_definition, logging_enable=True)

Next steps

More sample code

Please find further examples in the samples directory demonstrating common Azure Schema Registry Avro Serializer scenarios.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Release History

1.0.0b4 (2021-11-11)

Features Added

  • Async version of AvroSerializer has been added under azure.schemaregistry.serializer.avroserializer.aio.
  • Depends on azure-schemaregistry>=1.0.0,<2.0.0.

Breaking Changes

  • SchemaParseError, SchemaSerializationError, and SchemaDeserializationError have been introduced under azure.schemaregistry.serializer.avroserializer.exceptions and will be raised for corresponding operations.
    • SchemaParseError and SchemaSerializationError may be raised for errors when calling serialize on AvroSerializer.
    • SchemaParseError and SchemaDeserializationError may be raised for errors when calling deserialize on AvroSerializer.

1.0.0b3 (2021-10-06)

Features Added

  • auto_register_schemas keyword argument has been added to AvroSerializer; when set to True, schemas passed to serialize are automatically registered. Defaults to False.
  • value parameter in serialize on AvroSerializer takes type Mapping rather than Dict.
  • Depends on azure-schemaregistry==1.0.0b3.

Breaking Changes

  • SchemaRegistryAvroSerializer has been renamed AvroSerializer.
  • schema_registry parameter in the AvroSerializer constructor has been renamed client.
  • schema_group parameter in the AvroSerializer constructor has been renamed group_name.
  • data parameter in the serialize and deserialize methods on AvroSerializer has been renamed value.
  • schema parameter in the serialize method on AvroSerializer no longer accepts argument of type bytes.
  • AvroSerializer constructor no longer takes in the codec keyword argument.
  • The following positional arguments are now required keyword arguments:
    • client and group_name in AvroSerializer constructor
    • schema in serialize on AvroSerializer

1.0.0b2 (2021-08-18)

This version and all future versions will require Python 2.7 or Python 3.6+; Python 3.5 is no longer supported.

Features Added

  • Depends on azure-schemaregistry==1.0.0b2 which supports client-level caching.

1.0.0b1 (2020-09-09)

Version 1.0.0b1 is the first preview of our efforts to create a user-friendly and Pythonic client library for Azure Schema Registry Avro Serializer.

New features

  • SchemaRegistryAvroSerializer is the top-level client class providing the functionality to encode and decode Avro data using the avro library. It automatically registers schemas with, and retrieves schemas from, the Azure Schema Registry service. It provides two methods:
    • serialize: Serialize dict data into bytes according to the given schema and register schema if needed.
    • deserialize: Deserialize bytes data into dict data by automatically retrieving schema from the service.
