Arista Protobuf / Python gRPC bindings generator & library

This was originally forked from https://github.com/danielgtaylor/python-betterproto @ b8a091ae7055dd949d193695a06c9536ad51eea8.

Afterwards, commits up to 1f88b67eeb9871d33da154fd2c859b9d1aed62c1 on python-betterproto have been cherry-picked.

Changes in this project compared with the base project:

  • Renamed to aristaproto.
  • Cut support for Python < 3.9.
  • Updated various CI actions and dependencies.
  • Merged docs from multiple rst files into Markdown.
  • Keep nanosecond precision for Timestamp (see the sketch after this list).
    • Subclass datetime to store the original nanosecond value when converting from Timestamp to datetime.
    • On conversion from the datetime subclass back to Timestamp, the original nanosecond value is restored.
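
For illustration, here is a minimal sketch of that round-trip, assuming the well-known Timestamp class shipped under aristaproto.lib.google.protobuf and its to_datetime/from_datetime helpers:

from aristaproto.lib.google.protobuf import Timestamp

# Sketch only: a Timestamp with sub-microsecond precision survives the
# conversion to datetime and back, as described above.
ts = Timestamp(seconds=1_700_000_000, nanos=123_456_789)
dt = ts.to_datetime()                      # datetime subclass carrying the nanos
assert Timestamp.from_datetime(dt) == ts   # original nanosecond value restored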

Installation

First, install the package. Note that the [compiler] feature flag tells it to install extra dependencies only needed by the protoc plugin:

# Install both the library and compiler
pip install "aristaproto[compiler]"

# Install just the library (to use the generated code output)
pip install aristaproto

Getting Started

Compiling proto files

Given that you have installed the compiler and have a proto file, e.g. example.proto:

syntax = "proto3";

package hello;

// Greeting represents a message you can tell a user.
message Greeting {
  string message = 1;
}

You can run the following to invoke protoc directly:

mkdir lib
protoc -I . --python_aristaproto_out=lib example.proto

or run the following to invoke protoc via grpcio-tools:

pip install grpcio-tools
python -m grpc_tools.protoc -I . --python_aristaproto_out=lib example.proto

This will generate lib/hello/__init__.py which looks like:

# Generated by the protocol buffer compiler.  DO NOT EDIT!
# sources: example.proto
# plugin: python-aristaproto
from dataclasses import dataclass

import aristaproto


@dataclass
class Greeting(aristaproto.Message):
    """Greeting represents a message you can tell a user."""

    message: str = aristaproto.string_field(1)

Now you can use it!

>>> from lib.hello import Greeting
>>> test = Greeting()
>>> test
Greeting(message='')

>>> test.message = "Hey!"
>>> test
Greeting(message="Hey!")

>>> serialized = bytes(test)
>>> serialized
b'\n\x04Hey!'

>>> another = Greeting().parse(serialized)
>>> another
Greeting(message="Hey!")

>>> another.to_dict()
{"message": "Hey!"}
>>> another.to_json(indent=2)
'{\n  "message": "Hey!"\n}'

Async gRPC Support

The generated Protobuf Message classes are compatible with grpclib so you are free to use it if you like. That said, this project also includes support for async gRPC stub generation with better static type checking and code completion support. It is enabled by default.

Given an example service definition:

syntax = "proto3";

package echo;

message EchoRequest {
  string value = 1;
  // Number of extra times to echo
  uint32 extra_times = 2;
}

message EchoResponse {
  repeated string values = 1;
}

message EchoStreamResponse {
  string value = 1;
}

service Echo {
  rpc Echo(EchoRequest) returns (EchoResponse);
  rpc EchoStream(EchoRequest) returns (stream EchoStreamResponse);
}

Generate code for echo.proto:

python -m grpc_tools.protoc -I . --python_aristaproto_out=. echo.proto

A client can be implemented as follows:

import asyncio
import echo

from grpclib.client import Channel


async def main():
    channel = Channel(host="127.0.0.1", port=50051)
    service = echo.EchoStub(channel)
    response = await service.echo(echo.EchoRequest(value="hello", extra_times=1))
    print(response)

    async for response in service.echo_stream(echo.EchoRequest(value="hello", extra_times=1)):
        print(response)

    # don't forget to close the channel when done!
    channel.close()


if __name__ == "__main__":
    asyncio.run(main())

which would output:

EchoResponse(values=['hello', 'hello'])
EchoStreamResponse(value='hello')
EchoStreamResponse(value='hello')

This project also produces server-facing stubs that can be used to implement a Python gRPC server. To use them, simply subclass the base class in the generated files and override the service methods:

import asyncio
from echo import EchoBase, EchoRequest, EchoResponse, EchoStreamResponse
from grpclib.server import Server
from typing import AsyncIterator


class EchoService(EchoBase):
    async def echo(self, echo_request: "EchoRequest") -> "EchoResponse":
        # Echo the value once plus `extra_times` more, matching the output shown above.
        return EchoResponse([echo_request.value for _ in range(echo_request.extra_times + 1)])

    async def echo_stream(self, echo_request: "EchoRequest") -> AsyncIterator["EchoStreamResponse"]:
        for _ in range(echo_request.extra_times + 1):
            yield EchoStreamResponse(echo_request.value)


async def main():
    server = Server([EchoService()])
    await server.start("127.0.0.1", 50051)
    await server.wait_closed()

if __name__ == "__main__":
    asyncio.run(main())

JSON

Both serializing and parsing are supported to/from JSON and Python dictionaries using the following methods:

  • Dicts: Message().to_dict(), Message().from_dict(...)
  • JSON: Message().to_json(), Message().from_json(...)

For compatibility, the default is to convert field names to camelCase. You can control this behavior by passing a casing value, e.g.:

MyMessage().to_dict(casing=aristaproto.Casing.SNAKE)
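
For example, reusing EchoRequest from the Echo example above (the dict and JSON shapes shown in the comments are the expected results, not captured output):

import aristaproto
import echo

req = echo.EchoRequest(value="hello", extra_times=1)

req.to_dict()                                 # {'value': 'hello', 'extraTimes': 1}
req.to_dict(casing=aristaproto.Casing.SNAKE)  # {'value': 'hello', 'extra_times': 1}
req.to_json()                                 # '{"value": "hello", "extraTimes": 1}'

# Parsing accepts the same shapes back:
echo.EchoRequest().from_json('{"value": "hello", "extraTimes": 2}')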

Determining if a message was sent

Sometimes it is useful to be able to determine whether a message has been sent on the wire. This is how the Google wrapper types work to let you know whether a value is unset, set as the default (zero value), or set as something else, for example.

Use aristaproto.serialized_on_wire(message) to determine if it was sent. This is a little bit different from the official Google generated Python code, and it lives outside the generated Message class to prevent name clashes. Note that it only supports Proto 3 and thus can only be used to check if Message fields are set. You cannot check if a scalar was sent on the wire.

# Old way (official Google Protobuf package)
>>> mymessage.HasField('myfield')

# New way (this project)
>>> aristaproto.serialized_on_wire(mymessage.myfield)
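
As a sketch, assume two hypothetical generated messages: Profile, with a message-typed field address, and Address, with a string field street (these names are illustrative and not part of the examples above):

import aristaproto

# Parse a payload in which `address` was never set:
received = Profile().parse(bytes(Profile()))
aristaproto.serialized_on_wire(received.address)   # False: not sent on the wire

# Parse a payload in which `address` was populated:
received = Profile().parse(bytes(Profile(address=Address(street="Main St"))))
aristaproto.serialized_on_wire(received.address)   # True: the nested message was sent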

One-of Support

Protobuf supports grouping fields in a oneof clause. Only one of the fields in the group may be set at a given time. For example, given the proto:

syntax = "proto3";

message Test {
  oneof foo {
    bool on = 1;
    int32 count = 2;
    string name = 3;
  }
}

On Python 3.10 and later, you can use a match statement to access the provided one-of field, which supports type-checking:

test = Test()
match test:
    case Test(on=value):
        print(value)  # value: bool
    case Test(count=value):
        print(value)  # value: int
    case Test(name=value):
        print(value)  # value: str
    case _:
        print("No value provided")

You can also use aristaproto.which_one_of(message, group_name) to determine which of the fields was set. It returns a tuple of the field name and value, or a blank string and None if unset.

>>> test = Test()
>>> aristaproto.which_one_of(test, "foo")
('', None)

>>> test.on = True
>>> aristaproto.which_one_of(test, "foo")
('on', True)

# Setting one member of the group resets the others.
>>> test.count = 57
>>> aristaproto.which_one_of(test, "foo")
('count', 57)

# Default (zero) values also work.
>>> test.name = ""
>>> aristaproto.which_one_of(test, "foo")
('name', '')

Again, this is a little different from the official Google code generator:

# Old way (official Google protobuf package)
>>> message.WhichOneof("group")
'foo'

# New way (this project)
>>> aristaproto.which_one_of(message, "group")
('foo', "foo's value")

Well-Known Google Types

Google provides several well-known message types like a timestamp, duration, and several wrappers used to provide optional zero value support. Each of these has a special JSON representation and is handled a little differently from normal messages. The Python mapping for these is as follows:

Google Message              Python Type                         Default
google.protobuf.duration    datetime.timedelta                  0
google.protobuf.timestamp   Timezone-aware datetime.datetime    1970-01-01T00:00:00Z
google.protobuf.*Value      Optional[...]                       None
google.protobuf.*           aristaproto.lib.google.protobuf.*   None

For the wrapper types, the Python type corresponds to the wrapped type, e.g. google.protobuf.BoolValue becomes Optional[bool] while google.protobuf.Int32Value becomes Optional[int]. All of the optional values default to None, so don't forget to check for that possible state. Given:

syntax = "proto3";

import "google/protobuf/duration.proto";
import "google/protobuf/timestamp.proto";
import "google/protobuf/wrappers.proto";

message Test {
  google.protobuf.BoolValue maybe = 1;
  google.protobuf.Timestamp ts = 2;
  google.protobuf.Duration duration = 3;
}

You can do stuff like:

>>> t = Test().from_dict({"maybe": True, "ts": "2019-01-01T12:00:00Z", "duration": "1.200s"})
>>> t
Test(maybe=True, ts=datetime.datetime(2019, 1, 1, 12, 0, tzinfo=datetime.timezone.utc), duration=datetime.timedelta(seconds=1, microseconds=200000))

>>> t.ts - t.duration
datetime.datetime(2019, 1, 1, 11, 59, 58, 800000, tzinfo=datetime.timezone.utc)

>>> t.ts.isoformat()
'2019-01-01T12:00:00+00:00'

>>> t.maybe = None
>>> t.to_dict()
{'ts': '2019-01-01T12:00:00Z', 'duration': '1.200s'}

Generating Pydantic Models

You can use python-aristaproto to generate Pydantic-based models, using Pydantic dataclasses. This means the results of protobuf unmarshalling will be type checked. The usage is the same, but you need to add a custom option when calling the protobuf compiler:

protoc -I . --python_aristaproto_opt=pydantic_dataclasses --python_aristaproto_out=lib example.proto

The important change is --python_aristaproto_opt=pydantic_dataclasses, which swaps the dataclass implementation from the built-in Python dataclass to the Pydantic dataclass. You must have pydantic as a dependency in your project for this to work.
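
As a rough sketch, assuming example.proto from the Getting Started section was compiled with the pydantic_dataclasses option, the generated class now validates its fields on construction (the exact error output depends on your Pydantic version):

import pydantic

from lib.hello import Greeting

Greeting(message="Hey!")                    # constructs as before

try:
    Greeting(message={"not": "a string"})   # wrong type for a string field
except pydantic.ValidationError as exc:
    print(exc)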

Development

Requirements

  • Python (3.9 or higher)

  • poetry, needed to install dependencies in a virtual environment

  • poethepoet, for running development tasks as defined in pyproject.toml

    • Can be installed into your host environment via pip install poethepoet and then executed simply as poe
    • or run from the poetry venv as poetry run poe

Setup

# Get set up with the virtual env & dependencies
poetry install -E compiler

# Activate the poetry environment
poetry shell

Code style

This project enforces black Python code formatting.

Before committing changes, run:

poe format

Non-black-formatted Python code will fail in CI; consistent formatting also helps avoid merge conflicts later.

Tests

There are two types of tests:

  1. Standard tests
  2. Custom tests

Standard tests

Adding a standard test case is easy.

  • Create a new directory aristaproto/tests/inputs/<name>
    • add <name>.proto with a message called Test
    • add <name>.json with some test data (optional)

It will be picked up automatically when you run the tests.

Custom tests

Custom tests are found in tests/test_*.py and are run with pytest.

Running

Here's how to run the tests.

# Generate assets from sample .proto files required by the tests
poe generate
# Run the tests
poe test

To run the tests as they are run in CI (with tox), run:

poe full-test

(Re)compiling Google Well-known Types

This project includes compiled versions of Google's well-known types at src/aristaproto/lib/google. Be sure to regenerate these files when modifying the plugin output format, and validate by running the tests.

Normally, the plugin does not compile any references to google.protobuf, since they are pre-compiled. To force compilation of google.protobuf, use the option --custom_opt=INCLUDE_GOOGLE.

Assuming your google.protobuf source files (included with all releases of protoc) are located in /usr/local/include, you can regenerate them as follows:

protoc \
    --plugin=protoc-gen-custom=src/aristaproto/plugin/main.py \
    --custom_opt=INCLUDE_GOOGLE \
    --custom_out=src/aristaproto/lib \
    -I /usr/local/include/ \
    /usr/local/include/google/protobuf/*.proto

License

Copyright 2023 Arista Networks

Copyright 2019-2023 Daniel G. Taylor

This software is free to use under the MIT license. See the LICENSE file for license text.
