Project description

echostream-node

EchoStream library for implementing remote nodes that can be used in the EchoStream system.

This package supports creating External Nodes and Managed Node Types, and supports the following EchoStream use cases:

  • An External Node in an External App or Cross Account App that is a stand-alone application or part of another application, using either threading or asyncio.
  • An External Node in a Cross Account App that is an AWS Lambda function. This use case only supports threading.
  • A Managed Node Type, using either threading or asyncio.

Installation

Python

pip install echostream-node

AWS Lambda

You may use the publicly provided layer instead of directly installing echostream-node in your Lambda package. This layer includes echostream-node and all of its Python dependencies except those built into the AWS Lambda environment for Python.

The layer ARN is:

arn:aws:lambda:us-east-1:226390263822:layer:echostream-node-{version}:1

where {version} is the version of echostream-node that you want, with . replaced by _. For example, for echostream-node==0.3.7 the layer ARN would be:

arn:aws:lambda:us-east-1:226390263822:layer:echostream-node-0_3_7:1
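
If it is convenient, a small helper (purely illustrative, not part of echostream-node) can build the layer ARN from a version string:

def layer_arn(version: str, region: str = "us-east-1") -> str:
    # The layer name uses the echostream-node version with "." replaced by "_".
    return (
        f"arn:aws:lambda:{region}:226390263822:"
        f"layer:echostream-node-{version.replace('.', '_')}:1"
    )

print(layer_arn("0.3.7"))
# arn:aws:lambda:us-east-1:226390263822:layer:echostream-node-0_3_7:1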

Usage

Threading Application Node

from signal import SIGHUP, SIGINT, SIGTERM, signal, strsignal

from echostream_node import Message
from echostream_node.threading import AppNode


class MyExternalNode(AppNode):

    def handle_received_message(self, *, message: Message, source: str) -> None:
        print(f"Got a message:\n{message.body}")
        self.audit_message(message, source=source)
        
    def signal_handler(self, signum: int, _: object) -> None:
        print(f"{strsignal(signum)} received, shutting down")
        self.stop()

    def start(self) -> None:
        super().start()
        signal(SIGHUP, self.signal_handler)
        signal(SIGINT, self.signal_handler)
        signal(SIGTERM, self.signal_handler)

try:
    my_external_node = MyExternalNode()
    my_external_node.start()
    # Create, send, and audit 100 messages, then wait for the node to shut down.
    for i in range(100):
        message = my_external_node.create_message(str(i))
        my_external_node.send_message(message)
        my_external_node.audit_message(message)
    my_external_node.join()
except Exception:
    print("Error running node")

Asyncio Application Node

import asyncio

import aiorun
from echostream_node import Message
from echostream_node.asyncio import Node

class MyExternalNode(Node):

    async def handle_received_message(self, *, message: Message, source: str) -> None:
        print(f"Got a message:\n{message.body}")
        self.audit_message(message, source=source)


async def main(node: Node) -> None:
    try:
        await node.start()
        for i in range(100):
            message = node.create_message(str(i))
            node.send_message(message)
            node.audit_message(message)
        await node.join()
    except asyncio.CancelledError:
        pass
    except Exception:
        print("Error running node")


if __name__ == "__main__":
    aiorun.run(main(MyExternalNode()), stop_on_unhandled_errors=True, use_uvloop=True)

Cross Account Lambda Node

from echostream_node import Message
from echostream_node.threading import LambdaNode

class MyExternalNode(LambdaNode):
    def handle_received_message(self, *, message: Message, source: str) -> None:
        print(f"Got a message:\n{message.body}")
        self.audit_message(message, source=source)
        
MY_EXTERNAL_NODE = MyExternalNode()

def lambda_handler(event, context):
    MY_EXTERNAL_NODE.handle_event(event)

Concurrent vs Sequential Message Processing

By default, all Nodes created using this package process messages sequentially. This is normally the behavior that you want, as many messaging protocols require guaranteed ordering and therefore sequential processing within your Nodes. If this is the behavior that you require, nothing special is needed to gain it from echostream-node.

However, there are use cases where message ordering is not important but processing speed is. In these cases, you may configure your Node upon creation to concurrently process the messages that it receives.

Making a Threading Application Node Concurrent

If your Node inherits from the echostream_node.threading.AppNode class, you can achieve concurrency using either threading or multi-processing. The former is appropriate if your processing is I/O-bound or your execution platform does not support shared memory (which multi-processing requires). The latter is appropriate if your platform supports shared memory and your processing is CPU-bound.

Creating A Concurrent Application Node Using Threading

This will create an AppNode that uses the provided ThreadPoolExecutor to concurrently process received Messages. You can set the maximum number of workers to fewer than 10, but there is no benefit to setting it higher than 10, since Nodes will only process up to 10 messages at a time.

from concurrent.futures import ThreadPoolExecutor

from echostream_node import Message
from echostream_node.threading import AppNode

class MyExternalNode(AppNode):

    def __init__(self) -> None:
        super().__init__(executor=ThreadPoolExecutor(max_workers=10))

    def handle_received_message(self, *, message: Message, source: str) -> None:
        print(f"Got a message:\n{message.body}")
        self.audit_message(message, source=source)

Creating A Concurrent Application Node Using Multi-Processing

This will create an AppNode that uses the provided ProcessPoolExecutor to concurrently process received Messages. You can set the maximum number of workers to fewer than 10, but there is no benefit to setting it higher than 10, since Nodes will only process up to 10 messages at a time.

from concurrent.futures import ProcessPoolExecutor

from echostream_node import Message
from echostream_node.threading import AppNode

class MyExternalNode(AppNode):

    def __init__(self) -> None:
        super().__init__(executor=ProcessPoolExecutor(max_workers=10))

    def handle_received_message(self, *, message: Message, source: str) -> None:
        print(f"Got a message:\n{message.body}")
        self.audit_message(message, source=source)

Making an Asyncio Application Node Concurrent

If your Node inherits from the echostream_node.asyncio.Node class, you can set the Node to process incoming Messages concurrently. There is no setting for the maximum number of tasks; a task is created for each received Message.

import asyncio

from echostream_node import Message
from echostream_node.asyncio import Node

class MyExternalNode(Node):

    def __init__(self) -> None:
        super().__init__(concurrent_processing=True)

    async def handle_received_message(self, *, message: Message, source: str) -> None:
        print(f"Got a message:\n{message.body}")
        self.audit_message(message, source=source)

Making a Lambda Node Concurrent

The AWS Lambda platform does not support shared memory, so Lambda Nodes only support concurrency via threading. This will create a LambdaNode that uses a ThreadPoolExecutor, sized to your Lambda function's resources, to concurrently process received Messages.

from echostream_node import Message
from echostream_node.threading import LambdaNode

class MyExternalNode(LambdaNode):

    def __init__(self) -> None:
        super().__init__(concurrent_processing=True)

    def handle_received_message(self, *, message: Message, source: str) -> None:
        print(f"Got a message:\n{message.body}")
        self.audit_message(message, source=source)

Lambda Nodes and Partial Success Reporting

When you connect an Edge's SQS Queue to the AWS Lambda function implementing your Lambda Node, you can choose to Report Batch Item Failures. This allows your Lambda Node to report partial success back to the SQS Queue, but it does require that your Lambda Node operate differently.

If you wish to take advantage of this, set report_batch_item_failures when you create your Lambda Node. This can be set even if your Node does not use concurrent processing.

from echostream_node import Message
from echostream_node.threading import LambdaNode

class MyExternalNode(LambdaNode):

    def __init__(self) -> None:
        super().__init__(report_batch_item_failures=True)

    def handle_received_message(self, *, message: Message, source: str) -> None:
        print(f"Got a message:\n{message.body}")
        self.audit_message(message, source=source)
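
With report_batch_item_failures enabled, the Lambda handler itself may also need to return whatever handle_event produces. The following is a minimal sketch only; it assumes (and this is an assumption, not stated above) that handle_event returns the partial-failure response AWS Lambda expects when Report Batch Item Failures is enabled:

MY_EXTERNAL_NODE = MyExternalNode()

def lambda_handler(event, context):
    # Assumption: with report_batch_item_failures=True, handle_event returns a
    # response (e.g. {"batchItemFailures": [...]}) that must be returned to AWS
    # Lambda so that SQS only re-drives the records that failed.
    return MY_EXTERNAL_NODE.handle_event(event)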

