pypelined: stream and pipeline processing service
Project description
Service and framework for creating and running processing pipelines for data streams, events and chunks. pypelined pipelines are composed of individual elements using the chainlet library. They are built in Python configuration files, from custom objects or pre-defined plugins.
import time

import chainlet
from pypelined.conf import pipelines

@chainlet.funclet
def add_time(chunk):
    chunk['tme'] = time.time()
    return chunk

# Socket, decode_json, stop_if and Telegraf are pre-defined pypelined plugins
process_chain = Socket(10331) >> decode_json() >> stop_if(lambda value: value.get('rcode') == 0) >> \
    add_time() >> Telegraf(address=('localhost', 10332), name='chunky')
pipelines.append(process_chain)
Once running, pypelined drives all of its processing pipelines in an event-driven fashion.
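The `>>` composition used above can be sketched in plain Python. The `Element` class below is an illustrative stand-in, not the real chainlet or pypelined API: each element wraps a callable, `>>` links elements into a chain, and `send` pushes a chunk through every element in turn, mirroring how a pipeline element may transform or drop a chunk.

```python
import time

class Element:
    """Illustrative pipeline element; not the real chainlet API."""
    def __init__(self, func):
        self.func = func
        self.next = None

    def __rshift__(self, other):
        # Append `other` at the tail so chains compose left to right,
        # then return the head of the chain.
        tail = self
        while tail.next is not None:
            tail = tail.next
        tail.next = other
        return self

    def send(self, chunk):
        # Push a chunk through this element and all downstream elements.
        result = self.func(chunk)
        if result is None:
            return None  # element dropped the chunk
        if self.next is not None:
            return self.next.send(result)
        return result

def add_time(chunk):
    chunk['time'] = time.time()
    return chunk

# Compose a three-element chain and drive one chunk through it.
pipeline = Element(dict) >> Element(add_time) >> Element(lambda c: c)
out = pipeline.send({'rcode': 1})
```

A real pypelined deployment would instead have the service loop feed chunks into the chain as events arrive, rather than calling `send` by hand.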
Project details
Download files
Source Distribution
pypelined-0.1.5.tar.gz (15.5 kB)
Built Distribution
Hashes for pypelined-0.1.5-py2.py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | bd862f4fd5da8a9180a78bc9522a334e9d95f90ee9f95fd56cbded442b8f549b
MD5 | 9f6bcbde38836cb7ff0697afde6b1a58
BLAKE2b-256 | 7ff5afed6bfd4a9d8246ffdea3cf02caf0c00e0f4cff6f69b22a745b28caac20