
eventkit: Event-driven data pipelines

Project description


Introduction

The primary use cases of eventkit are

  • to send events between loosely coupled components;

  • to compose all kinds of event-driven data pipelines.

The interface is kept as Pythonic as possible, using familiar names from Python and its libraries where applicable. Scheduling is done with asyncio, and the integration with it is seamless.
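
For instance, any pipeline can be awaited directly from inside a coroutine. A minimal sketch, assuming the Sequence, map and list operators shown in the examples below:

import asyncio
import eventkit as ev

async def main():
    # Build a small pipeline and await its collected output.
    event = ev.Sequence('abc').map(str.upper)
    print(await event.list())  # expected: ['A', 'B', 'C']

asyncio.get_event_loop().run_until_complete(main())  # in Jupyter: await main()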

See the examples and the introduction notebook to get a true feel for the possibilities.

Installation

pip3 install eventkit

Python version 3.6 or higher is required.

Examples

Create an event and connect two listeners

import eventkit as ev

def f(a, b):
    print(a * b)

def g(a, b):
    print(a / b)

event = ev.Event()
event += f           # connect the first listener
event += g           # connect the second listener
event.emit(10, 5)    # both listeners fire: prints 50, then 2.0
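
Listeners can also be detached again. A minimal sketch, assuming the -= operator is the disconnecting counterpart of +=:

import eventkit as ev

def f(a, b):
    print(a * b)

event = ev.Event()
event += f
event.emit(3, 4)   # prints 12
event -= f         # detach the listener again (assumed counterpart of +=)
event.emit(3, 4)   # nothing is printed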

Create a simple pipeline

import eventkit as ev

event = (
    ev.Sequence('abcde')
    .map(str.upper)
    .enumerate()
)

print(event.run())  # in Jupyter: await event.list()

Output:

[(0, 'A'), (1, 'B'), (2, 'C'), (3, 'D'), (4, 'E')]

Create a pipeline to get a running average and standard deviation

import random
import eventkit as ev

source = ev.Range(1000).map(lambda i: random.gauss(0, 1))

# Collect the samples into a rolling array of 500 values, apply the mean and
# standard-deviation operators to it, and zip their outputs into pairs.
event = source.array(500)[ev.ArrayMean, ev.ArrayStd].zip()

print(event.last().run())  # in Jupyter: await event.last()

Output:

[(0.00790957852672618, 1.0345673260655333)]

Combine async iterators together

import asyncio
import eventkit as ev

async def ait(r):
    for i in r:
        await asyncio.sleep(0.1)
        yield i

async def main():
    async for t in ev.Zip(ait('XYZ'), ait('123')):
        print(t)

asyncio.get_event_loop().run_until_complete(main())  # in Jupyter: await main()

Output:

('X', '1')
('Y', '2')
('Z', '3')

Real-time video analysis pipeline

self.video = VideoStream(conf.CAM_ID)               # camera frames as an event source
scene = self.video | FaceTracker | SceneAnalyzer    # pipe frames through the analysis stages
lastScene = scene.aiter(skip_to_last=True)          # iterate over the most recent result only
async for frame, persons in lastScene:
    ...

Full source code
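
The same pattern of consuming an event as an async iterator works for any pipeline. A minimal, self-contained sketch using only the operators from the earlier examples:

import asyncio
import eventkit as ev

async def main():
    pipeline = ev.Sequence('abc').map(str.upper).enumerate()
    # Events are async iterables, so results can be consumed as they arrive.
    async for item in pipeline:
        print(item)  # (0, 'A'), then (1, 'B'), then (2, 'C')

asyncio.get_event_loop().run_until_complete(main())  # in Jupyter: await main()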

Distributed computing

The distex library provides a poolmap extension method to put multiple cores or machines to use:

from distex import Pool
import eventkit as ev
import bz2

pool = Pool()
# await pool  # uncomment in Jupyter
data = [b'A' * 1000000] * 1000

pipe = ev.Sequence(data).poolmap(pool, bz2.compress).map(len).mean().last()

print(pipe.run())  # in Jupyter: print(await pipe)
pool.shutdown()


Documentation

The complete API documentation.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

eventkit-0.8.8.tar.gz (27.2 kB)

Built Distribution

eventkit-0.8.8-py3-none-any.whl (31.3 kB)

File details

Details for the file eventkit-0.8.8.tar.gz.

File metadata

  • Download URL: eventkit-0.8.8.tar.gz
  • Upload date:
  • Size: 27.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.25.1 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.50.0 CPython/3.9.5

File hashes

Hashes for eventkit-0.8.8.tar.gz:

  • SHA256: b20d647f7124ea5709dcf89b165e232247c1f148bd17979780c0c9ca431f8d3f
  • MD5: f37ef583b5d6f4471483db31eb52d351
  • BLAKE2b-256: f9d0215e50476fda5ac7183be3fa792bf97babccba9635d59575850da5eed039

See more details on using hashes here.
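
As an illustration, the SHA256 digest listed above can be checked against a downloaded copy with Python's standard hashlib module; the local file path here is assumed:

import hashlib

# Assumed local path of the downloaded source distribution.
path = "eventkit-0.8.8.tar.gz"
expected = "b20d647f7124ea5709dcf89b165e232247c1f148bd17979780c0c9ca431f8d3f"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "hash mismatch")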

File details

Details for the file eventkit-0.8.8-py3-none-any.whl.

File metadata

  • Download URL: eventkit-0.8.8-py3-none-any.whl
  • Upload date:
  • Size: 31.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.25.1 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.50.0 CPython/3.9.5

File hashes

Hashes for eventkit-0.8.8-py3-none-any.whl:

  • SHA256: 8afd9f576fad7b4aebd29e940273ea8faf48c99f055dee4d2fac67d02c07e74a
  • MD5: bc61f1abe9d5148f1b684d20ed06c7e0
  • BLAKE2b-256: 00059b2fe1851fd2c168f8d3ab11259a0d2303731a7c9452f6c9fd4401b989d6

See more details on using hashes here.
