# dock

A wrapper around Redis for message queues, e.g. a batch job queue for ML inference.

## Installation

```bash
pip install dock                                    # from PyPI
pip install git+https://github.com/vzhong/dock.git  # from GitHub
```

## Usage

First, start your Redis server.
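
For example, if Redis is installed locally, you can start it with the standard CLI (shown here with its default settings; adjust host/port to match your setup):

```bash
redis-server  # starts a local Redis instance on the default port, 6379
```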

```python
# server.py
from dock import Dock

dock = Dock('test')

while True:
    # recv returns the next message from the 'test' queue along with
    # a callback for replying to the client that sent it
    msg, respond = dock.recv()
    print(msg, respond)
    print('got message {}'.format(msg))
    # reply with an acknowledgement
    respond({
        'ack': msg,
        'msg': 'hello',
    })
```

```python
# client.py
from dock import Dock

dock = Dock('test')

for i in range(5):
    # send a message to the 'test' queue and print the server's reply
    answer = dock.send('message{}'.format(i))
    print(answer)
```

You can see how the server and client interact by running the two files:

```bash
python server.py # in one terminal
python client.py # in another terminal
```
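
Assuming the response dict comes back as sent (the exact form depends on how dock serializes replies), the client should print one reply per message, roughly:

```
{'ack': 'message0', 'msg': 'hello'}
{'ack': 'message1', 'msg': 'hello'}
...
```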
