# Distributed ASGI
Create distributed ASGI applications that pull events from a central Redis queue.
Uses Redis to distribute ASGI messages between worker ASGI apps. Workers can run on different machines; they only need to be able to connect to the central Redis server.
# Usage
Set custom Redis options and key prefixes by subclassing `ASGIRedisProducer`.
```py
# server.py
from distributed_asgi import ASGIRedisProducer

class App(ASGIRedisProducer):
    key_prefix = "MYPREFIX"
    redis_options = {
        "address": "redis://mywebsite.com",
        "password": "abc123"
    }
```
```py
# worker.py
from distributed_asgi import ASGIRedisConsumer

class ASGIApp:
    def __init__(self, scope):
        self.scope = scope

    async def __call__(self, receive, send):
        await send({
            "type": "http.response.start",
            "status": 200
        })
        await send({
            "type": "http.response.body",
            "body": b"Hello World!"
        })

app = ASGIRedisConsumer(
    host="mywebsite.com",
    port=6379,
    password="abc123",
    cls=ASGIApp,
    key_prefix="MYPREFIX"
)

print("Starting worker")
app.run()
```
Once you have `worker.py` and `server.py`, use an ASGI server such as Uvicorn to run `server.py`:
```
$ uvicorn server:App
```
and run `worker.py` as a normal Python script:
```
$ python worker.py
```
ASGI requests received by the `ASGIRedisProducer` are enqueued in Redis and later dequeued by the `ASGIRedisConsumer` worker. It should be possible to replace `ASGIApp` in `worker.py` with your favorite ASGI application framework, for example Quart.
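The producer/consumer hand-off can be pictured with an in-process analogue. The sketch below substitutes `asyncio.Queue` for Redis, and the helper names (`enqueue_scope`, `worker_loop`) are made up for illustration, not part of the distributed-asgi API; it only shows the flow of serializing a scope on one side and running an ASGI app against it on the other.

```py
import asyncio
import json

queue = asyncio.Queue()  # stand-in for the Redis list keyed by the prefix

async def enqueue_scope(scope):
    # Producer side: serialize the request scope and push it onto the queue.
    await queue.put(json.dumps(scope))

async def worker_loop(app_cls):
    # Consumer side: block until a scope arrives, then run the app against it.
    scope = json.loads(await queue.get())
    sent = []

    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}

    async def send(message):
        sent.append(message)

    await app_cls(scope)(receive, send)
    return sent

class ASGIApp:
    # Same example app as in worker.py above.
    def __init__(self, scope):
        self.scope = scope

    async def __call__(self, receive, send):
        await send({"type": "http.response.start", "status": 200})
        await send({"type": "http.response.body", "body": b"Hello World!"})

async def main():
    await enqueue_scope({"type": "http", "method": "GET", "path": "/"})
    return await worker_loop(ASGIApp)

messages = asyncio.run(main())
print(messages)
```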
# Future Plans
* Path-based HTTP router that puts requests into different queues based on path.
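One way such a router could work is to map each path pattern to its own queue key, so separate worker pools can consume different routes. The `ROUTES` table and `queue_key_for` helper below are hypothetical, not part of the current library.

```py
import re

# Hypothetical routing table: path pattern -> queue-key suffix.
ROUTES = [
    (re.compile(r"^/api/"), "API"),
    (re.compile(r"^/static/"), "STATIC"),
]

def queue_key_for(path, prefix="MYPREFIX", default="DEFAULT"):
    """Pick the Redis queue key a request would be pushed onto."""
    for pattern, suffix in ROUTES:
        if pattern.match(path):
            return f"{prefix}:{suffix}"
    return f"{prefix}:{default}"
```

Workers would then be started with the matching `key_prefix`, so that e.g. `/api/` traffic is served by a dedicated worker pool.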