A simple SSE implementation with aiohttp, designed specifically for chatting with LLMs.
# aio-sse-chat
A Python async (aio) SSE client module, built for parsing server-sent events (SSE) responses from LLMs.
Modified from aiohttp-sse-client.
## Why is this needed?
Typical SSE packages will not recover the correct value from a streaming LLM response (unless you have escaped `\n` to `\\n` on the server side), because they do not parse multi-line data fields correctly. This module parses the response correctly and returns the exact value.
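To illustrate the failure mode, here is a sketch (the encode/decode helpers below are illustrative, not this package's actual code): per the SSE spec, a payload containing a newline is sent as multiple `data:` lines, which the client must re-join with `\n`; a client that simply concatenates them silently drops the newline.

```python
# Illustrative sketch of the failure mode -- not aio-sse-chat's implementation.

def encode_sse(payload: str) -> str:
    # A server emits one "data:" field line per line of the payload,
    # then a blank line to terminate the event.
    return ''.join(f'data: {line}\n' for line in payload.split('\n')) + '\n'

def data_fields(event: str) -> list:
    # Extract the raw values of the "data:" field lines of one event.
    return [line[len('data: '):] for line in event.split('\n')
            if line.startswith('data: ')]

def decode_per_spec(event: str) -> str:
    # The SSE spec says: join multiple data lines with '\n'.
    return '\n'.join(data_fields(event))

def decode_naive(event: str) -> str:
    # A naive client concatenates the values and loses the newline.
    return ''.join(data_fields(event))

event = encode_sse('\n')   # an LLM token that is just a newline
decode_per_spec(event)     # '\n'  -- correct
decode_naive(event)        # ''    -- the newline is silently dropped
```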
Also, LLM requests usually need to be submitted as `POST` requests, while most current aio SSE modules raise an error on anything but `GET`. Although using a `POST` request to get a streaming response is not best practice, it simplifies the code a lot.
## Installation

```shell
pip install aio-sse-chat
```
## Usage

Create an aiohttp session and use `aiosseclient` to wrap it when making the request.
```python
# fastapi side
import asyncio

from fastapi import FastAPI
from sse_starlette.sse import EventSourceResponse

app = FastAPI()

@app.post('/sse')  # all HTTP methods are supported
async def sse_endpoint(data: dict):
    async def f():
        for i in range(10):
            yield '\n'
            await asyncio.sleep(0.2)
    return EventSourceResponse(f())
```

```python
# client side
import aiohttp
from aiossechat import aiosseclient

async for event in aiosseclient(url=some_url, method='post', json=some_data):
    print(event, end='', flush=True)  # receives a single '\n' correctly
```
## File details

Details for the file `aiossechat-0.0.4-py3-none-any.whl`.

### File metadata

- Download URL: aiossechat-0.0.4-py3-none-any.whl
- Upload date:
- Size: 5.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.11.6
### File hashes

Algorithm | Hash digest
---|---
SHA256 | bdcca767327dfc690ab94d2c01c4466ac1b2dd5b239b0d2bc0c32287a149a715
MD5 | 57bfed8fe541ef38e1b5eec9fd96d471
BLAKE2b-256 | 0f3803c1ccfeec85b84544f9441f415a22eeff91430789ae9a28ba44ad1f6d11