A simple SSE implementation with aiohttp, designed specifically for chatting with LLMs.
aio-sse-chat
An asyncio SSE client module for Python, designed especially for parsing server-sent event (SSE) responses from LLMs.
Modified from aiohttp-sse-client.
Why do you need this?
Typical SSE packages will not return the correct values from a streaming LLM response (unless you escape `\n` as `\\n`), because they do not parse bare newlines in the event data correctly. This module parses such responses correctly and returns the right values.
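To see the problem concretely, here is a minimal sketch (not part of this package) of what a chunk that is just `'\n'` looks like on the wire, assuming the server encodes it as two empty `data:` lines as the SSE specification prescribes, and how a naive parser loses it while a spec-compliant one keeps it:

```python
# A minimal sketch: parsing an SSE event whose payload is a bare newline.
# Assumes the server encoded the chunk '\n' as two empty `data:` lines,
# which is the encoding the SSE specification prescribes.
raw_event = "data:\ndata:\n\n"

lines = [l for l in raw_event.split("\n") if l.startswith("data:")]
payloads = [l[len("data:"):].lstrip() for l in lines]

# Naive parsing: concatenate the data lines directly -> the newline is lost.
naive = "".join(payloads)
print(repr(naive))    # ''

# Spec-compliant parsing: join data-line payloads with '\n' -> newline kept.
correct = "\n".join(payloads)
print(repr(correct))  # '\n'
```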
Also, LLM requests usually need to be submitted as `POST` requests, while most current asyncio SSE modules raise an error on anything other than `GET`. Using a `POST` request to get a streaming response is arguably not best practice, but it simplifies the code a lot.
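For comparison, here is a rough sketch of hand-rolling SSE consumption over a streaming `POST` with plain aiohttp; `stream_manually` is a hypothetical helper, not part of this package, and its naive parsing drops bare newlines, which is exactly the problem described above:

```python
import aiohttp

async def stream_manually(url: str, payload: dict):
    """Hand-rolled SSE consumption over POST; for illustration only."""
    async with aiohttp.ClientSession() as session:
        async with session.post(url, json=payload) as resp:
            async for raw in resp.content:               # one line at a time
                line = raw.decode("utf-8").rstrip("\r\n")
                if line.startswith("data:"):
                    # Naive: empty data lines (bare newlines) come out as ''
                    yield line[len("data:"):].lstrip()
```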
Installation
```
pip install aio-sse-chat
```
Usage
Create your aiohttp session and use `aiosseclient` to wrap the session when making the request.
```python
# fastapi side
import asyncio

from fastapi import FastAPI
from sse_starlette.sse import EventSourceResponse

app = FastAPI()

@app.post('/sse')  # all HTTP methods are supported
async def sse_endpoint(data: dict):
    async def f():
        # stream bare newlines, which many SSE clients fail to parse
        for i in range(10):
            yield '\n'
            await asyncio.sleep(0.2)
    return EventSourceResponse(f())
```
```python
# client side
import asyncio
import aiohttp
from aiossechat import aiosseclient

async def main():
    async for event in aiosseclient(url=some_url, method='post', json=some_data):
        print(event, end='', flush=True)  # a single `'\n'` is received correctly

asyncio.run(main())
```
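Note that the HTTP method and request body are passed as keyword arguments (`method='post'`, `json=some_data`), so the same call pattern works for `GET` and `POST` endpoints alike.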