Streaming enabled Assistants API
The official OpenAI Assistants API does not yet support streaming (although this functionality has been hinted at since launch back in November). Streaming is critical for a large subset of GenAI use cases, and there has been significant feedback that its absence is the major blocker to adoption of the Assistants API for many users. We decided that we (and our users) couldn't wait, so we implemented streaming support in the Astra Assistants API.
How to use
Because streaming is not supported in the API, it also isn't supported by the OpenAI SDKs. Rather than forking the SDK project, we created this shim, which we will maintain at least until the official implementation catches up. Expect a similar project for JS/TS soon.
Install streaming_assistants using your Python package manager of choice:
poetry add streaming_assistants
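If you don't use Poetry, the package can be installed under the same name with pip:

```shell
pip install streaming_assistants
```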
Import and patch your client:

```python
import os

from openai import OpenAI
from streaming_assistants import patch

# Credentials are read from environment variables here; set
# OPENAI_API_KEY and ASTRA_DB_APPLICATION_TOKEN before running.
client = patch(
    OpenAI(
        base_url="https://open-assistant-ai.astra.datastax.com/v1",
        api_key=os.environ["OPENAI_API_KEY"],
        default_headers={
            "astra-api-token": os.environ["ASTRA_DB_APPLICATION_TOKEN"],
        },
    )
)
```
...
After you create your run, its status will go to `generating`. At this point you can call `client.beta.threads.messages.list` with `stream=True`:
```python
import time

run = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=assistant_id,
)
while True:
    run = client.beta.threads.runs.retrieve(
        thread_id=thread.id,
        run_id=run.id,
    )
    if run.status == "failed":
        raise ValueError("Run is in failed state")
    if run.status in ("completed", "generating"):
        break
    time.sleep(1)

response = client.beta.threads.messages.list(
    thread_id=thread.id,
    stream=True,
)
```
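The polling loop can be factored into a small reusable helper with a timeout. This is an illustrative sketch, not part of the SDK: `wait_for_run` is our own name, and it is demoed below with a stub callable instead of a live client.

```python
import time
from types import SimpleNamespace


def wait_for_run(retrieve, poll_interval=1.0, timeout=60.0):
    """Poll `retrieve()` until the run reaches a terminal or streamable state.

    `retrieve` is any zero-argument callable returning an object with a
    `.status` attribute (e.g. lambda: client.beta.threads.runs.retrieve(...)).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        run = retrieve()
        if run.status == "failed":
            raise ValueError("Run is in failed state")
        if run.status in ("completed", "generating"):
            return run
        time.sleep(poll_interval)
    raise TimeoutError("Run did not finish before timeout")


# Demo with a stub whose status becomes 'generating' on the third call.
statuses = iter(["queued", "in_progress", "generating"])
run = wait_for_run(lambda: SimpleNamespace(status=next(statuses)), poll_interval=0.01)
print(run.status)  # generating
```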
Process the streaming response:

```python
for part in response:
    print(part.data[0].content[0].text.value)
```
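If you want the full message rather than printing chunks as they arrive, the parts can be accumulated into one string. The sketch below mirrors the `part.data[0].content[0].text.value` shape shown above using mocked stand-ins (`make_part` and `accumulate` are our own illustrative helpers, not SDK functions), since no live API call is made here.

```python
from types import SimpleNamespace


def make_part(text):
    # Mock a streamed part with the same attribute path used above:
    # part.data[0].content[0].text.value
    return SimpleNamespace(
        data=[SimpleNamespace(content=[SimpleNamespace(text=SimpleNamespace(value=text))])]
    )


def accumulate(parts):
    """Join the text of each streamed part into the full message."""
    return "".join(p.data[0].content[0].text.value for p in parts)


parts = [make_part("Hello, "), make_part("world!")]
print(accumulate(parts))  # Hello, world!
```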
Compatibility
We've done our best to come up with the likeliest design for what OpenAI will release. We also attempted to work in the open on the implementation by sharing this design doc and starting a discussion in OpenAI's openapi repo. A couple of other projects are also interested in how this functionality will officially be supported. See the related tickets here:
That said, we had to make some design decisions that may or may not match what OpenAI will do in their official implementation.
Once OpenAI releases official streaming support, we will close the compatibility gap as quickly as possible while doing our best to support existing users and avoid breaking changes. This will be a tricky needle to thread, but we believe that giving folks an option today will be worth the trouble tomorrow.
Hashes for streaming_assistants-0.3.0.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | 562f54602f4ae3726902fc1d6d3e3afe5f5c1756ea9e754348cf833cf0e0ff08 |
| MD5 | e5007d3dedd4f7cd8b98c7c383e86908 |
| BLAKE2b-256 | 6811315db7366822247f3ab1e99fed43b9a7e230e521aa1011d060a37618cbb9 |
Hashes for streaming_assistants-0.3.0-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | d22c140b13bdcb2a08ce9d4c7b5b2901a6e1915cda0ecd11b53e5f9d31e227da |
| MD5 | 7d2910b69bc61ad5ebbcfb54736d06db |
| BLAKE2b-256 | a3791c99341548fd221d7bad18420c32ebfb1bb406b3f5bcd9e3bc7f70ac089a |