
Streaming enabled Assistants API

Project description

streaming_assistants

The official OpenAI Assistants API does not yet support streaming (although this functionality has been hinted at since launch back in November). Streaming is critical for a large subset of GenAI use cases, and there has been significant feedback that its absence is the major blocker to adoption of the Assistants API for many. We decided that we (and our users) couldn't wait, so we implemented streaming support in the Astra Assistants API.

How to use

Because streaming is not supported in the API, it also isn't supported by the OpenAI SDKs. Rather than forking the SDK project, we created this shim, which we will maintain at least until the official implementation catches up. Expect a similar project for JS/TS soon.

Install streaming_assistants using your python package manager of choice:

poetry add streaming_assistants
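Or, since the package is published on PyPI, you can install it with pip instead:

```shell
pip install streaming_assistants
```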

Import and patch your client:

from openai import OpenAI
from streaming_assistants import patch

client = patch(
    OpenAI(
        base_url="https://open-assistant-ai.astra.datastax.com/v1",
        api_key=OPENAI_API_KEY,
        default_headers={
            "astra-api-token": ASTRA_DB_APPLICATION_TOKEN,
        }
    )
)

...

After you create your run, its status will move to generating. At that point you can call client.beta.threads.messages.list with stream=True:

import time

run = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=assistant_id,
)

while True:
    run = client.beta.threads.runs.retrieve(
        thread_id=thread.id,
        run_id=run.id,
    )
    if run.status == 'failed':
        raise ValueError("Run is in failed state")
    if run.status in ('completed', 'generating'):
        break
    time.sleep(1)


response = client.beta.threads.messages.list(
    thread_id=thread.id,
    stream=True,
)

Process the streaming response:

for part in response:
    print(part.data[0].content[0].text.value)

Compatibility

We've done our best to anticipate the design OpenAI will eventually release. We also attempted to work in the open on the implementation by sharing this design doc and starting a discussion in OpenAI's openapi repo. A couple of other projects are also interested in how this functionality will officially be supported. See related tickets here:

That said, we had to make some design decisions that may or may not match what OpenAI will do in their official implementation.

As soon as OpenAI releases official streaming support, we will close any compatibility gap as quickly as possible while doing our best to support existing users and avoid breaking changes. This will be a tricky needle to thread, but we believe that giving folks an option today will be worth the trouble tomorrow.
