Thesys GenUI utilities

Thesys GenUI SDK

⚡ Building applications with generative UI ⚡

Quick Install

pip install thesys_genui_sdk

🤔 What is this?

Thesys is a developer platform for building generative UI applications. There are two major components to building your own generative UI application:

  • API - Thesys C1 API is our flagship Generative UI API that leverages the power of LLMs to generate UI on the fly.

  • Frontend - GenUI SDK is our React framework that integrates well with the C1 API and allows you to present your end user with a live app that they can interact with.

This library provides the tools to build your own generative UI application on top of the Thesys C1 API.

Read more about the C1 API here and about generative UI here.

🤖 Chatbots

📖 Documentation

Streaming a response with FastAPI:

  1. Wrap your route with the with_c1_response decorator.
  2. Use write_content, write_think_item, and write_custom_markdown from thesys_genui_sdk.context to write to the response stream.
  3. Use get_assistant_message from thesys_genui_sdk.context to retrieve the assistant message for the conversation history.
from fastapi import FastAPI, Request

from thesys_genui_sdk.fast_api import with_c1_response
from thesys_genui_sdk.context import write_content, get_assistant_message, write_think_item

app = FastAPI()

@app.post("/generate_ui")
# @with_c1_response creates a C1 response instance and makes it available in
# the context. It expects an async function that uses write_content,
# write_think_item, and write_custom_markdown to write to the response stream.
@with_c1_response()
async def generate_ui(request: Request):
    await generate_llm_response(request)


async def generate_llm_response(request: Request):
    # openai_client is an OpenAI client assumed to be configured elsewhere;
    # pass your own model, messages, and stream=True.
    stream = openai_client.chat.completions.create(...)

    # Write a think item to the response stream.
    await write_think_item("Thinking", "I am thinking about the user's request")

    for chunk in stream:
        delta = chunk.choices[0].delta
        finish_reason = chunk.choices[0].finish_reason

        if delta and delta.content:
            # Write a content chunk to the response stream.
            await write_content(delta.content)

        if finish_reason:
            # Get the assistant message and store it in your conversation history.
            assistant_message_for_history = get_assistant_message()
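Once the stream finishes, the message returned by get_assistant_message can be appended to whatever history store your application uses. A minimal sketch, assuming a simple in-memory store (the ChatHistory class and the message shape are hypothetical, not part of the SDK):

```python
# Hypothetical in-memory history store; not part of thesys_genui_sdk.
class ChatHistory:
    def __init__(self):
        self.messages = []

    def append(self, message: dict) -> None:
        self.messages.append(message)

    def for_llm(self) -> list:
        # Messages in the role/content shape most chat-completion APIs expect.
        return list(self.messages)


history = ChatHistory()
history.append({"role": "user", "content": "Show me a sales dashboard"})
# After the stream ends, store what get_assistant_message() returned:
history.append({"role": "assistant", "content": "<generated C1 UI payload>"})
```

On the next request, history.for_llm() would be sent back to the LLM so the conversation stays coherent across turns.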

Framework-independent response streaming:

  1. Create a C1Response instance.
  2. Create an async task and use the write_content, write_think_item, and write_custom_markdown methods on the C1Response instance to write to the response stream.
  3. Use c1_response.get_assistant_message to retrieve the assistant message for the conversation history.
import asyncio

from thesys_genui_sdk.context import C1Response

c1_response = C1Response()

async def generate_llm_response():
    # Write a think item to the response stream.
    await c1_response.write_think_item("Thinking", "I am thinking about the user's request")

    # Write a markdown chunk to the response stream.
    await c1_response.write_custom_markdown("## Hello, world!")

    # stream is an LLM response stream (e.g. an OpenAI streaming response)
    # created elsewhere.
    for chunk in stream:
        delta = chunk.choices[0].delta
        finish_reason = chunk.choices[0].finish_reason

        if delta and delta.content:
            # Write a content chunk to the response stream.
            await c1_response.write_content(delta.content)

        if finish_reason:
            # Get the assistant message and store it in your conversation history.
            assistant_message_for_history = c1_response.get_assistant_message()

    # End the response stream.
    await c1_response.end()

asyncio.create_task(generate_llm_response())

# Use the stream from c1_response to send to the client.
response_stream = c1_response.stream()
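The pattern above, one task producing chunks while the caller consumes a stream, can be sketched in plain Python with an asyncio.Queue. This MiniResponse class is an illustration of the producer/consumer pattern only, not the SDK's actual C1Response implementation:

```python
import asyncio

_DONE = object()  # sentinel marking the end of the stream


class MiniResponse:
    """Toy stand-in for a streaming response object; not the SDK's C1Response."""

    def __init__(self):
        self._queue: asyncio.Queue = asyncio.Queue()

    async def write_content(self, chunk: str) -> None:
        await self._queue.put(chunk)

    async def end(self) -> None:
        await self._queue.put(_DONE)

    async def stream(self):
        # Yield chunks until the producer signals completion.
        while True:
            item = await self._queue.get()
            if item is _DONE:
                return
            yield item


async def main() -> list:
    response = MiniResponse()

    async def produce():
        for chunk in ["Hello", ", ", "world!"]:
            await response.write_content(chunk)
        await response.end()

    # Producer runs concurrently while the consumer drains the stream.
    asyncio.create_task(produce())
    return [chunk async for chunk in response.stream()]


chunks = asyncio.run(main())
```

The queue decouples the LLM loop from the HTTP layer, which is what lets the same writer code work regardless of which web framework ultimately serves the stream.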

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

thesys_genui_sdk-0.1.2.tar.gz (4.3 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

thesys_genui_sdk-0.1.2-py3-none-any.whl (5.6 kB)

Uploaded Python 3

File details

Details for the file thesys_genui_sdk-0.1.2.tar.gz.

File metadata

  • Download URL: thesys_genui_sdk-0.1.2.tar.gz
  • Upload date:
  • Size: 4.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for thesys_genui_sdk-0.1.2.tar.gz

  • SHA256: 1c1525c0e3c9eff96f440332d921180e279dbd613930954795aaf2b8a63c8602
  • MD5: 79d1497cfcc1a6c3a39e1403e93be5be
  • BLAKE2b-256: f9d2c1d4ac5fa2a8575f68669ede4c5026b361c6012e3af65d85c7e62341cf95

See more details on using hashes here.

Provenance

The following attestation bundles were made for thesys_genui_sdk-0.1.2.tar.gz:

Publisher: publish-pypi-package.yml on thesysdev/genui-sdk

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file thesys_genui_sdk-0.1.2-py3-none-any.whl.

File metadata

File hashes

Hashes for thesys_genui_sdk-0.1.2-py3-none-any.whl

  • SHA256: a0f9a5e5956880f2e37430b3b1fa109dbaec9ade82f416843f6d8496b6b5aa8c
  • MD5: a4ddd2fd1d4dea84d5aebda6396f8027
  • BLAKE2b-256: 09a7c521bae79b61a3e139425dd8f92a4df35ba524e6fe166c6f5c30d470ab8c

Provenance

The following attestation bundles were made for thesys_genui_sdk-0.1.2-py3-none-any.whl:

Publisher: publish-pypi-package.yml on thesysdev/genui-sdk

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
