OpenAI helper wrappers for Sigil Python SDK

Project description

Sigil Python Provider Helper: OpenAI

sigil-sdk-openai exposes strict OpenAI-shaped wrappers and mappers for the Chat Completions, Responses, and Embeddings APIs.

Installation

pip install sigil-sdk sigil-sdk-openai

Public API

  • Chat Completions namespace:

    • chat.completions.create(...)
    • chat.completions.create_async(...)
    • chat.completions.stream(...)
    • chat.completions.stream_async(...)
    • chat.completions.from_request_response(...)
    • chat.completions.from_stream(...)
  • Responses namespace:

    • responses.create(...)
    • responses.create_async(...)
    • responses.stream(...)
    • responses.stream_async(...)
    • responses.from_request_response(...)
    • responses.from_stream(...)
  • Embeddings namespace:

    • embeddings.create(...)
    • embeddings.create_async(...)
    • embeddings.from_request_response(...)

Integration styles

  • Strict wrappers: call OpenAI and record in one step.
  • Manual instrumentation: call OpenAI directly, then map strict OpenAI request/response payloads with from_request_response or from_stream.

Responses-first wrapper example

from openai import OpenAI
from sigil_sdk import Client, ClientConfig
from sigil_sdk_openai import OpenAIOptions, responses

sigil = Client(ClientConfig())
provider = OpenAI()

response = responses.create(
    sigil,
    {
        "model": "gpt-5",
        "instructions": "Be concise",
        "input": "Summarize rollout status in 3 bullets",
        "max_output_tokens": 300,
    },
    lambda request: provider.responses.create(**request),
    OpenAIOptions(conversation_id="conv-1", agent_name="assistant", agent_version="1.0.0"),
)
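The async variants listed under Public API follow the same shape. The sketch below assumes AsyncOpenAI from the openai package and that responses.create_async accepts an awaitable-returning callable; neither is confirmed by this page. Imports are deferred so the sketch stays importable without the SDKs installed.

```python
import asyncio

async def record_async_response():
    # Deferred imports: keeps this sketch importable without the SDKs installed.
    from openai import AsyncOpenAI
    from sigil_sdk import Client, ClientConfig
    from sigil_sdk_openai import OpenAIOptions, responses

    sigil = Client(ClientConfig())
    provider = AsyncOpenAI()

    # Mirrors the synchronous example above; the callable returns an awaitable.
    return await responses.create_async(
        sigil,
        {
            "model": "gpt-5",
            "instructions": "Be concise",
            "input": "Summarize rollout status in 3 bullets",
            "max_output_tokens": 300,
        },
        lambda request: provider.responses.create(**request),
        OpenAIOptions(conversation_id="conv-1", agent_name="assistant", agent_version="1.0.0"),
    )

# In an async application: response = await record_async_response()
# From synchronous code: response = asyncio.run(record_async_response())
```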

Chat Completions stream example

from sigil_sdk_openai import ChatCompletionsStreamSummary, chat

summary = chat.completions.stream(
    sigil,  # reuses the Client from the example above
    {
        "model": "gpt-5",
        "stream": True,
        "messages": [{"role": "user", "content": "Stream a short status update"}],
    },
    # Stub callable for illustration only; a real integration would call the
    # provider's streaming API and collect its events into the summary.
    lambda request: ChatCompletionsStreamSummary(events=[]),
)

Embeddings example

from sigil_sdk_openai import embeddings

embedding_response = embeddings.create(
    sigil,
    {
        "model": "text-embedding-3-small",
        "input": ["hello", "world"],
    },
    lambda request: provider.embeddings.create(**request),
)
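Downstream use of the returned vectors is ordinary vector math and needs nothing from the SDK. A minimal cosine-similarity helper, for illustration only:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm
```

If the wrapper passes through the provider's standard embeddings response (an assumption, not stated on this page), the vectors to compare are at embedding_response.data[i].embedding.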

Manual instrumentation example (strict mapper)

from sigil_sdk import GenerationStart, ModelRef
from sigil_sdk_openai import OpenAIOptions, responses

request = {
    "model": "gpt-5",
    "instructions": "Be concise",
    "input": "Summarize rollout status in 3 bullets",
}
opts = OpenAIOptions(
    conversation_id="conv-1",
    agent_name="assistant",
    agent_version="1.0.0",
)

with sigil.start_generation(
    GenerationStart(
        conversation_id=opts.conversation_id,
        agent_name=opts.agent_name,
        agent_version=opts.agent_version,
        model=ModelRef(provider=opts.provider_name, name=request["model"]),
    )
) as rec:
    try:
        response = provider.responses.create(**request)
        rec.set_result(responses.from_request_response(request, response, opts))
    except Exception as exc:
        rec.set_call_error(exc)
        raise
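If several call sites are instrumented manually, the pattern above can be factored into a helper. This sketch reuses only the names already shown in the example; imports are deferred so it stays importable without the SDKs installed.

```python
def record_response(sigil, provider, request, opts):
    # Same flow as the manual example: open a generation, call the provider,
    # map the strict request/response pair, and record any call error.
    from sigil_sdk import GenerationStart, ModelRef
    from sigil_sdk_openai import responses

    with sigil.start_generation(
        GenerationStart(
            conversation_id=opts.conversation_id,
            agent_name=opts.agent_name,
            agent_version=opts.agent_version,
            model=ModelRef(provider=opts.provider_name, name=request["model"]),
        )
    ) as rec:
        try:
            response = provider.responses.create(**request)
            rec.set_result(responses.from_request_response(request, response, opts))
            return response
        except Exception as exc:
            rec.set_call_error(exc)
            raise
```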

Raw artifacts (debug opt-in)

Raw artifacts are off by default.

Enable with:

OpenAIOptions(raw_artifacts=True)

Artifact names:

  • Chat: openai.chat.request, openai.chat.response, openai.chat.tools, openai.chat.stream_events
  • Responses: openai.responses.request, openai.responses.response, openai.responses.tools, openai.responses.stream_events

Call client.shutdown() during teardown to flush buffered telemetry.
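One way to make that flush reliable is a small try/finally helper. This is plain Python and assumes only the shutdown() method named above:

```python
def run_with_flush(client, fn):
    # Run fn and flush buffered telemetry even if fn raises.
    try:
        return fn()
    finally:
        client.shutdown()
```

For example, run_with_flush(sigil, main) flushes whether main returns or raises.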


Download files


Source Distribution

sigil_sdk_openai-0.2.0.tar.gz (14.2 kB, source)

Built Distribution


sigil_sdk_openai-0.2.0-py3-none-any.whl (10.1 kB, Python 3 wheel)

File details

Details for the file sigil_sdk_openai-0.2.0.tar.gz.

File metadata

  • Download URL: sigil_sdk_openai-0.2.0.tar.gz
  • Upload date:
  • Size: 14.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.13

File hashes

Hashes for sigil_sdk_openai-0.2.0.tar.gz:

  • SHA256: c88c1c5c4b72c6a0fd81b1d0e18ec1e57abca1674249add721ecd3466fbf4cf9
  • MD5: f862c4b5f86d114348c7e4a62008bf09
  • BLAKE2b-256: 81fc738fc61a1b163ca9819b858ebec8d71aa7682e03eee20d659c5e74896430


Provenance

The following attestation bundles were made for sigil_sdk_openai-0.2.0.tar.gz:

Publisher: python-sdks-publish.yml on grafana/sigil-sdk

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file sigil_sdk_openai-0.2.0-py3-none-any.whl.

File hashes

Hashes for sigil_sdk_openai-0.2.0-py3-none-any.whl:

  • SHA256: 3c1558af29b48cee5f536a9bc51f59d5e924e6e1528b98ecf00e465b0817d3ab
  • MD5: 6479b43eb5d4991ef0dda66666cec61b
  • BLAKE2b-256: 3219a35d337a77867fd2f3132c8a6526c3490e399a4aecf2a84ad12b8a0b1306


Provenance

The following attestation bundles were made for sigil_sdk_openai-0.2.0-py3-none-any.whl:

Publisher: python-sdks-publish.yml on grafana/sigil-sdk

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
