OpenAI helper wrappers for Sigil Python SDK

Sigil Python Provider Helper: OpenAI

sigil-sdk-openai exposes strict OpenAI-shaped wrappers and mappers for Chat Completions, Responses, and Embeddings.

Installation

pip install sigil-sdk sigil-sdk-openai

Public API

  • Chat Completions namespace:

    • chat.completions.create(...)
    • chat.completions.create_async(...)
    • chat.completions.stream(...)
    • chat.completions.stream_async(...)
    • chat.completions.from_request_response(...)
    • chat.completions.from_stream(...)
  • Responses namespace:

    • responses.create(...)
    • responses.create_async(...)
    • responses.stream(...)
    • responses.stream_async(...)
    • responses.from_request_response(...)
    • responses.from_stream(...)
  • Embeddings namespace:

    • embeddings.create(...)
    • embeddings.create_async(...)
    • embeddings.from_request_response(...)

Integration styles

  • Strict wrappers: call OpenAI and record telemetry in a single step.
  • Manual instrumentation: call OpenAI directly, then map the strict OpenAI request/response payloads yourself with from_request_response or from_stream.

Responses-first wrapper example

from openai import OpenAI
from sigil_sdk import Client, ClientConfig
from sigil_sdk_openai import OpenAIOptions, responses

sigil = Client(ClientConfig())
provider = OpenAI()

response = responses.create(
    sigil,
    {
        "model": "gpt-5",
        "instructions": "Be concise",
        "input": "Summarize rollout status in 3 bullets",
        "max_output_tokens": 300,
    },
    lambda request: provider.responses.create(**request),
    OpenAIOptions(conversation_id="conv-1", agent_name="assistant", agent_version="1.0.0"),
)
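Each synchronous entry point has an async counterpart (create_async, stream_async). The sketch below shows how create_async might be used, under the assumption that it mirrors create but awaits an async request callable; fake_provider_create and the local create_async are stand-ins so the snippet runs without the SDK or network access.

```python
import asyncio

# Stand-in for an async OpenAI client's responses.create; in real
# code this would perform the network call.
async def fake_provider_create(**request):
    return {"model": request["model"], "output_text": "- rollout ok"}

# Hypothetical shape of responses.create_async: await the request
# callable and hand back its response (telemetry recording elided).
async def create_async(request, call):
    return await call(**request)

async def main():
    request = {"model": "gpt-5", "input": "Summarize rollout status"}
    response = await create_async(request, fake_provider_create)
    return response["output_text"]

print(asyncio.run(main()))  # - rollout ok
```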

Chat Completions stream example

from sigil_sdk_openai import ChatCompletionsStreamSummary, chat

summary = chat.completions.stream(
    sigil,
    {
        "model": "gpt-5",
        "stream": True,
        "messages": [{"role": "user", "content": "Stream a short status update"}],
    },
    # Placeholder consumer: a real callable would invoke
    # provider.chat.completions.create(**request), drain the stream,
    # and return a summary of the observed events.
    lambda request: ChatCompletionsStreamSummary(events=[]),
)
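The consumer callable is expected to drain the provider stream and return a summary. A hypothetical consumer, assuming the chunks follow OpenAI's chat-completions streaming shape (choices[0].delta.content), might collect events and text like this; fake_chunks stands in for a live stream so the snippet runs standalone:

```python
def consume(chunks):
    """Collect streamed delta text from OpenAI-shaped chat chunks."""
    events = []
    text_parts = []
    for chunk in chunks:
        events.append(chunk)
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:
            text_parts.append(delta)
    return events, "".join(text_parts)

# Fake chunks standing in for a live provider stream.
fake_chunks = [
    {"choices": [{"delta": {"content": "Deploy "}}]},
    {"choices": [{"delta": {"content": "is green."}}]},
    {"choices": [{"delta": {}}]},  # final chunk carries no content
]
events, text = consume(fake_chunks)
print(text)  # Deploy is green.
```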

Embeddings example

from sigil_sdk_openai import embeddings

embedding_response = embeddings.create(
    sigil,
    {
        "model": "text-embedding-3-small",
        "input": ["hello", "world"],
    },
    lambda request: provider.embeddings.create(**request),
)
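Assuming the wrapper returns OpenAI's embeddings response unchanged, each vector lives at data[i].embedding. A small helper for pulling the vectors out in input order (shown against a dict payload so it runs standalone):

```python
def extract_vectors(payload):
    """Return embedding vectors, in input order, from an
    OpenAI-shaped embeddings response payload."""
    return [item["embedding"] for item in payload["data"]]

# Fake payload mimicking the OpenAI embeddings response shape.
fake_response = {
    "model": "text-embedding-3-small",
    "data": [
        {"index": 0, "embedding": [0.1, 0.2]},
        {"index": 1, "embedding": [0.3, 0.4]},
    ],
}
print(extract_vectors(fake_response))  # [[0.1, 0.2], [0.3, 0.4]]
```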

Manual instrumentation example (strict mapper)

from sigil_sdk import GenerationStart, ModelRef
from sigil_sdk_openai import OpenAIOptions, responses

# `sigil` and `provider` are the Client and OpenAI instances created above.
request = {
    "model": "gpt-5",
    "instructions": "Be concise",
    "input": "Summarize rollout status in 3 bullets",
}
opts = OpenAIOptions(
    conversation_id="conv-1",
    agent_name="assistant",
    agent_version="1.0.0",
)

with sigil.start_generation(
    GenerationStart(
        conversation_id=opts.conversation_id,
        agent_name=opts.agent_name,
        agent_version=opts.agent_version,
        model=ModelRef(provider=opts.provider_name, name=request["model"]),
    )
) as rec:
    try:
        response = provider.responses.create(**request)
        rec.set_result(responses.from_request_response(request, response, opts))
    except Exception as exc:
        rec.set_call_error(exc)
        raise

Raw artifacts (debug opt-in)

Raw artifacts are off by default.

Enable with:

OpenAIOptions(raw_artifacts=True)

Artifact names:

  • Chat: openai.chat.request, openai.chat.response, openai.chat.tools, openai.chat.stream_events
  • Responses: openai.responses.request, openai.responses.response, openai.responses.tools, openai.responses.stream_events

Call the client's shutdown() method (sigil.shutdown() in the examples above) during teardown to flush buffered telemetry.

