# Sigil Python Provider Helper: OpenAI

OpenAI helper wrappers for the Sigil Python SDK.

`sigil-sdk-openai` exposes strict OpenAI-shaped wrappers and mappers for the Chat Completions, Responses, and Embeddings APIs.
## Installation

```shell
pip install sigil-sdk sigil-sdk-openai
```
## Public API

- Chat Completions namespace:
  - `chat.completions.create(...)`
  - `chat.completions.create_async(...)`
  - `chat.completions.stream(...)`
  - `chat.completions.stream_async(...)`
  - `chat.completions.from_request_response(...)`
  - `chat.completions.from_stream(...)`
- Responses namespace:
  - `responses.create(...)`
  - `responses.create_async(...)`
  - `responses.stream(...)`
  - `responses.stream_async(...)`
  - `responses.from_request_response(...)`
  - `responses.from_stream(...)`
- Embeddings namespace:
  - `embeddings.create(...)`
  - `embeddings.create_async(...)`
  - `embeddings.from_request_response(...)`
## Integration styles

- Strict wrappers: call OpenAI and record in one step.
- Manual instrumentation: call OpenAI directly, then map strict OpenAI request/response payloads with `from_request_response` or `from_stream`.
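Both styles share one calling contract: the wrapper receives a plain request dict plus a callable that performs the actual provider call. A minimal sketch of that callable contract, with a stub standing in for the OpenAI client (the stub is illustrative, not SDK API):

```python
# The call argument passed to the strict wrappers is any callable that takes
# the request dict and returns the provider response. This stub stands in for
# `lambda request: provider.responses.create(**request)`.
def stub_call(request: dict) -> dict:
    return {"model": request["model"], "output_text": "stubbed reply"}

request = {"model": "gpt-5", "input": "ping"}
response = stub_call(request)
print(response["output_text"])  # stubbed reply
```

Because the provider call is injected this way, the same wrapper works with a real client, a retrying wrapper, or a test double.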
## Responses-first wrapper example

```python
from openai import OpenAI

from sigil_sdk import Client, ClientConfig
from sigil_sdk_openai import OpenAIOptions, responses

sigil = Client(ClientConfig())
provider = OpenAI()

response = responses.create(
    sigil,
    {
        "model": "gpt-5",
        "instructions": "Be concise",
        "input": "Summarize rollout status in 3 bullets",
        "max_output_tokens": 300,
    },
    lambda request: provider.responses.create(**request),
    OpenAIOptions(conversation_id="conv-1", agent_name="assistant", agent_version="1.0.0"),
)
```
## Chat Completions stream example

```python
from sigil_sdk_openai import ChatCompletionsStreamSummary, chat

summary = chat.completions.stream(
    sigil,
    {
        "model": "gpt-5",
        "stream": True,
        "messages": [{"role": "user", "content": "Stream a short status update"}],
    },
    # Placeholder consumer: in practice, drive the provider's stream here
    # and return a summary of its events.
    lambda request: ChatCompletionsStreamSummary(events=[]),
)
```
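The stream callable is responsible for driving the provider's event stream to completion and handing back a summary. As a sketch of what such a consumer does, here is a stand-in that folds text deltas out of a list of event dicts (the event names and shapes below are illustrative, not the OpenAI wire format):

```python
def consume_stream(events):
    # Collect text deltas from an iterable of event dicts, keeping the raw
    # events alongside the joined text for later mapping.
    events = list(events)
    text = "".join(
        e.get("delta", "") for e in events if e.get("type") == "content.delta"
    )
    return {"events": events, "text": text}

fake_events = [
    {"type": "content.delta", "delta": "Deploy is "},
    {"type": "content.delta", "delta": "green."},
    {"type": "done"},
]
print(consume_stream(fake_events)["text"])  # Deploy is green.
```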
## Embeddings example

```python
from sigil_sdk_openai import embeddings

embedding_response = embeddings.create(
    sigil,
    {
        "model": "text-embedding-3-small",
        "input": ["hello", "world"],
    },
    lambda request: provider.embeddings.create(**request),
)
```
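Since the wrappers are strict OpenAI-shaped, downstream code can work with the returned vectors directly. For instance, cosine similarity between two embedding vectors (a generic sketch, independent of the SDK):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(round(cosine([1.0, 0.0], [1.0, 1.0]), 3))  # 0.707
```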
## Manual instrumentation example (strict mapper)

```python
from sigil_sdk import GenerationStart, ModelRef
from sigil_sdk_openai import OpenAIOptions, responses

request = {
    "model": "gpt-5",
    "instructions": "Be concise",
    "input": "Summarize rollout status in 3 bullets",
}
opts = OpenAIOptions(
    conversation_id="conv-1",
    agent_name="assistant",
    agent_version="1.0.0",
)

with sigil.start_generation(
    GenerationStart(
        conversation_id=opts.conversation_id,
        agent_name=opts.agent_name,
        agent_version=opts.agent_version,
        model=ModelRef(provider=opts.provider_name, name=request["model"]),
    )
) as rec:
    try:
        response = provider.responses.create(**request)
        rec.set_result(responses.from_request_response(request, response, opts))
    except Exception as exc:
        rec.set_call_error(exc)
        raise
```
## Raw artifacts (debug opt-in)

Raw artifacts are off by default. Enable them with:

```python
OpenAIOptions(raw_artifacts=True)
```

Artifact names:

- Chat: `openai.chat.request`, `openai.chat.response`, `openai.chat.tools`, `openai.chat.stream_events`
- Responses: `openai.responses.request`, `openai.responses.response`, `openai.responses.tools`, `openai.responses.stream_events`

Call `client.shutdown()` during teardown to flush buffered telemetry.
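One way to make that flush reliable is to register shutdown as an interpreter exit hook. Sketched below with a stub client so the pattern is self-contained; `sigil` from the examples above plays the client's role in real code:

```python
import atexit

class StubClient:
    # Stand-in for the SDK client, only to show the teardown hook.
    def __init__(self):
        self.flushed = False

    def shutdown(self):
        self.flushed = True

client = StubClient()
# Flush buffered telemetry at interpreter exit, even if teardown code
# elsewhere is skipped.
atexit.register(client.shutdown)
```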