azpaddypy

Azure cloud services SDK with Storage (blob, append blob, file share, queue), Key Vault, Cosmos DB, AI Foundry Projects (agents, deployments, evaluations), Document Intelligence, Speech, OpenTelemetry tracing, AI Foundry GenAI tracing, and builder patterns.

Designed for Python 3.11+ running in Dockerized Azure Function Apps and Web Apps.

Installation

uv add azpaddypy

Quick Start

from azpaddypy import AzureStorage, AzureIdentity, create_azure_storage

# Factory function (cached instances, auto-creates identity)
storage = create_azure_storage(
    account_url="https://myaccount.blob.core.windows.net/",
    service_name="my_service",
)

# Or explicit identity
identity = AzureIdentity(service_name="my_service")
storage = AzureStorage(
    account_url="https://myaccount.blob.core.windows.net/",
    azure_identity=identity,
    enable_file_storage=True,
)

Storage Operations

Blob Storage

# Upload
storage.upload_blob(
    container_name="documents",
    blob_name="report.pdf",
    data=pdf_bytes,
    content_type="application/pdf",
    metadata={"author": "team"},
)

# Download (returns None if not found)
data = storage.download_blob(container_name="documents", blob_name="report.pdf")

# Upload and get SAS URL
from datetime import timedelta

sas_url = storage.upload_blob_with_sas(
    container_name="documents",
    blob_name="report.pdf",
    data=pdf_bytes,
    sas_permission="r",
    sas_expiry_delta=timedelta(hours=3),
)

# List, exists, delete
blobs = storage.list_blobs(container_name="documents", name_starts_with="reports/")
exists = storage.blob_exists(container_name="documents", blob_name="report.pdf")
storage.delete_blob(container_name="documents", blob_name="report.pdf")

# Metadata upsert (merges with existing)
storage.upsert_blob_metadata(
    container_name="documents",
    blob_name="report.pdf",
    metadata={"status": "processed"},
)

# SAS token generation
blob_sas = storage.get_blob_sas(container_name="docs", blob_name="file.pdf")
container_sas = storage.get_container_sas(container_name="docs", permission="r")
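The metadata upsert above merges new keys into the blob's existing metadata rather than replacing it wholesale. The merge semantics can be sketched in pure Python (illustrative only, not the SDK call itself):

```python
def merge_metadata(existing: dict, updates: dict) -> dict:
    """Upsert semantics: updates win on key collisions,
    untouched existing keys are preserved."""
    return {**existing, **updates}

existing = {"author": "team", "status": "draft"}
merged = merge_metadata(existing, {"status": "processed"})
# {'author': 'team', 'status': 'processed'}
```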

Append Blob Storage

Append blobs are optimized for append operations such as logging, auditing, or streaming data. Each append block can be up to 4 MiB. Unlike block blobs, append blobs do not support overwriting existing content.

# Create an empty append blob
storage.create_append_blob(
    container_name="logs",
    blob_name="app-2026-04-05.log",
    content_type="text/plain; charset=utf-8",
    metadata={"source": "web-app"},
)

# Append data blocks
storage.append_block(
    container_name="logs",
    blob_name="app-2026-04-05.log",
    data="2026-04-05T10:00:00Z INFO Application started\n",
)

storage.append_block(
    container_name="logs",
    blob_name="app-2026-04-05.log",
    data=b"2026-04-05T10:00:01Z DEBUG Connection pool initialized\n",
)

# Convenience: create-if-missing + append in one call
storage.append_blob_from_text(
    container_name="logs",
    blob_name="app-2026-04-05.log",
    text="2026-04-05T10:05:00Z WARN High memory usage\n",
    create_if_not_exists=True,  # default, skips creation if blob already exists
)
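A small stdlib helper for producing timestamped lines like the ones appended above (`format_log_line` is illustrative, not part of azpaddypy):

```python
from datetime import datetime, timezone

def format_log_line(level: str, message: str) -> str:
    """Build one newline-terminated log line suitable for append_block."""
    ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return f"{ts} {level} {message}\n"

line = format_log_line("INFO", "Application started")
```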

File Share Storage

Requires enable_file_storage=True. Uses Azure File Shares (SMB/NFS), not blob storage.

storage = AzureStorage(
    account_url="https://myaccount.blob.core.windows.net/",
    azure_identity=identity,
    enable_file_storage=True,
)

# Upload (auto-creates parent directories)
storage.upload_share_file(
    share_name="myshare",
    file_path="reports/2026/q1.pdf",
    data=pdf_bytes,
    content_type="application/pdf",
)

# Download (returns None if not found)
data = storage.download_share_file(share_name="myshare", file_path="reports/2026/q1.pdf")

# List files and directories
items = storage.list_share_files(share_name="myshare", directory_path="reports/2026")
# Returns: [{"name": "q1.pdf", "is_directory": False, "size": 1024}, ...]

# Exists, properties, delete
exists = storage.share_file_exists(share_name="myshare", file_path="reports/2026/q1.pdf")
props = storage.get_share_file_properties(share_name="myshare", file_path="reports/2026/q1.pdf")
storage.delete_share_file(share_name="myshare", file_path="reports/2026/q1.pdf")

# Directory management
storage.create_share_directory(share_name="myshare", directory_path="reports/2026/q2")
storage.delete_share_directory(share_name="myshare", directory_path="reports/2026/q2")

# Metadata upsert (merges with existing)
storage.upsert_share_file_metadata(
    share_name="myshare",
    file_path="reports/2026/q1.pdf",
    metadata={"reviewed": "true"},
)
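`upload_share_file` auto-creates parent directories, which for a nested path must exist shallowest-first. A stdlib sketch of the directories implied by a share file path (illustrative of the idea, not the SDK's internal logic):

```python
from pathlib import PurePosixPath

def parent_directories(file_path: str) -> list[str]:
    """Parent directories of a share file path, shallowest first."""
    parents = list(PurePosixPath(file_path).parents)[:-1]  # drop the '.' root
    return [str(p) for p in reversed(parents)]

dirs = parent_directories("reports/2026/q1.pdf")
# ['reports', 'reports/2026']
```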

Queue Storage

# Send
storage.send_message(
    queue_name="tasks",
    content='{"task": "process"}',
    visibility_timeout=30,
    time_to_live=3600,
)

# Receive
messages = storage.receive_messages(queue_name="tasks", messages_per_page=5)
for msg in messages:
    print(msg["id"], msg["content"])
    storage.delete_message(
        queue_name="tasks",
        message_id=msg["id"],
        pop_receipt=msg["pop_receipt"],
    )
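Queue message content is a string, so structured payloads are typically round-tripped through JSON (a stdlib sketch; the `task` schema here is illustrative):

```python
import json

# What you would pass to send_message(content=...)
payload = {"task": "process", "blob": "documents/report.pdf"}
content = json.dumps(payload)

# On the receiving side, decode msg["content"] back into a dict
decoded = json.loads(content)
assert decoded["task"] == "process"
```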

Builder Pattern

For complex multi-resource setups:

from azpaddypy.builder import AzureManagementBuilder, AzureResourceBuilder
from azpaddypy.builder.directors import ConfigurationSetupDirector

# One-liner setup with director
config = ConfigurationSetupDirector.default_setup(
    service_name="my_app",
    service_version="1.0.0",
)

# Or step-by-step with builders
mgmt = (
    AzureManagementBuilder()
    .with_logger(service_name="my_app")
    .with_identity()
    .with_keyvault(vault_url="https://myvault.vault.azure.net/")
    .build()
)

# env_config: environment configuration (e.g. produced by your configuration setup)
resources = (
    AzureResourceBuilder(mgmt, env_config)
    .with_storage("default", enable_blob=True, enable_queue=True)
    .with_storage("archive", account_url="https://archive.blob.core.windows.net/", enable_file=True)
    .with_ai_project(endpoint="https://my-ai.services.ai.azure.com/api/projects/my-project")
    .with_document_intelligence(endpoint="https://my-ai.cognitiveservices.azure.com/")
    .with_speech(
        region="westeurope",
        resource_id="/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.CognitiveServices/accounts/<ai-services>",
    )
    .build()
)

storage = resources.get_storage("default")
archive = resources.get_storage("archive")
ai_project = resources.get_ai_project("default")
doc_intel = resources.get_document_intelligence("default")
speech = resources.get_speech("default")

Note: Document Intelligence and Speech are configured exclusively through mgmt_config (typically from Key Vault secrets). They have no environment-variable fallbacks — pass endpoint (and for Speech, region + resource_id) explicitly.

Key Vault

from azpaddypy import AzureKeyVault, create_azure_keyvault

kv = create_azure_keyvault(
    vault_url="https://myvault.vault.azure.net/",
    service_name="my_service",
)

secret = kv.get_secret("database-connection-string")

AI Foundry Projects

Manage Azure AI Foundry agents, deployments, and connections with integrated OpenAI client support.

from azpaddypy import AzureAIProject, create_azure_ai_project

# Factory function (cached instances, auto-creates identity)
ai = create_azure_ai_project(
    endpoint="https://my-ai.services.ai.azure.com/api/projects/my-project",
    service_name="my_service",
)

# List deployments
deployments = ai.list_deployments()

# Get an authenticated OpenAI client
openai_client = ai.get_openai_client()

# Agent operations
from azure.ai.projects.models import PromptAgentDefinition

agent = ai.create_agent(
    agent_name="my-agent",
    definition=PromptAgentDefinition(model="gpt-4o", instructions="You are helpful"),
)

agents = ai.list_agents()
details = ai.get_agent(agent_name="my-agent")

# Invoke an agent via OpenAI responses API
result = ai.invoke_agent(agent_name="my-agent", user_message="Hello")
print(result["response"])

# Connections
connections = ai.list_connections()
connection = ai.get_connection(name="my-openai-connection", include_credentials=True)

# Fetch the Application Insights connection string linked to this Foundry project
# (requires the linkage to have been configured once via portal -> Project -> Tracing).
# Returns None if no App Insights resource is linked.
conn_str = ai.get_application_insights_connection_string()

Feature Flags

ai = AzureAIProject(
    endpoint="https://my-ai.services.ai.azure.com/api/projects/my-project",
    azure_identity=identity,
    enable_agents=True,       # Agent CRUD + invocation
    enable_deployments=True,  # List/get model deployments
    enable_connections=False,  # Disable connection enumeration
)

Evaluation

Run AI quality and safety evaluations against your model outputs. Results appear in the AI Foundry portal under Evaluation with average scores and per-row details.

from mgmt_config import ai_projects

ai = ai_projects["aiservices"]

# One-shot: create eval + run + poll + return results
result = ai.evaluate(
    name="my-eval",
    evaluator_names=[
        # Quality (need a judge model)
        "builtin.coherence",
        "builtin.groundedness",
        "builtin.relevance",
        # Safety (no judge model needed)
        "builtin.violence",
        "builtin.hate_unfairness",
    ],
    data=[
        {
            "query": "What is Azure?",
            "response": "Azure is Microsoft's cloud platform.",
            "context": "Azure documentation overview.",
        },
        {
            "query": "How do I deploy?",
            "response": "Use az webapp deploy.",
            "context": "CLI deployment docs.",
        },
    ],
    judge_model="gpt-4o",  # required for quality evaluators
    cleanup=True,           # delete eval definition after getting results
)

print(result["status"])        # "completed" or "failed"
print(result["report_url"])    # portal link to the evaluation report
print(result["result_counts"]) # aggregate pass/fail counts
for item in result["output_items"]:
    print(item)                # per-row evaluator scores
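`result_counts` is an aggregate; a pure-Python sketch of deriving a pass rate from it (the `passed`/`failed` key names are assumptions — check the actual payload your run returns):

```python
def pass_rate(result_counts: dict) -> float:
    """Fraction of evaluated rows that passed, 0.0 if the counts are empty."""
    passed = result_counts.get("passed", 0)
    failed = result_counts.get("failed", 0)
    total = passed + failed
    return passed / total if total else 0.0

rate = pass_rate({"passed": 8, "failed": 2})
# 0.8
```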

Available built-in evaluators:

| Category | Evaluators | Judge model required |
| --- | --- | --- |
| Quality | builtin.coherence, builtin.fluency, builtin.groundedness, builtin.relevance, builtin.similarity, builtin.task_adherence | Yes |
| NLP | builtin.f1_score, builtin.bleu_score, builtin.rouge_score, builtin.meteor_score, builtin.gleu_score | No |
| Safety | builtin.violence, builtin.sexual, builtin.self_harm, builtin.hate_unfairness, builtin.prohibited_actions, builtin.sensitive_data_leakage | No |

For finer control, use the individual methods: create_evaluation(), run_evaluation(), get_evaluation_run(), get_evaluation_run_output_items(), list_evaluations(), delete_evaluation().

Document Intelligence

Analyze documents using Azure AI Document Intelligence (formerly Form Recognizer). Shares the same Cognitive Services / AI Services account as AI Foundry.

from azpaddypy import AzureDocumentIntelligence, create_azure_document_intelligence

di = create_azure_document_intelligence(
    endpoint="https://my-ai.cognitiveservices.azure.com/",
    service_name="my_service",
    enable_administration=True,  # opt in to model management
)

# Analyze from URL with a prebuilt model
result = di.analyze_document_from_url(
    model_id="prebuilt-layout",
    url_source="https://example.com/invoice.pdf",
)
print(f"Pages: {len(result.pages)}")

# Analyze from bytes
with open("contract.pdf", "rb") as f:
    result = di.analyze_document_from_bytes(model_id="prebuilt-read", document=f.read())

# Manage custom models
models = di.list_models()
model = di.get_model(model_id="my-custom-model")
di.delete_model(model_id="my-custom-model")

Speech

Azure Cognitive Services Speech with Entra ID authentication. Unlike most Azure SDKs, the Speech SDK does not accept TokenCredential directly — it requires the special aad#<resource-id>#<token> auth string. azpaddypy handles token acquisition, format, and refresh.

You must provide both the Azure region and the full ARM resource ID of the Speech / AI Services account.
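The authorization string azpaddypy builds for you has the shape described above. As a pure-string sketch (the token value is a placeholder — real tokens come from Entra ID, and azpaddypy handles acquisition and refresh):

```python
def speech_auth_string(resource_id: str, aad_token: str) -> str:
    """The 'aad#<resource-id>#<token>' authorization string the Speech SDK expects."""
    return f"aad#{resource_id}#{aad_token}"

auth = speech_auth_string(
    "/subscriptions/<sub>/resourceGroups/<rg>"
    "/providers/Microsoft.CognitiveServices/accounts/<ai-services>",
    "<entra-access-token>",
)
```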

from azpaddypy import AzureSpeech, create_azure_speech

speech = create_azure_speech(
    region="westeurope",
    resource_id=(
        "/subscriptions/<sub>/resourceGroups/<rg>"
        "/providers/Microsoft.CognitiveServices/accounts/<ai-services>"
    ),
    service_name="my_service",
    default_speech_synthesis_voice_name="en-US-JennyNeural",
)

# Synthesize text to in-memory bytes (server / container scenarios)
audio: bytes = speech.synthesize_text_to_bytes("Hello from azpaddypy")

# Synthesize and write directly to a file
speech.synthesize_text_to_file("Hello from azpaddypy", file_path="out.wav")

# Synthesize and play on the default speaker (interactive / local dev)
speech.synthesize_text_to_speaker("Hello from azpaddypy")

Custom synthesizers and recognizers

For full control (streaming, recognition, custom audio configs, event callbacks), get a fresh SpeechConfig and build your own:

import azure.cognitiveservices.speech as speechsdk

speech_config = speech.get_speech_config()
synthesizer = speechsdk.SpeechSynthesizer(
    speech_config=speech_config,
    audio_config=speechsdk.audio.AudioOutputConfig(filename="out.wav"),
)
synthesizer.speak_text_async("Hello from azpaddypy").get()

# Refresh AAD token on long-lived synthesizers/recognizers
# (Speech tokens expire after ~10 minutes)
speech.refresh_authorization_token(synthesizer)

Observability

All operations include OpenTelemetry spans and structured logging via Application Insights.

storage = AzureStorage(
    account_url="https://myaccount.blob.core.windows.net/",
    azure_identity=identity,
    connection_string="InstrumentationKey=...",  # App Insights
)

# Correlation tracking across distributed calls
storage.set_correlation_id("request-abc-123")

AI Foundry Tracing

AzureLogger installs two instrumentors on initialization so that traces from both direct OpenAI SDK calls and AI Foundry agent invocations flow into Application Insights and the AI Foundry Tracing UI:

  1. opentelemetry-instrumentation-openai-v2 — instruments openai.chat.completions.create(), embeddings.create(), etc. Emits OTel GenAI spans with model, token usage, latency, and optional prompt/completion content.
  2. azure.ai.projects.telemetry.AIProjectInstrumentor — instruments the OpenAI Responses API so that agent_reference calls attach agent metadata, tool-call spans, and the gen_ai.* attributes the AI Foundry Tracing UI groups traces on. Requires the AZURE_EXPERIMENTAL_ENABLE_GENAI_TRACING=true feature gate, which AzureLogger sets for you when install_ai_project_instrumentor=True (the default).

Both instrumentors are harmless for non-Foundry apps: if your code never calls responses.create() or chat.completions.create(), neither instrumentor has any runtime effect.

Configuration kwargs

Pass these to AzureLogger, create_app_logger, create_function_logger, or AzureManagementBuilder.with_logger():

| Kwarg | Default | Effect |
| --- | --- | --- |
| capture_gen_ai_content | False | When True, sets both OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT (honored by opentelemetry-instrumentation-openai-v2 and AIProjectInstrumentor) and AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED (honored by azure-ai-inference's AIInferenceInstrumentor) before the instrumentors activate. Off by default so prompts/completions don't ship to App Insights unexpectedly; opt in per deployment. |
| install_ai_project_instrumentor | True | Installs AIProjectInstrumentor and sets AZURE_EXPERIMENTAL_ENABLE_GENAI_TRACING=true. Required for the AI Foundry Tracing UI to render agent metadata, tool-call spans, and gen_ai.agent.* attributes on Responses API traces. |
| enable_gen_ai_trace_propagation | True | Sets AZURE_TRACING_GEN_AI_ENABLE_TRACE_CONTEXT_PROPAGATION=true so outbound OpenAI SDK HTTP calls carry W3C traceparent/tracestate headers; server-side spans in Foundry correlate with your client spans. |
| scope_logs_to_service | False | When True, passes logger_name=service_name to configure_azure_monitor so only your service's logger tree ships to App Insights. Default False preserves the legacy root-logger behavior (every azure.* SDK log gets exported). Flip to True to reduce noise once any KQL queries/workbooks depending on azure.* SDK logs are retired or scoped accordingly. |
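Per the table above, the logger flips these environment variables before activating the instrumentors. The net effect on the process is equivalent to setting them yourself (a sketch of the resulting state for a default logger with capture_gen_ai_content=True, not azpaddypy's internals):

```python
import os

# What AzureLogger(capture_gen_ai_content=True) with default kwargs
# leaves in the environment before the instrumentors activate
os.environ["AZURE_EXPERIMENTAL_ENABLE_GENAI_TRACING"] = "true"
os.environ["AZURE_TRACING_GEN_AI_ENABLE_TRACE_CONTEXT_PROPAGATION"] = "true"
os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "true"
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
```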

Note on log_result=True: this decorator flag controls only whether the function's return value is recorded as a span attribute. It does not enable GenAI prompt/completion content capture — that is the process-level capture_gen_ai_content kwarg. Earlier versions coupled these, which was racy (the env var was flipped per-function after the instrumentor had already been activated) and silently ineffective (the wrong env var name was being set for openai-v2). The coupling has been removed.

Example: tracing a chat completion

from mgmt_config import logger, ai_projects, log_execution_config

@logger.trace_function()
async def generate_summary(document_text: str) -> str:
    ai_project = ai_projects.get("aiservices")
    openai_client = ai_project.get_openai_client()

    response = openai_client.chat.completions.create(
        model="gpt-5",
        messages=[
            {"role": "system", "content": "Summarize the document."},
            {"role": "user", "content": document_text},
        ],
    )
    return response.choices[0].message.content

The trace in AI Foundry shows a parent span for generate_summary with a child chat gpt-5 span containing model, token counts, latency, and (when the logger was constructed with capture_gen_ai_content=True) the full prompt/completion content.

Example: tracing a Foundry agent invocation

# Agent trace flows through AIProjectInstrumentor -> AI Foundry Tracing UI
result = ai_projects["aiservices"].invoke_agent(
    agent_name="doc-summarizer",
    user_message="Summarize the attached document",
)

The trace shows AzureAIProject.invoke_agent with gen_ai.system=az.ai.projects, gen_ai.operation.name=invoke_agent, gen_ai.agent.name=doc-summarizer, and (via AIProjectInstrumentor) nested spans for the Responses API call, tool calls, and model invocation — all grouped under the agent in the Foundry Tracing UI.

Linking Application Insights to your Foundry project

The AI Foundry Tracing tab in the portal reads directly from the Application Insights resource linked to your Foundry project (Project → Tracing → "Manage data source"). Setting APPLICATIONINSIGHTS_CONNECTION_STRING is not enough on its own — the resource must be linked once via the portal for the Tracing UI to find the traces.

If your app wants to fetch the linked connection string at runtime instead of hand-wiring it:

from mgmt_config import ai_projects

ai = ai_projects["aiservices"]
conn_str = ai.get_application_insights_connection_string()  # returns None if not linked

This wraps azure-ai-projects' client.telemetry.get_application_insights_connection_string() and is the recommended bootstrap path when you want the logger to always target whatever App Insights is currently linked to your Foundry resource.

Feature Flags

Enable only the storage services you need:

| Flag | Default | Service |
| --- | --- | --- |
| enable_blob_storage | True | BlobServiceClient |
| enable_file_storage | False | ShareServiceClient (requires token_intent="backup" RBAC) |
| enable_queue_storage | True | QueueServiceClient |

Dependencies

  • azure-storage-blob - Blob operations
  • azure-storage-file-share - File share operations
  • azure-storage-queue - Queue operations
  • azure-identity - Credential management
  • azure-keyvault-secrets / keys / certificates - Key Vault
  • azure-cosmos - Cosmos DB
  • azure-ai-projects - AI Foundry Projects (agents, deployments, connections, AIProjectInstrumentor for Responses API tracing)
  • azure-ai-documentintelligence - Document Intelligence (analyze, model management)
  • azure-cognitiveservices-speech - Speech (synthesis, recognition with Entra ID)
  • azure-monitor-opentelemetry - Telemetry
  • opentelemetry-instrumentation-openai-v2 - AI Foundry tracing for OpenAI SDK calls
