
CLI for OpenAI Structured Output


ostruct-cli


Command-line interface for working with OpenAI models and structured output, powered by the openai-structured library.

Features

  • Generate structured output from natural language using OpenAI models
  • Rich template system for defining output schemas
  • Automatic token counting and context window management
  • Streaming support for real-time output
  • Caching system for cost optimization
  • Secure handling of sensitive data
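Token counting keeps a request from overflowing the model's context window. As an illustration only, here is a minimal sketch of that kind of pre-flight check; the ~4-characters-per-token heuristic is an assumption for demonstration, not the exact tokenizer ostruct uses.

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token for English text.
    Illustrative only; real token counting uses the model's tokenizer."""
    return max(1, len(text) // 4)

def fits_context(prompt: str, context_window: int, reserved_output: int) -> bool:
    """Check whether a prompt leaves room for the reserved output tokens."""
    return estimate_tokens(prompt) + reserved_output <= context_window

prompt = "Extract information about the person: " + "x" * 4000
print(fits_context(prompt, context_window=128_000, reserved_output=4_096))
```

A real implementation would also account for system prompts and message framing overhead, but the budget arithmetic is the same.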

Installation

pip install ostruct-cli

Quick Start

  1. Set your OpenAI API key:
export OPENAI_API_KEY=your-api-key
  2. Create a task template file task.j2:
Extract information about the person: {{ stdin }}
  3. Create a schema file schema.json:
{
  "type": "object",
  "properties": {
    "name": {
      "type": "string",
      "description": "The person's full name"
    },
    "age": {
      "type": "integer",
      "description": "The person's age"
    },
    "occupation": {
      "type": "string",
      "description": "The person's job or profession"
    }
  },
  "required": ["name", "age", "occupation"]
}
  4. Run the CLI:
ostruct run task.j2 schema.json

Or with more options:

ostruct run task.j2 schema.json \
  -f content input.txt \
  -m gpt-4o \
  --sys-prompt "You are an expert content analyzer"

Output:

{
  "name": "John Smith",
  "age": 35,
  "occupation": "software engineer"
}
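The output is guaranteed to conform to the supplied schema. To illustrate what that guarantee means, here is a minimal stdlib check of the example output against the schema's required fields; real validation applies full JSON Schema semantics, so this is only a sketch.

```python
import json

schema = json.loads("""{
  "type": "object",
  "properties": {
    "name": {"type": "string"},
    "age": {"type": "integer"},
    "occupation": {"type": "string"}
  },
  "required": ["name", "age", "occupation"]
}""")

output = json.loads('{"name": "John Smith", "age": 35, "occupation": "software engineer"}')

TYPE_MAP = {"string": str, "integer": int, "object": dict}

def check_required(instance: dict, schema: dict) -> list[str]:
    """Return a list of violations of the schema's required-field constraints."""
    errors = []
    for field in schema.get("required", []):
        if field not in instance:
            errors.append(f"missing required field: {field}")
            continue
        expected = TYPE_MAP[schema["properties"][field]["type"]]
        if not isinstance(instance[field], expected):
            errors.append(f"{field}: expected {expected.__name__}")
    return errors

print(check_required(output, schema))  # an empty list means the output conforms
```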

About Template Files

Template files use the .j2 extension to indicate they contain Jinja2 template syntax. This convention:

  • Enables proper syntax highlighting in most editors
  • Makes it clear the file contains template logic
  • Follows industry standards for Jinja2 templates
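Because templates are standard Jinja2, any Jinja2 renderer can preview them before sending a request. The toy substitution below is a stdlib stand-in for illustration only; real rendering (filters, loops, conditionals) requires Jinja2 itself.

```python
import re

def render_toy(template: str, variables: dict[str, str]) -> str:
    """Toy stand-in for Jinja2's {{ var }} substitution, for illustration only."""
    def repl(match: re.Match) -> str:
        name = match.group(1)
        if name not in variables:
            raise KeyError(f"undefined template variable: {name}")
        return variables[name]
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", repl, template)

print(render_toy("Extract information about the person: {{ stdin }}",
                 {"stdin": "John Smith, 35, software engineer"}))
```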

CLI Options

The CLI revolves around a single subcommand called run. Basic usage:

ostruct run <TASK_TEMPLATE> <SCHEMA_FILE> [OPTIONS]

Common options include:

  • File & Directory Inputs:

    • -f <NAME> <PATH>: Map a single file to a variable name
    • -d <NAME> <DIR>: Map a directory to a variable name
    • -p <NAME> <PATTERN>: Map files matching a glob pattern to a variable name
    • -R, --recursive: Enable recursive directory/pattern scanning
  • Variables:

    • -V name=value: Define a simple string variable
    • -J name='{"key":"value"}': Define a JSON variable
  • Model Parameters:

    • -m, --model MODEL: Select the OpenAI model (supported: gpt-4o, o1, o3-mini)
    • --temperature FLOAT: Set sampling temperature (0.0 to 2.0)
    • --max-output-tokens INT: Set maximum output tokens
    • --top-p FLOAT: Set top-p sampling parameter (0.0 to 1.0)
    • --frequency-penalty FLOAT: Adjust frequency penalty (-2.0 to 2.0)
    • --presence-penalty FLOAT: Adjust presence penalty (-2.0 to 2.0)
    • --reasoning-effort [low|medium|high]: Control model reasoning effort
  • System Prompt:

    • --sys-prompt TEXT: Provide system prompt directly
    • --sys-file FILE: Load system prompt from file
    • --ignore-task-sysprompt: Ignore system prompt in template frontmatter
  • API Configuration:

    • --api-key KEY: OpenAI API key (defaults to OPENAI_API_KEY env var)
    • --timeout FLOAT: API timeout in seconds (default: 60.0)

Debug Options

  • --debug-validation: Show detailed schema validation debugging
  • --debug-openai-stream: Enable low-level debug output for OpenAI streaming
  • --progress-level {none,basic,detailed}: Set progress reporting level
    • none: No progress indicators
    • basic: Show key operation steps (default)
    • detailed: Show all steps with additional info
  • --show-model-schema: Display the generated Pydantic model schema
  • --verbose: Enable verbose logging
  • --dry-run: Validate and render template without making API calls
  • --no-progress: Disable all progress indicators

All debug and error logs are written to:

  • ~/.ostruct/logs/ostruct.log: General application logs
  • ~/.ostruct/logs/openai_stream.log: OpenAI streaming operations logs

For more detailed documentation and examples, visit our documentation.

Development

To contribute or report issues, please visit our GitHub repository.

Development Setup

  1. Clone the repository:
git clone https://github.com/yaniv-golan/ostruct.git
cd ostruct
  2. Install Poetry if you haven't already:
curl -sSL https://install.python-poetry.org | python3 -
  3. Install dependencies:
poetry install
  4. Install openai-structured in editable mode:
poetry add --editable ../openai-structured  # Adjust path as needed
  5. Run tests:
poetry run pytest

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.
