Orac
Orac is a lightweight, YAML-driven framework for working with OpenAI-compatible LLM APIs. It provides clean abstractions, command-line integration, structured parameter handling, and support for both local and remote file attachments.
Features
- Prompt-as-config: Define entire LLM tasks in YAML, including prompt text, parameters, default values, model settings, and file attachments.
- Hierarchical configuration: Three-layer config system (base → prompt → runtime) with deep merging for flexible overrides.
- Templated inputs: Use `${variable}` placeholders in prompt and system prompt fields.
- File support: Attach local or remote files (e.g., images, documents) via `files:` or `file_urls:` in YAML or CLI flags.
- Command-line and Python API: Use either the CLI tool or the `LLMWrapper` class in code.
- Runtime configuration overrides: Override model settings, API keys, generation options, and safety filters from the CLI or programmatically.
- Structured output support: Request `application/json` responses or validate against a JSON Schema.
- Parameter validation: Automatically convert and validate inputs by type.
- Logging: Logs all operations to file and provides optional verbose console output.
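The `${variable}` placeholder form matches Python's `string.Template` syntax. As a minimal illustration of how placeholder filling works (a sketch of the idea, not Orac's internal code):

```python
from string import Template

# Render a prompt template by filling its ${variable} placeholders.
prompt = Template("What is the capital of ${country}?")
print(prompt.safe_substitute(country="Japan"))
# What is the capital of Japan?
```

`safe_substitute` leaves unknown placeholders intact instead of raising, which is convenient when some parameters have defaults.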
Installation
Option 1: Using `requirements.txt`

```shell
pip install -r requirements.txt
```

Option 2: Manual installation

```shell
pip install google-generativeai openai PyYAML python-dotenv loguru
```
Configuration
Environment Variables
Orac supports configuration through environment variables. You can either set them directly or use a .env file:
1. Copy the example environment file:

   ```shell
   cp .env.example .env
   ```

2. Edit `.env` with your settings:

   ```shell
   # API Keys
   GOOGLE_API_KEY=your_google_api_key_here
   OPENAI_API_KEY=your_openai_api_key_here

   # Configuration overrides (optional)
   ORAC_DEFAULT_MODEL_NAME=gemini-2.0-flash
   ORAC_LOG_FILE=./llm.log
   ```

3. Or set environment variables directly:

   ```shell
   export ORAC_LLM_PROVIDER="google"
   export GOOGLE_API_KEY="your_api_key_here"
   export ORAC_DEFAULT_MODEL_NAME="gemini-2.0-flash"
   ```
Choosing an LLM Provider
Orac requires explicit provider selection. You must specify which LLM provider to use either via environment variable or CLI flag:
| Provider | `ORAC_LLM_PROVIDER` | API Key Environment Variable | Default Base URL |
|---|---|---|---|
| Google Gemini | `google` | `GOOGLE_API_KEY` | `https://generativelanguage.googleapis.com/v1beta/openai/` |
| OpenAI | `openai` | `OPENAI_API_KEY` | `https://api.openai.com/v1/` |
| Anthropic | `anthropic` | `ANTHROPIC_API_KEY` | `https://api.anthropic.com/v1/` |
| Azure OpenAI | `azure` | `AZURE_OPENAI_KEY` | `${AZURE_OPENAI_BASE}` (user-set) |
| OpenRouter | `openrouter` | `OPENROUTER_API_KEY` | `https://openrouter.ai/api/v1/` |
| Custom | `custom` | user picks | user sets via `--base-url` |
Examples:
```shell
# Using Google Gemini
export ORAC_LLM_PROVIDER=google
export GOOGLE_API_KEY=your_google_api_key
python -m orac capital --country France

# Using OpenAI
export ORAC_LLM_PROVIDER=openai
export OPENAI_API_KEY=your_openai_api_key
python -m orac capital --country Spain

# Using OpenRouter (access to multiple models)
export ORAC_LLM_PROVIDER=openrouter
export OPENROUTER_API_KEY=your_openrouter_api_key
python -m orac capital --country Japan

# Using CLI flags instead of environment variables
python -m orac capital --provider google --api-key your_api_key --country Italy

# Using a custom endpoint
python -m orac capital --provider custom --base-url https://my-custom-api.com/v1/ --api-key your_key --country Germany
```
Configurable Environment Variables
All default settings can be overridden with environment variables using the `ORAC_` prefix:

- `ORAC_LLM_PROVIDER` - Required: LLM provider selection (`google`|`openai`|`anthropic`|`azure`|`openrouter`|`custom`)
- `ORAC_DEFAULT_MODEL_NAME` - Default LLM model
- `ORAC_DEFAULT_PROMPTS_DIR` - Directory for prompt files
- `ORAC_DEFAULT_CONFIG_FILE` - Path to config YAML
- `ORAC_DOWNLOAD_DIR` - Temp directory for file downloads
- `ORAC_LOG_FILE` - Log file location
Configuration Hierarchy
Orac uses a layered configuration system, allowing for flexible and powerful control over your prompts. Settings are resolved with the following order of precedence (where higher numbers override lower ones):
1. Base Configuration (`orac/config.yaml`): The default settings for the entire project. This file is included with the `orac` package and provides sensible defaults for `model_name`, `generation_config`, and `safety_settings`. You can edit it directly in your site-packages or provide your own via a custom script.
2. Prompt Configuration (`prompts/your_prompt.yaml`): Any setting defined in a specific prompt's YAML file overrides the base configuration. This is the primary way to customize a single task. For example, you can set a lower `temperature` for a factual prompt or a different `model_name` for a complex one.
3. Runtime Overrides (CLI / Python API): Settings provided directly at runtime, such as the `--model-name` flag in the CLI or a `generation_config` dictionary passed to the `LLMWrapper` constructor, always take the highest precedence and override all other configurations.
Example Override
If `orac/config.yaml` has:

```yaml
# orac/config.yaml
generation_config:
  temperature: 0.7
```

And your prompt has:

```yaml
# prompts/recipe.yaml
prompt: "Give me a recipe for ${dish}"
generation_config:
  temperature: 0.2  # Override for more deterministic recipes
```

Running `orac recipe` will use a temperature of 0.2.
Running `orac recipe --generation-config '{"temperature": 0.9}'` will use a temperature of 0.9.
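The layered resolution can be pictured as a recursive dictionary merge in which later layers win on conflicts. The following is a simplified illustration of the precedence rules, not Orac's actual implementation:

```python
def deep_merge(base, override):
    """Recursively merge override into base; override wins on conflicting keys."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

base_cfg = {"generation_config": {"temperature": 0.7, "max_tokens": 300}}   # orac/config.yaml
prompt_cfg = {"generation_config": {"temperature": 0.2}}                     # prompts/recipe.yaml
runtime_cfg = {"generation_config": {"temperature": 0.9}}                    # --generation-config flag

# base -> prompt -> runtime, with deep merging at each step
resolved = deep_merge(deep_merge(base_cfg, prompt_cfg), runtime_cfg)
print(resolved["generation_config"])
# {'temperature': 0.9, 'max_tokens': 300}
```

Note that `max_tokens` survives from the base layer even though both overrides touch `generation_config`; that is what deep merging buys over a flat dictionary update.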
Example Usage
1. Create a YAML prompt
Save the following to `prompts/capital.yaml`:

```yaml
prompt: "What is the capital of ${country}?"
parameters:
  - name: country
    description: Country name
    default: France
```
2. Run from Python
```python
from orac import LLMWrapper

llm = LLMWrapper("capital")
print(llm.completion())  # Defaults to France
print(llm.completion(country="Japan"))
```
3. Run from CLI
```shell
orac capital
orac capital --country Japan
orac capital --verbose
orac capital --info
```
4. Advanced examples
```shell
# Override model and config
orac capital --country "Canada" \
  --model-name "gemini-2.0-flash" \
  --generation-config '{"temperature": 0.4}'

# Structured JSON response
orac recipe --json-output

# Schema validation
orac capital --country "Germany" \
  --response-schema schemas/capital.schema.json

# Attach local and remote files
orac paper2audio \
  --file reports/report.pdf \
  --file-url https://example.com/image.jpg
```
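The `--response-schema` flag takes a JSON Schema file. A minimal `schemas/capital.schema.json` might look like this (an illustrative sketch; the property names are assumptions, not part of Orac):

```json
{
  "type": "object",
  "properties": {
    "capital": { "type": "string" }
  },
  "required": ["capital"]
}
```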
YAML Prompt Reference
Basic YAML
```yaml
prompt: "Translate the following text: ${text}"
parameters:
  - name: text
    type: string
    required: true
```
Additional Options
```yaml
model_name: gemini-2.0-flash
api_key: ${OPENAI_API_KEY}

generation_config:
  temperature: 0.5
  max_tokens: 300

safety_settings:
  - category: HARM_CATEGORY_HARASSMENT
    threshold: BLOCK_NONE

response_mime_type: application/json
response_schema:
  type: object
  properties:
    translation: { type: string }

files:
  - data/*.pdf
file_urls:
  - https://example.com/image.jpg

require_file: true
```
Supported Parameter Types
- `string`
- `int`
- `float`
- `bool`
- `list` (comma-separated values)
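Conversion of CLI strings into these types can be sketched as a simple lookup table (a hypothetical illustration; the function and table names are not Orac's internals):

```python
# Map each declared parameter type to a converter from the raw CLI string.
CONVERTERS = {
    "string": str,
    "int": int,
    "float": float,
    "bool": lambda v: str(v).lower() in ("1", "true", "yes"),
    "list": lambda v: [item.strip() for item in str(v).split(",")],
}

def convert(value, type_name):
    """Coerce a raw CLI value according to its declared parameter type."""
    return CONVERTERS[type_name](value)

print(convert("1, 2, 3", "list"))  # ['1', '2', '3']
print(convert("true", "bool"))     # True
print(convert("0.5", "float"))     # 0.5
```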
CLI Options
```shell
orac <prompt_name> [--parameter-name VALUE ...] [options]
```
Global Flags
- `--info`: Show parameter metadata
- `--verbose`, `-v`: Enable verbose logging
- `--prompts-dir DIR`: Use custom prompt directory
- `--model-name MODEL`
- `--api-key KEY`
- `--generation-config JSON`
- `--safety-settings JSON`
- `--file FILE`
- `--file-url URL`
- `--json-output`
- `--response-schema FILE`
- `--output FILE`, `-o`
Logging
Orac provides comprehensive logging with two output modes:
Default Mode (Quiet)
- Only shows LLM responses and critical errors
- All detailed logging goes to file only
- Perfect for clean integration and scripting
Verbose Mode
- Shows detailed operation logs on console
- Includes timestamps, function names, and colorized output
- Enable with the `--verbose` or `-v` flag
Log Configuration
- File logging: All activity logged to `llm.log` (configurable via `ORAC_LOG_FILE`)
- Rotation: 10 MB max file size, 7 days retention
- Levels: DEBUG level in files, INFO+ in console (verbose mode)
Usage Examples
```shell
# Quiet mode (default) - only shows LLM response
orac capital --country France

# Verbose mode - shows detailed logging
orac capital --country Spain --verbose

# Check recent logs
tail -f llm.log
```
To configure logging programmatically:
```python
from orac.logger import configure_console_logging

configure_console_logging(verbose=True)
```
Development & Testing
To run the test suite:

```shell
python test.py
```