A sample dataset processor for evaluating datasets.

Project description

OAI Dataset Processor

OAI Dataset Processor is a modular framework for processing large datasets using OpenAI-compatible endpoints. It provides SQL-based job persistence, worker-limited task distribution, and JSON schema validation.

Installation

pip install oai-dataset-processor

Key Features

  • Job Persistence: Uses SQLite by default, configurable to any SQLAlchemy database
  • Bulk Processing: Process multiple samples through OpenAI-compatible endpoints
  • Async Execution: Semaphore-based worker limits for efficient job execution
  • JSON Schema Validation: Enforce structured outputs using JSON schemas
  • Progress Monitoring: Live progress bar for async tasks
  • Extensibility: Easy to extend for custom storage or processing logic
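
The "semaphore-based worker limits" above follow the standard asyncio pattern: a semaphore caps how many coroutines run at once. A minimal stdlib sketch of that pattern (not the library's actual implementation; the function and field names here are illustrative):

```python
import asyncio

async def run_with_worker_limit(samples, workers):
    # A semaphore caps how many coroutines run concurrently;
    # this is the general pattern, not the library's real code.
    sem = asyncio.Semaphore(workers)

    async def process_one(sample):
        async with sem:
            # Stand-in for the real per-sample API call.
            await asyncio.sleep(0)
            return {"input": sample, "done": True}

    # gather() preserves input order in its results.
    return await asyncio.gather(*(process_one(s) for s in samples))

results = asyncio.run(run_with_worker_limit(["a", "b", "c"], workers=2))
```

With `workers=2`, at most two samples are in flight at any moment, regardless of how many are queued.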

Quick Start

from dataset_processor import OpenAIDatasetProcessor, create_runner_sample
from pydantic import BaseModel

# Define output schema
class SampleResponse(BaseModel):
    grade: int
    coherence: int

# Prepare samples
samples = [
    "The quick brown fox jumps over the lazy dog.",
    "What day today?",
    "The illusion of knowledge is the barrier to discovery.",
    "gpus go burrr"
]

job_samples = [
    create_runner_sample(
        job_id="job_123",
        model_name="gpt-4",
        instructions="Grade the sentence for grammar and coherence (1-10 each)",
        input_data=sample,
        output_json_schema=SampleResponse.model_json_schema(),
        sample_id=idx
    ) for idx, sample in enumerate(samples)
]

# Process samples
processor = OpenAIDatasetProcessor(
    base_url="YOUR_BASE_URL_HERE",
    api_key="YOUR_API_KEY_HERE",
    workers=20
)

processor.ingest_samples(job_samples)
results = processor.run_job("job_123")

# Export results
results.to_jsonl("output_results.jsonl")
print(processor.get_job_status("job_123"))
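
The exported JSONL file holds one JSON object per line, so it can be read back with the stdlib alone. A self-contained sketch (the record fields shown are illustrative; the actual field names depend on the library's export format):

```python
import json

def read_jsonl(path):
    # One JSON object per line; skip blank lines.
    with open(path) as f:
        return [json.loads(line) for line in f if line.strip()]

# Write a stand-in file first so the sketch is runnable on its own;
# the field names below are hypothetical, not the library's schema.
with open("output_results.jsonl", "w") as f:
    f.write(json.dumps({"sample_id": 0, "grade": 9, "coherence": 8}) + "\n")
    f.write(json.dumps({"sample_id": 1, "grade": 4, "coherence": 6}) + "\n")

records = read_jsonl("output_results.jsonl")
```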

Configuration

  • Database: Default sqlite:///datasetrunner.sqlite. Configure via db_url in OpenAIDatasetProcessor
  • Parallelism: Set concurrent workers via the workers parameter
  • Schema Validation: Define output schemas using Pydantic models
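
Per the list above, pointing the processor at a different SQLAlchemy database is a matter of passing db_url. A configuration sketch (the PostgreSQL URL and credentials are placeholders; any SQLAlchemy connection URL should work):

```python
from dataset_processor import OpenAIDatasetProcessor

# db_url defaults to sqlite:///datasetrunner.sqlite; the PostgreSQL
# URL below is a placeholder, not a working connection string.
processor = OpenAIDatasetProcessor(
    base_url="YOUR_BASE_URL_HERE",
    api_key="YOUR_API_KEY_HERE",
    workers=20,
    db_url="postgresql://user:password@localhost/datasets",
)
```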

Dependencies

  • openai
  • tqdm
  • pandas
  • sqlalchemy
  • pydantic

Contributing

Contributions welcome! Please submit PRs for features, optimizations, or documentation.

Download files

Download the file for your platform.

Source Distribution

oai_dataset_processor-0.1.3.tar.gz (12.3 kB)

Built Distribution

oai_dataset_processor-0.1.3-py3-none-any.whl (12.9 kB)

File details

Details for the file oai_dataset_processor-0.1.3.tar.gz.

File metadata

  • Download URL: oai_dataset_processor-0.1.3.tar.gz
  • Size: 12.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.12.7

File hashes

  • SHA256: 7d51ab4e47204af0f85e00062ce15000c322b09009ff60cced68a65577ac43f5
  • MD5: 9defa0dbb5db5bb05a0f40356b3adc5a
  • BLAKE2b-256: 7f36f7514bc66253ecca5224a3e2a46288d61e2cc4d702ed3030ef36c27f3d18

File details

Details for the file oai_dataset_processor-0.1.3-py3-none-any.whl.

File hashes

  • SHA256: dc380bb2c22983587c25559b16078c33ee7977df354b34101d5c89e9ca1a1fed
  • MD5: 87e9979a6b3cda37a8ff8a099f75ca28
  • BLAKE2b-256: 4655d5861201fd69c05b100093cd90e63b311335c49ac3e26427b000955e8c1c
