A sample dataset processor for evaluating datasets.

Project description

OAI Dataset Processor

OAI Dataset Processor is a modular framework for processing large datasets using OpenAI-compatible endpoints. It provides SQL-based job persistence, worker-limited task distribution, and JSON schema validation.

Installation

pip install oai-dataset-processor

Key Features

  • Job Persistence: Uses SQLite by default, configurable to any SQLAlchemy database
  • Bulk Processing: Process multiple samples through OpenAI-compatible endpoints
  • Async Execution: Semaphore-based worker limits for efficient job execution
  • JSON Schema Validation: Enforce structured outputs using JSON schemas
  • Progress Monitoring: Live progress bar for async tasks
  • Extensibility: Easy to extend for custom storage or processing logic
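
The "Async Execution" bullet can be sketched with plain asyncio. This is an illustrative pattern under stated assumptions, not the library's actual internals: the function names are hypothetical, and asyncio.sleep stands in for the real endpoint call.

```python
# Illustrative sketch of semaphore-based worker limiting; not the
# library's actual internals. Names here are hypothetical.
import asyncio

async def process_sample(sample: str, sem: asyncio.Semaphore) -> str:
    async with sem:  # at most `workers` samples in flight at once
        await asyncio.sleep(0)  # stand-in for the real endpoint call
        return sample.upper()

async def run_all(samples: list[str], workers: int) -> list[str]:
    sem = asyncio.Semaphore(workers)
    # gather preserves input order regardless of completion order
    return await asyncio.gather(*(process_sample(s, sem) for s in samples))

results = asyncio.run(run_all(["a", "b", "c"], workers=2))
print(results)  # ['A', 'B', 'C']
```

The semaphore is what makes `workers=20` a hard cap: every task acquires it before calling the endpoint, so no more than 20 requests are ever outstanding.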

Quick Start

from dataset_processor import OpenAIDatasetProcessor, create_runner_sample
from pydantic import BaseModel

# Define output schema
class SampleResponse(BaseModel):
    grade: int
    coherence: int

# Prepare samples
samples = [
    "The quick brown fox jumps over the lazy dog.",
    "What day today?",
    "The illusion of knowledge is the barrier to discovery.",
    "gpus go burrr"
]

job_samples = [
    create_runner_sample(
        job_id="job_123",
        model_name="gpt-4",
        instructions="Grade the sentence for grammar and coherence (1-10 each)",
        input_data=sample,
        output_json_schema=SampleResponse.model_json_schema(),
        sample_id=idx
    ) for idx, sample in enumerate(samples)
]

# Process samples
processor = OpenAIDatasetProcessor(
    base_url="YOUR_BASE_URL_HERE",
    api_key="YOUR_API_KEY_HERE",
    workers=20
)

processor.ingest_samples(job_samples)
results = processor.run_job("job_123")

# Export results
results.to_jsonl("output_results.jsonl")
print(processor.get_job_status("job_123"))
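
Since pandas is already a dependency, the exported JSON Lines file can be inspected directly. The field names below are illustrative assumptions about the output format, not the library's documented schema; the example writes its own small file so it is self-contained:

```python
# Hedged sketch: inspecting a JSON Lines results file with pandas.
# The row fields here are assumed for illustration.
import json
import pandas as pd

rows = [
    {"sample_id": 0, "grade": 9, "coherence": 8},
    {"sample_id": 1, "grade": 4, "coherence": 5},
]
with open("output_results.jsonl", "w") as f:
    for row in rows:
        f.write(json.dumps(row) + "\n")  # one JSON object per line

# lines=True tells pandas to parse the file as JSON Lines
df = pd.read_json("output_results.jsonl", lines=True)
print(df["grade"].mean())  # 6.5
```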

Configuration

  • Database: Default sqlite:///datasetrunner.sqlite. Configure via db_url in OpenAIDatasetProcessor
  • Parallelism: Set concurrent workers via the workers parameter
  • Schema Validation: Define output schemas using Pydantic models
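
The persistence model behind the Database setting can be sketched with the standard-library sqlite3 module. The library itself goes through SQLAlchemy, and the table and column names below are hypothetical; the point is only why SQL-backed job state lets an interrupted job resume:

```python
# Illustrative sketch of SQLite-backed job persistence; table and
# column names are hypothetical, not the library's actual schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE samples (job_id TEXT, sample_id INTEGER, status TEXT, result TEXT)"
)
conn.execute(
    "INSERT INTO samples VALUES (?, ?, ?, ?)",
    ("job_123", 0, "completed", '{"grade": 9}'),
)
# A resumed job only re-runs samples that never finished.
pending = conn.execute(
    "SELECT COUNT(*) FROM samples WHERE job_id = ? AND status != 'completed'",
    ("job_123",),
).fetchone()[0]
print(pending)  # 0
```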

Dependencies

  • openai
  • tqdm
  • pandas
  • sqlalchemy
  • pydantic

Contributing

Contributions are welcome! Please submit PRs for features, optimizations, or documentation.

Download files

Download the file for your platform.

Source Distribution

oai_dataset_processor-0.1.2.tar.gz (12.7 kB)

Built Distribution

oai_dataset_processor-0.1.2-py3-none-any.whl (13.2 kB)

File details

Details for the file oai_dataset_processor-0.1.2.tar.gz.

File metadata

  • Download URL: oai_dataset_processor-0.1.2.tar.gz
  • Upload date:
  • Size: 12.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.12.7

File hashes

Hashes for oai_dataset_processor-0.1.2.tar.gz
  • SHA256: 411b051adfa26bdb9c082e0ef6b76087e127af1c74e05a1aa90d106d09262dfe
  • MD5: 2ccd87c95330f300711aa0f09d116b43
  • BLAKE2b-256: 654a46be4a7d954bf0fbd58def39cc736f9aa1a89229fb2d2a4ddef846710034

File details

Details for the file oai_dataset_processor-0.1.2-py3-none-any.whl.

File hashes

Hashes for oai_dataset_processor-0.1.2-py3-none-any.whl
  • SHA256: c97e0575e5d21c90108510bd311c4b12cd3c5ead4e59ff164a9c24a891a50d16
  • MD5: 76406bf45a196efb38729a165370c510
  • BLAKE2b-256: 7afe59a90f53c9ebb6cc737901433254a06c2c6131e83ddf61c59f5c8c271d9a
