
A lightweight Pydantic-wrapper prompting library that supports sharing of oncology-related LLM prompts

Project description

prompt-spec

A lightweight, LinkML-aligned prompt and schema management toolkit for clinical NLP & LLM workflows.

prompt-spec provides:

  • Structured prompt templates
  • LinkML → Pydantic auto-generation for LLM output schemas
  • Validation of few-shot examples within prompts
  • A shared format for prompt libraries across collaborating groups

This toolkit is intentionally minimal and designed for stability, reuse, and strong typing when integrating with LLM libraries such as pydantic-instructor.


Project Structure

prompt-spec/
├── cli.py                        ← Main command line interface
├── core/
│   └── prompt_template.py        ← Prompt class utilities
├── engines/
│   ├── __init__.py
│   └── instructor_engine.py      ← Wrapper function for integrating prompts with ontoGPT via Instructor interface
├── output_models/
│   └── condition_model.yaml      ← LinkML schema for defining model outputs - create new endpoint definitions here
├── generated_models/
│   └── (auto-generated pydantic models)
└── prompts/
    └── condition_prompt.yaml     ← This is where the actual prompt definitions exist

Installation

uv pip install -e .


Usage

1. Define LinkML output models

Place all LinkML schemas inside prompt-spec/output_models/.

Example: condition_model.yaml

name: ConditionList
id: https://example.org/condition_model
prefixes:
  linkml: https://w3id.org/linkml/
default_range: string

classes:
  Condition:
    attributes:
      label: string
      verbatim_name: string
      codable_name: string
      who_diagnosed: string
      is_negated: boolean

  ConditionList:
    attributes:
      conditions:
        multivalued: true
        range: Condition
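For orientation, the generated classes look roughly like the following. This is a hand-written sketch, not actual generator output; the exact code depends on the LinkML generator and Pydantic versions, and every field is optional here because the schema does not mark any slot as required.

```python
from typing import Optional
from pydantic import BaseModel


class Condition(BaseModel):
    label: Optional[str] = None
    verbatim_name: Optional[str] = None
    codable_name: Optional[str] = None
    who_diagnosed: Optional[str] = None
    is_negated: Optional[bool] = None


class ConditionList(BaseModel):
    # multivalued: true with range Condition becomes a list of Condition
    conditions: list[Condition] = []
```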

2. Generate Pydantic models from LinkML

prompt-spec build-models

This reads all *.yaml schemas in prompt-spec/output_models/ and generates Pydantic classes into prompt-spec/generated_models/.

After running this, you will have files such as:

  • prompt-spec/generated_models/ConditionList.py

Alternatively, regenerate a single schema at a time:

prompt-spec generate-pydantic-from-linkml output_models/condition_model.yaml

3. Populate prompt template

prompt-spec create-empty-prompt condition_model prompts/condition_prompt.yaml

This produces:

output_model: condition_model
instruction: "<<< fill in your system prompt here >>>"
examples:
  example_1:
    input: "<<< sample input text here >>>"
    output:
      conditions: []

4. Validate a prompt against its template

prompt-spec validate-prompt condition_prompt.yaml
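A validated prompt can then be handed to an LLM client. The following is a hypothetical sketch of turning a template into chat messages, with few-shot examples interleaved as user/assistant turns; build_messages is illustrative only and the actual engines/instructor_engine.py wrapper may work differently.

```python
def build_messages(prompt: dict) -> list[dict]:
    """Turn a prompt-spec style template into chat messages,
    interleaving few-shot examples as user/assistant turns."""
    messages = [{"role": "system", "content": prompt["instruction"]}]
    for ex in prompt["examples"].values():
        messages.append({"role": "user", "content": ex["input"]})
        messages.append({"role": "assistant", "content": str(ex["output"])})
    return messages


prompt = {
    "instruction": "Extract conditions from the clinical note.",
    "examples": {
        "example_1": {
            "input": "Patient denies chest pain.",
            "output": {"conditions": [{"label": "chest pain", "is_negated": True}]},
        }
    },
}

messages = build_messages(prompt)
```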

Download files

Download the file for your platform.

Source Distribution

prompt_spec-0.1.3.tar.gz (12.5 kB)


Built Distribution


prompt_spec-0.1.3-py3-none-any.whl (13.7 kB)


File details

Details for the file prompt_spec-0.1.3.tar.gz.

File metadata

  • Download URL: prompt_spec-0.1.3.tar.gz
  • Upload date:
  • Size: 12.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for prompt_spec-0.1.3.tar.gz

  • SHA256: 75e8a48965828ee4abebc243ec37c5d8622a390dd8f35d4727598b620eb71c57
  • MD5: 31b5f715c34e8084809588ff3fc16ae4
  • BLAKE2b-256: 454a855df4ec7cc08a208b2c4868c7a052073792ee49fce90d433cc48f78edb1

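To check a downloaded file against the digests above, you can compute its SHA-256 locally, e.g. with Python's standard library:

```python
import hashlib


def sha256_of(path: str) -> str:
    """Hex SHA-256 digest of a file, read in chunks to bound memory use."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

Compare sha256_of("prompt_spec-0.1.3.tar.gz") to the published SHA256 digest; any mismatch means the file is not the one that was released.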

Provenance

The following attestation bundles were made for prompt_spec-0.1.3.tar.gz:

Publisher: python-publish.yml on AustralianCancerDataNetwork/prompt-spec

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file prompt_spec-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: prompt_spec-0.1.3-py3-none-any.whl
  • Upload date:
  • Size: 13.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for prompt_spec-0.1.3-py3-none-any.whl

  • SHA256: 46a5118571a323f5c0a034462cf7d4290ee0a69a1b8ed03f7ef7520f37a9dcb1
  • MD5: 7edbe3f389fc91b7f7a6749dc6ec6108
  • BLAKE2b-256: d57b1e9e22dbcab372b8c0820008d39d9b00e5fc22d643f9372cf8af286d4cf7


Provenance

The following attestation bundles were made for prompt_spec-0.1.3-py3-none-any.whl:

Publisher: python-publish.yml on AustralianCancerDataNetwork/prompt-spec

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
