prompt-spec

A lightweight, LinkML-aligned prompt and schema management toolkit for clinical NLP and LLM workflows, built as a thin Pydantic wrapper to support sharing of oncology-related LLM prompts.

prompt-spec provides:

  • Structured prompt templates
  • LinkML → Pydantic auto-generation for LLM output schemas
  • Validation of few-shot examples within prompts
  • A shared format for prompt libraries across collaborating groups

This toolkit is intentionally minimal and designed for stability, reuse, and strong typing when integrating with LLM libraries such as pydantic-instructor.


Project Structure

prompt-spec/
├── cli.py                        ← Main command line interface
├── core/
│   └── prompt_template.py        ← Prompt class utilities
├── engines/
│   ├── __init__.py
│   └── instructor_engine.py      ← Wrapper function for integrating prompts with ontoGPT via Instructor interface
├── output_models/
│   └── condition_model.yaml      ← LinkML schema for defining model outputs - create new endpoint definitions here
├── generated_models/
│   └── (auto-generated pydantic models)
└── prompts/
    └── condition_prompt.yaml     ← This is where the actual prompt definitions exist

Installation

uv pip install -e .


Usage

1. Defining LinkML output models

Place all LinkML schemas inside prompt-spec/output_models/.

Example: condition_model.yaml

name: ConditionList
id: https://example.org/condition_model
prefixes:
  linkml: https://w3id.org/linkml/
default_range: string

classes:
  Condition:
    attributes:
      label: string
      verbatim_name: string
      codable_name: string
      who_diagnosed: string
      is_negated: boolean

  ConditionList:
    attributes:
      conditions:
        multivalued: true
        range: Condition
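Assuming the generator maps each LinkML attribute to an optional field of the stated range, the emitted Pydantic classes would look roughly like this (an illustrative sketch, not the generator's verbatim output):

```python
# Rough sketch of the Pydantic models that would be generated from
# condition_model.yaml. Field names mirror the LinkML attributes;
# the real generated code may differ in defaults and base classes.
from typing import List, Optional

from pydantic import BaseModel


class Condition(BaseModel):
    label: Optional[str] = None
    verbatim_name: Optional[str] = None
    codable_name: Optional[str] = None
    who_diagnosed: Optional[str] = None
    is_negated: Optional[bool] = None


class ConditionList(BaseModel):
    conditions: List[Condition] = []


# Nested dicts (e.g. parsed LLM JSON output) coerce into the models:
result = ConditionList(
    conditions=[{"label": "hypertension", "is_negated": False}]
)
print(result.conditions[0].label)
```

Because ConditionList is a plain Pydantic model, it can be passed directly as a response model to typed LLM clients.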

2. Generate Pydantic models from LinkML

prompt-spec build-models

This reads all *.yaml schemas in prompt-spec/output_models/ and generates Pydantic classes into prompt-spec/generated_models/.

After running this, you will have files such as:

  • prompt-spec/generated_models/ConditionList.py

Alternatively, regenerate the models for a single schema at a time:

prompt-spec generate-pydantic-from-linkml output_models/condition_model.yaml

3. Populate prompt template

prompt-spec create-empty-prompt condition_model prompts/condition_prompt.yaml

This produces:

output_model: condition_model
instruction: "<<< fill in your system prompt here >>>"
examples:
  example_1:
    input: "<<< sample input text here >>>"
    output:
      conditions: []
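A filled-in prompt might then look like the following (illustrative instruction and example values only; they are not part of the distribution):

```yaml
output_model: condition_model
instruction: >
  Extract all medical conditions mentioned in the clinical note.
  For each condition, record whether the mention is negated.
examples:
  example_1:
    input: "The patient denies any history of diabetes."
    output:
      conditions:
        - label: "diabetes"
          verbatim_name: "diabetes"
          is_negated: true
```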

4. Validate a prompt against its template

prompt-spec validate-prompt condition_prompt.yaml
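Conceptually, validation loads the prompt file and checks that each few-shot example's output parses under the prompt's declared output model. A minimal sketch of that check, using stand-in models and an inline dict in place of the parsed YAML (the CLI's actual behavior and error reporting may differ):

```python
# Sketch of prompt validation: every example's output must parse
# under the generated output model declared by the prompt.
from typing import List, Optional

from pydantic import BaseModel, ValidationError


class Condition(BaseModel):  # stand-in for the generated model
    label: Optional[str] = None
    is_negated: Optional[bool] = None


class ConditionList(BaseModel):
    conditions: List[Condition] = []


# Stand-in for the parsed contents of condition_prompt.yaml.
prompt = {
    "output_model": "condition_model",
    "instruction": "Extract conditions from the note.",
    "examples": {
        "example_1": {
            "input": "No evidence of diabetes.",
            "output": {"conditions": [{"label": "diabetes", "is_negated": True}]},
        },
        "bad_example": {
            "input": "Patient reports mild cough.",
            "output": {"conditions": [{"is_negated": "not-a-bool"}]},
        },
    },
}

failures = []
for name, example in prompt["examples"].items():
    try:
        ConditionList(**example["output"])
    except ValidationError:
        failures.append(name)

print(failures)  # names of examples whose output failed validation
```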



Download files

Download the file for your platform.

Source Distribution

prompt_spec-0.1.2.tar.gz (12.4 kB)


Built Distribution


prompt_spec-0.1.2-py3-none-any.whl (13.5 kB)


File details

Details for the file prompt_spec-0.1.2.tar.gz.

File metadata

  • Download URL: prompt_spec-0.1.2.tar.gz
  • Upload date:
  • Size: 12.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for prompt_spec-0.1.2.tar.gz:

  • SHA256: abbbf20642120a8022ca17103f4662fd0066b1dfb25a81ab682cc13f7d4ebd4f
  • MD5: 9fbfb84cbc35e5c097504beade08d5ce
  • BLAKE2b-256: f36ba2623d13b8be34b3affdeb6930a7d207bbcbed91a9cd45bb10106b502a68


Provenance

The following attestation bundles were made for prompt_spec-0.1.2.tar.gz:

Publisher: python-publish.yml on AustralianCancerDataNetwork/prompt-spec

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file prompt_spec-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: prompt_spec-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 13.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for prompt_spec-0.1.2-py3-none-any.whl:

  • SHA256: 068cb3b482c5b3fad55e83fad3aa24142050c97f595b9ab3d4485f89214112b4
  • MD5: 8b583eb2f01c152a38a0e418cfafc495
  • BLAKE2b-256: 69309f98a10438ef16dfe27e6637b579d807c1f7d9a11a6537d973ae784285cb


Provenance

The following attestation bundles were made for prompt_spec-0.1.2-py3-none-any.whl:

Publisher: python-publish.yml on AustralianCancerDataNetwork/prompt-spec

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
