
A lightweight Pydantic-based prompting library that supports sharing of oncology-related LLM prompts


prompt-spec

A lightweight, LinkML-aligned prompt and schema management toolkit for clinical NLP & LLM workflows.

prompt-spec provides:

  • Structured prompt templates
  • LinkML → Pydantic auto-generation for LLM output schemas
  • Validation of few-shot examples within prompts
  • A shared format for prompt libraries across collaborating groups

This toolkit is intentionally minimal and designed for stability, reuse, and strong typing when integrating with LLM libraries such as pydantic-instructor.


Project Structure

prompt-spec/
├── cli.py                        ← Main command line interface
├── core/
│   └── prompt_template.py        ← Prompt class utilities
├── engines/
│   ├── __init__.py
│   └── instructor_engine.py      ← Wrapper function for integrating prompts with ontoGPT via Instructor interface
├── output_models/
│   └── condition_model.yaml      ← LinkML schema for defining model outputs - create new endpoint definitions here
├── generated_models/
│   └── (auto-generated pydantic models)
└── prompts/
    └── condition_prompt.yaml     ← This is where the actual prompt definitions exist

Installation

uv pip install -e .


Usage

1. Define LinkML output models

Place all LinkML schemas inside prompt-spec/output_models/.

Example: condition_model.yaml

name: ConditionList
id: https://example.org/condition_model
prefixes:
  linkml: https://w3id.org/linkml/
imports:
  - linkml:types
default_range: string

classes:
  Condition:
    attributes:
      label:
        range: string
      verbatim_name:
        range: string
      codable_name:
        range: string
      who_diagnosed:
        range: string
      is_negated:
        range: boolean

  ConditionList:
    attributes:
      conditions:
        multivalued: true
        range: Condition

2. Generate Pydantic models from LinkML

prompt-spec build-models

This reads all *.yaml schemas in prompt-spec/output_models/ and generates Pydantic classes into prompt-spec/generated_models/.

After running this, you will have files such as:

  • prompt-spec/generated_models/ConditionList.py

Alternatively, regenerate the models for a single schema at a time:

prompt-spec generate-pydantic-from-linkml output_models/condition_model.yaml

3. Populate a prompt template

prompt-spec create-empty-prompt condition_model prompts/condition_prompt.yaml

This produces:

output_model: condition_model
instruction: "<<< fill in your system prompt here >>>"
examples:
  example_1:
    input: "<<< sample input text here >>>"
    output:
      conditions: []
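For illustration, a filled-in prompt might look like the following; the instruction text and example values here are invented:

```yaml
output_model: condition_model
instruction: "Extract every medical condition mentioned in the clinical note."
examples:
  example_1:
    input: "Patient denies chest pain; history of hypertension."
    output:
      conditions:
        - label: chest pain
          verbatim_name: chest pain
          is_negated: true
        - label: hypertension
          verbatim_name: hypertension
          is_negated: false
```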

4. Validate a prompt against its template

prompt-spec validate-prompt prompts/condition_prompt.yaml

