
A lightweight Pydantic-wrapper prompting library for sharing oncology-related LLM prompts.

Project description

prompt-spec

A lightweight, LinkML-aligned prompt and schema management toolkit for clinical NLP & LLM workflows.

prompt-spec provides:

  • Structured prompt templates
  • LinkML → Pydantic auto-generation for LLM output schemas
  • Validation of few-shot examples within prompts
  • A shared format for prompt libraries across collaborating groups

This toolkit is intentionally minimal and designed for stability, reuse, and strong typing when integrating with LLM libraries such as pydantic-instructor.


Project Structure

prompt-spec/
├── cli.py                        ← Main command line interface
├── core/
│   └── prompt_template.py        ← Prompt class utilities
├── engines/
│   ├── __init__.py
│   └── instructor_engine.py      ← Wrapper function for integrating prompts with ontoGPT via the Instructor interface
├── output_models/
│   └── condition_model.yaml      ← LinkML schema for defining model outputs - create new endpoint definitions here
├── generated_models/
│   └── (auto-generated pydantic models)
└── prompts/
    └── condition_prompt.yaml     ← This is where the actual prompt definitions exist

Installation

uv pip install -e .


Usage

1. Defining LinkML output models

Place all LinkML schemas inside prompt-spec/output_models/.

Example: condition_model.yaml

id: https://example.org/condition_model
name: ConditionList
prefixes:
  linkml: https://w3id.org/linkml/
imports:
  - linkml:types
default_range: string

classes:
  Condition:
    attributes:
      label:
      verbatim_name:
      codable_name:
      who_diagnosed:
      is_negated:
        range: boolean

  ConditionList:
    attributes:
      conditions:
        multivalued: true
        range: Condition

2. Generate Pydantic models from LinkML

prompt-spec build-models

This reads all *.yaml schemas in prompt-spec/output_models/ and generates Pydantic classes into prompt-spec/generated_models/.

After running this, you will have files such as:

  • prompt-spec/generated_models/ConditionList.py

Alternatively, to regenerate models for a single schema at a time:

prompt-spec generate-pydantic-from-linkml output_models/condition_model.yaml
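The generated classes are ordinary Pydantic models. As an illustration only, a file such as prompt-spec/generated_models/ConditionList.py might look roughly like the sketch below; the actual output of the LinkML generator may differ in naming and detail:

```python
# Hypothetical sketch of an auto-generated module. Field names follow the
# LinkML schema above; the real generator output may differ in detail.
from typing import List, Optional

from pydantic import BaseModel


class Condition(BaseModel):
    label: Optional[str] = None
    verbatim_name: Optional[str] = None
    codable_name: Optional[str] = None
    who_diagnosed: Optional[str] = None
    is_negated: Optional[bool] = None


class ConditionList(BaseModel):
    conditions: List[Condition] = []
```

Because these are plain Pydantic classes, they can be passed directly as response-model targets to Instructor-style LLM calls.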

3. Populate prompt template

prompt-spec create-empty-prompt condition_model prompts/condition_prompt.yaml

This produces:

output_model: condition_model
instruction: "<<< fill in your system prompt here >>>"
examples:
  example_1:
    input: "<<< sample input text here >>>"
    output:
      conditions: []
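Once filled in, a prompt might look like the following. The instruction text and examples here are purely illustrative, not part of the package:

```yaml
output_model: condition_model
instruction: >
  Extract every medical condition mentioned in the input text and return
  a ConditionList. Mark negated mentions with is_negated: true.
examples:
  example_1:
    input: "Patient denies chest pain; history of type 2 diabetes."
    output:
      conditions:
        - verbatim_name: "chest pain"
          is_negated: true
        - verbatim_name: "type 2 diabetes"
          is_negated: false
```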

4. Validate a prompt against its template

prompt-spec validate-prompt condition_prompt.yaml
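Conceptually, validation parses the prompt YAML and checks each example's output against the generated Pydantic model. A minimal sketch of that idea, using inline stand-ins for the generated classes (the real CLI resolves them from generated_models/, and its behaviour may differ):

```python
# Minimal sketch of prompt validation: every example output in the prompt
# YAML is validated against the Pydantic output model. Inline stand-in
# models are used here; the real tool loads generated ones.
from typing import List, Optional

import yaml
from pydantic import BaseModel, ValidationError


class Condition(BaseModel):
    verbatim_name: Optional[str] = None
    is_negated: Optional[bool] = None


class ConditionList(BaseModel):
    conditions: List[Condition] = []


def validate_prompt(prompt_yaml: str) -> List[str]:
    """Return a list of error messages; an empty list means the prompt is valid."""
    prompt = yaml.safe_load(prompt_yaml)
    errors = []
    for name, example in prompt.get("examples", {}).items():
        try:
            ConditionList.model_validate(example["output"])
        except ValidationError as exc:
            errors.append(f"{name}: {exc}")
    return errors


prompt_text = """
output_model: condition_model
instruction: "Extract conditions."
examples:
  example_1:
    input: "No evidence of pneumonia."
    output:
      conditions:
        - verbatim_name: pneumonia
          is_negated: true
"""
print(validate_prompt(prompt_text))  # [] when every example validates
```

Keeping few-shot examples schema-valid in this way prevents a prompt from silently teaching the model an output shape the downstream parser will reject.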

Download files

Download the file for your platform.

Source Distribution

prompt_spec-0.1.1.tar.gz (12.4 kB)


Built Distribution


prompt_spec-0.1.1-py3-none-any.whl (13.4 kB)


File details

Details for the file prompt_spec-0.1.1.tar.gz.

File metadata

  • Download URL: prompt_spec-0.1.1.tar.gz
  • Size: 12.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for prompt_spec-0.1.1.tar.gz:

  • SHA256: 08efe4c3fed87530db14f42403165ba476a53da89a1350555a98e15b8e5d2749
  • MD5: 78a7c5192f3df3d1b2a70ad13c4bf98d
  • BLAKE2b-256: 1cc757212cbcfc63e1a9f5d507cee769badfb004e2b657d7fa14102cc6e64214


Provenance

The following attestation bundles were made for prompt_spec-0.1.1.tar.gz:

Publisher: python-publish.yml on AustralianCancerDataNetwork/prompt-spec

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file prompt_spec-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: prompt_spec-0.1.1-py3-none-any.whl
  • Size: 13.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for prompt_spec-0.1.1-py3-none-any.whl:

  • SHA256: 5312646da8f16f33fc1f639d22933371c7696cc02f978c919bb74cd7ce2d37b2
  • MD5: edbcf4cf67b3436a72ecd694177d6b88
  • BLAKE2b-256: 87a38bc36e1bf7a99e2a83a5d7ee80c25a6ff6e85fc20312ca8b887f5f192793


Provenance

The following attestation bundles were made for prompt_spec-0.1.1-py3-none-any.whl:

Publisher: python-publish.yml on AustralianCancerDataNetwork/prompt-spec

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
