Word Loom
A convention for expressing language text and templates for AI language model-related uses, especially prompt templates. The format is based on TOML, and word looms are meant to be kept in resource directories for use with code invoking LLMs.
Why Word Loom?
When working with LLMs, we've found ourselves needing better ways to manage prompts. Traditional code doesn't quite fit—prompts are natural language, not code. But they're also not just static text—they need templating, versioning, metadata and, crucially, internationalization.
Word Loom addresses some gaps that become clear once you start building real LLM applications:
- Separation of concerns: Keep your prompts out of your code, making them easier to iterate, version, and review
- Multilingual by design: LLM prompt engineering isn't just translation—a prompt that works well in English may need significant changes to achieve similar results in Japanese or Spanish. Word Loom lets you keep all language variants together, test them independently, and maintain metadata about their performance
- Template composition: Build complex prompts from reusable pieces, with clear markers for runtime values
- Diff-friendly: TOML's structure makes it easy to track changes in version control
- Compatible with traditional i18n: Works alongside gettext, Babel, and other localization tools, while respecting the unique needs of LLM prompting
Quick Example
# prompts.toml
lang = 'en'
[system_instruction]
_ = 'You are a helpful assistant that provides concise and accurate answers.'
[greeting_multilang]
_ = 'Hello, how can I help you today?'
_fr = "Bonjour, comment puis-je vous aider aujourd'hui?"
_es = '¡Hola! ¿Cómo puedo ayudarte hoy?'
_de = 'Hallo, wie kann ich Ihnen heute helfen?'
_ja = 'こんにちは、今日はどのようにお手伝いできますか?'
[code_review_prompt]
_ = '''
Review the following code and provide feedback on:
1. Code quality and readability
2. Potential bugs or issues
3. Suggestions for improvement
Code:
{code_snippet}
'''
_m = ['code_snippet'] # Declare template variables
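The `_m` list declares which substitution slots a template expects. Assuming the loaded text behaves like an ordinary Python string (the wordloom library may provide its own substitution helpers), filling the template can be sketched with plain `str.format`:

```python
# A minimal sketch of template substitution. The template text and the
# {code_snippet} marker come from the TOML above; str.format is used here
# purely for illustration.
template = '''Review the following code and provide feedback on:
1. Code quality and readability
2. Potential bugs or issues
3. Suggestions for improvement
Code:
{code_snippet}
'''

snippet = 'def add(a, b):\n    return a + b'
prompt = template.format(code_snippet=snippet)
print(prompt)
```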
Use it with any LLM API (OpenAI example):
from openai import OpenAI
import wordloom

# Load your prompts
with open('prompts.toml', 'rb') as fp:
    loom = wordloom.load(fp)

client = OpenAI()

# Select language based on user preference
user_lang = 'fr'
greeting = loom['greeting_multilang']
greeting_text = greeting.in_lang(user_lang) or str(greeting)

# Use with OpenAI
response = client.chat.completions.create(
    model='gpt-4',
    messages=[
        {'role': 'system', 'content': greeting_text},
        {'role': 'user', 'content': 'How does an LLM work?'}
    ]
)
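The `in_lang(...) or str(...)` pattern generalizes to a preference list: try each language a user accepts, then fall back to the default text. The sketch below assumes `in_lang()` returns a falsy value when a variant is missing, as the `or` fallback above implies; the stand-in item class is purely illustrative.

```python
# Sketch: choose the first available language variant from a preference
# list, falling back to the default ('_') text.
def pick_variant(item, preferred_langs):
    for lang in preferred_langs:
        text = item.in_lang(lang)
        if text:
            return text
    return str(item)  # default-language text


# Stand-in for a loaded loom item, for illustration only
class FakeItem:
    def __init__(self, default, variants):
        self.default = default
        self.variants = variants

    def in_lang(self, lang):
        return self.variants.get(lang)

    def __str__(self):
        return self.default


greeting = FakeItem('Hello!', {'fr': 'Bonjour !'})
print(pick_variant(greeting, ['de', 'fr']))  # falls through 'de' to 'fr'
print(pick_variant(greeting, ['ja']))        # falls back to the default
```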
Installation
uv pip install wordloom
Or without uv:
pip install wordloom
Documentation
See wordloom_spec.md for the complete specification, including:
- Detailed format description
- Template marker syntax
- Internationalization features
- More usage examples
- Integration patterns
LLM Prompting and internationalization
This is an under-considered area in AI prompting. When dealing with multiple languages, prompt engineering requires more than just translation. A prompt carefully tuned for English may perform very differently when naively translated to other languages. Word Loom helps by:
- Keeping all language variants in one place for easy comparison
- Allowing independent tuning of each language version
- Supporting metadata to track prompt performance across languages
- Enabling traditional i18n workflows while respecting LLM-specific needs
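For instance, independently tuned language variants and per-language notes can live side by side in one table. The `_note_*` keys below are illustrative only, not part of the spec:

```toml
[summarize_article]
_ = 'Summarize the following article in three bullet points.'
_ja = '以下の記事を3つの箇条書きで要約してください。'
# Hypothetical metadata keys for per-language tuning notes
_note_en = 'Tuned for concise bullet output'
_note_ja = 'Politeness level adjusted; not a direct translation'
```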
Contributing
Contributions welcome! This is an early-stage format, and we're interested in feedback from the community about what works and what doesn't in real-world usage.
License
- Code (Python library): Apache 2.0 - See LICENSE
- Specification (wordloom_spec.md): Creative Commons Attribution 4.0 International (CC BY 4.0) - See LICENSE-spec
The specification is under CC BY 4.0 to encourage broad adoption and derivative work while ensuring attribution. We want the format itself to be as open and reusable as possible, allowing anyone to create implementations in any language or adapt the format for their specific needs.
Acknowledgments
Created by Oori Data. Word Loom emerged from our work building multilingual LLM applications and finding gaps in existing prompt management approaches.
Related Work
Since we started work on Word Loom, some other projects have emerged with some degree of intersection.
- IBM's Prompt Declaration Language - A more comprehensive language for prompt engineering
- PromptL
Download files
File details
Details for the file wordloom-0.10.0.tar.gz.
File metadata
- Download URL: wordloom-0.10.0.tar.gz
- Upload date:
- Size: 25.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | dcda796a1d7553c31e3dae1e8feca5a1fbb816204a7b691110b2289a8ee73d6e |
| MD5 | dfb04630bef889779212c5442667575d |
| BLAKE2b-256 | 1e002493a3c0875ba18afb83ed3533a97957d336a057c4d8b3414ed7d18dd65b |
File details
Details for the file wordloom-0.10.0-py3-none-any.whl.
File metadata
- Download URL: wordloom-0.10.0-py3-none-any.whl
- Upload date:
- Size: 13.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 5c45a81cacb83718b25fb41b693aa6d4b097c0882c6913c0c763443007ee8b79 |
| MD5 | 54d3b9463f13870153c8a088fd9107e0 |
| BLAKE2b-256 | 9b5c5063fd07ba348ccd5070c41f0919cbc71c061011e40d459611f4f84ac400 |