A lightweight markup language for structured text generation


🚀 LLML - Lightweight Language Markup Language

The most elegant way to generate structured text in Python.

LLML transforms your data into beautifully formatted XML-like markup with zero fuss and maximum flexibility. Perfect for prompt engineering, configuration generation, and structured document creation.

⚡ Quick Start

from zenbase_llml import llml

# Simple values
print(llml(greeting="Hello World"))
# Output: <greeting>Hello World</greeting>

# Lists become numbered items
print(llml(tasks=["Buy milk", "Walk dog", "Code LLML"]))
# Output:
# <tasks>
#   <tasks-1>Buy milk</tasks-1>
#   <tasks-2>Walk dog</tasks-2>
#   <tasks-3>Code LLML</tasks-3>
# </tasks>

# Complex nested structures
print(llml(
    title="My Project",
    features=["Fast", "Simple", "Powerful"],
    config={"debug": True, "version": "1.0"}
))
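To build intuition for the output, here is a rough mental model of the formatting rules shown above (keys become tags, lists become numbered items, dicts nest their entries). This is an illustrative sketch only, not zenbase_llml's actual implementation:

```python
# Rough mental model of LLML's formatting rules; illustrative sketch,
# NOT the zenbase_llml implementation.
def to_markup(key, value, indent=""):
    if isinstance(value, list):
        # Lists become numbered <key-N> items inside a <key> wrapper.
        items = "\n".join(
            f"{indent}  <{key}-{i}>{item}</{key}-{i}>"
            for i, item in enumerate(value, start=1)
        )
        return f"{indent}<{key}>\n{items}\n{indent}</{key}>"
    if isinstance(value, dict):
        # Dicts nest one tag per entry (unprefixed, as in the default mode).
        entries = "\n".join(
            to_markup(k, v, indent + "  ") for k, v in value.items()
        )
        return f"{indent}<{key}>\n{entries}\n{indent}</{key}>"
    # Scalars render inline.
    return f"{indent}<{key}>{value}</{key}>"

print(to_markup("features", ["Fast", "Simple", "Powerful"]))
```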

🎯 Why LLML?

  • 🔥 Zero Learning Curve: One function, infinite possibilities
  • 🎨 Beautiful Output: Automatically formatted, properly indented
  • 🔧 Type Safe: Built with beartype for runtime type checking
  • ⚡ Lightning Fast: Minimal overhead, maximum performance
  • 🌟 Pythonic: Feels natural, works everywhere
  • ⚙️ Strict Mode: Control nested property prefixes with strict parameter

🛠️ Installation

pip install zenbase-llml

📚 Advanced Usage

Prefix Support

# Add prefix to all keys
print(llml(message="Hello", prefix="app"))
# Output: <app-message>Hello</app-message>
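The prefix behavior above can be sketched as a simple key transformation: each top-level key gets "<prefix>-" prepended. This is an illustration only, not the library's code:

```python
# Sketch of the prefix behavior: each top-level key is rendered as
# <prefix-key>. Illustrative only, not zenbase_llml's implementation.
def prefixed(prefix, **kwargs):
    return "\n".join(
        f"<{prefix}-{key}>{value}</{prefix}-{key}>"
        for key, value in kwargs.items()
    )

print(prefixed("app", message="Hello"))
# -> <app-message>Hello</app-message>
```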

Multi-line Content

instructions = """
Step 1: Install LLML
Step 2: Import llml
Step 3: Create magic
"""

print(llml(instructions=instructions))
# Output:
# <instructions>
# Step 1: Install LLML
# Step 2: Import llml
# Step 3: Create magic
# </instructions>

Complex Nested Structures

prompt_data = {
    "system": "You are a helpful assistant",
    "rules": [
        "Be concise and clear",
        "Provide examples when helpful",
        "Ask clarifying questions"
    ],
    "context": {
        "user_level": "beginner",
        "topic": "Python programming"
    }
}

print(llml(**prompt_data))

# Example with strict mode enabled
print(llml(config={"debug": True, "timeout": 30}, strict=True))
# Output:
# <config>
#   <config-debug>True</config-debug>
#   <config-timeout>30</config-timeout>
# </config>

# Example with strict mode disabled (default)
print(llml(config={"debug": True, "timeout": 30}, strict=False))
# Output:
# <config>
#   <debug>True</debug>
#   <timeout>30</timeout>
# </config>
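The strict-mode difference boils down to one question: do nested dict keys carry the parent key as a prefix? A minimal sketch of just that rule (illustrative, not zenbase_llml's code):

```python
# Sketch of the strict-mode difference only: strict=True prefixes nested
# dict keys with the parent key. Illustrative, not the library's code.
def nested_tags(parent, data, strict=False):
    tags = []
    for key, value in data.items():
        name = f"{parent}-{key}" if strict else key
        tags.append(f"<{name}>{value}</{name}>")
    return tags

print(nested_tags("config", {"debug": True}, strict=True))
print(nested_tags("config", {"debug": True}, strict=False))
```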

🎪 Use Cases

🤖 AI Prompt Engineering

Perfect for structuring complex prompts:

prompt = llml(
    role="Senior Python Developer",
    task="Code review the following function",
    criteria=["Performance", "Readability", "Best practices"],
    code=function_to_review
)

⚙️ Configuration Generation

Generate clean config files:

config = llml(
    database={"host": "localhost", "port": 5432},
    features=["logging", "caching", "monitoring"],
    environment="production"
)

📄 Document Structure

Create structured documents:

document = llml(
    title="API Documentation",
    sections=["Authentication", "Endpoints", "Examples"],
    metadata={"version": "2.1", "author": "Dev Team"}
)

🧪 Testing

Run the comprehensive test suite:

# Run tests
uv run pytest

# Run with coverage
uv run pytest --cov=src --cov-report=html

# Test across Python versions
tox

🌐 Python Compatibility

LLML supports Python 3.8+ and is tested against:

  • ✅ Python 3.8
  • ✅ Python 3.9
  • ✅ Python 3.10
  • ✅ Python 3.11
  • ✅ Python 3.12
  • ✅ Python 3.13

🏗️ Development

# Clone the repo
git clone https://github.com/yourusername/llml.git
cd llml/py

# Create virtual environment and install dependencies
uv venv
source .venv/bin/activate
uv pip install -e ".[dev]"

# Run linting and formatting
uv run ruff check .
uv run ruff format .

# Run tests
uv run pytest

🤝 Contributing

We love contributions! Whether it's:

  • 🐛 Bug reports
  • 💡 Feature requests
  • 📝 Documentation improvements
  • 🔧 Code contributions

Check out our contribution guidelines to get started.

📄 License

MIT License - see LICENSE file for details.

🌟 Star History

If LLML makes your life easier, give us a star! ⭐


Made with ❤️ for the Python community

LLML: Because beautiful markup shouldn't be hard.
