DTL (Domain Transport Language) Parser - A type-safe data format with enum validation, autofix, and multi-format export
# 🧬 DTL Parser

**Domain Transport Language SDK for Python**
🇦🇪 Product of Dubai, Made in Emirates
## ✨ Features

- **Full DTL Parsing** - Parse DTL files and strings with complete type support
- **Enum Validation** - Exact-match validation with up to 10 values per field
- **Autofix / Magic Corrections** - Automatically fix common errors
- **Fuzzy Matching** - Smart typo suggestions using Levenshtein distance
- **JSON/CSV Export** - Convert tables to JSON or CSV format
- **Programmatic Creation** - Create tables and documents in code
- **Type-Safe** - Full type hints for IDE support
## 📦 Installation

```bash
pip install dtl-parser
```
## 🚀 Quick Start

### Parse a DTL File

```python
from dtl_parser import DTLParser

parser = DTLParser()
doc = parser.parse_file("data.dtl")

# Access tables
for table in doc.tables:
    print(f"Table: {table.name}, Rows: {len(table.rows)}")
    for row in table.rows:
        print(row)
```
### Parse a DTL String

```python
from dtl_parser import DTLParser

dtl_content = """
@dtlv1.0^dtWEB^pMyApp^c0^s0^w0^hash
@sec^none^0x0^none^0
USERS|id:s,name:s,role:e(admin,user,guest),status:e(active,inactive)|2|S0|W0|C0
U001|Alice|admin|active
U002|Bob|user|active
"""

parser = DTLParser()
doc = parser.parse(dtl_content)

users = doc.get_table("USERS")
for row in users.rows:
    print(f"{row['name']} is a {row['role']}")
```
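For reference, rows in the format above are pipe-delimited in the order declared by the table header. A minimal standalone sketch of that split (illustrative only; the real parser also handles types, validation, and escaping, and the helper name here is hypothetical):

```python
import re

def parse_row(schema_spec, row_line):
    # Split the schema on commas, but not on commas inside e(...) enum lists.
    fields = re.split(r",(?![^()]*\))", schema_spec)
    names = [f.split(":", 1)[0] for f in fields]
    # Row values are pipe-delimited in header order.
    return dict(zip(names, row_line.split("|")))

row = parse_row(
    "id:s,name:s,role:e(admin,user,guest),status:e(active,inactive)",
    "U001|Alice|admin|active",
)
# row == {"id": "U001", "name": "Alice", "role": "admin", "status": "active"}
```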
## ✅ Validation

```python
from dtl_parser import DTLParser

parser = DTLParser()
doc = parser.parse_file("data.dtl")

# Validate and get errors
errors = doc.validate()
for error in errors:
    print(f"[{error.severity.value}] Line {error.line}: {error.message}")
    if error.suggestion:
        print(f"  Suggestion: {error.suggestion}")
```
## 🔧 Autofix

Automatically fix common errors like typos, format issues, and row count mismatches:

```python
from dtl_parser import DTLParser

dtl_with_errors = """
@dtlv1.0^dtWEB^pDemo^c0^s0^w0^hash
@sec^none^0x0^none^0
users|id:s,status:e(active,inactive,pending)|2|S0|W0|C0
U001|actve
U002|pendng
U003|inactive
"""

parser = DTLParser()
doc = parser.parse(dtl_with_errors)

# Apply autofix
fixed_doc, changes = doc.autofix()
for change in changes:
    print(f"✨ {change}")

# Output:
# ✨ Fixed table name case: users → USERS
# ✨ Fixed row count in USERS: 2 → 3
# ✨ Fixed USERS[0].status: 'actve' → 'active'
# ✨ Fixed USERS[1].status: 'pendng' → 'pending'
```
### What Autofix Corrects

| Issue | Before | After |
|---|---|---|
| Row count mismatch | row count `2` (actual rows: 3) | row count `3` |
| Table name case | `users` | `USERS` |
| Boolean values | `true`, `yes`, `on` | `1` |
| Boolean values | `false`, `no`, `off` | `0` |
| Enum typos | `pendng` | `pending` |
| Date format (US) | `01/15/2025` | `2025-01-15` |
| Date format (EU) | `15.01.2025` | `2025-01-15` |
| Timestamp format | `2025-01-15 10:30:00` | `2025-01-15T10:30:00Z` |
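The boolean and date normalizations in the table can be sketched as standalone helpers. This is an illustration of the rules above, not the library's actual implementation:

```python
import re

def fix_bool(value):
    # true/yes/on → "1", false/no/off → "0" (per the table above)
    v = value.strip().lower()
    if v in ("true", "yes", "on"):
        return "1"
    if v in ("false", "no", "off"):
        return "0"
    return value

def fix_date(value):
    # US MM/DD/YYYY and EU DD.MM.YYYY → ISO YYYY-MM-DD
    us = re.fullmatch(r"(\d{2})/(\d{2})/(\d{4})", value)
    if us:
        return f"{us.group(3)}-{us.group(1)}-{us.group(2)}"
    eu = re.fullmatch(r"(\d{2})\.(\d{2})\.(\d{4})", value)
    if eu:
        return f"{eu.group(3)}-{eu.group(2)}-{eu.group(1)}"
    return value

print(fix_bool("yes"))         # "1"
print(fix_date("01/15/2025"))  # "2025-01-15"
print(fix_date("15.01.2025"))  # "2025-01-15"
```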
## 🏭 Create Tables Programmatically

```python
from dtl_parser import create_table, create_document

# Create a table with enum fields
orders = create_table("ORDERS", {
    "id": "s",
    "customer": "s",
    "status": "e(pending,confirmed,shipped,delivered)",
    "priority": "e(low,medium,high,urgent)",
    "total": "f",
    "created": "D"
})

# Add rows
orders.add_row({
    "id": "ORD-001",
    "customer": "Alice",
    "status": "pending",
    "priority": "high",
    "total": 99.99,
    "created": "2025-01-15"
})
orders.add_row({
    "id": "ORD-002",
    "customer": "Bob",
    "status": "shipped",
    "priority": "medium",
    "total": 149.50,
    "created": "2025-01-14"
})

# Create document and add table
doc = create_document(domain="dtWEB")
doc.add_table(orders)

# Export to DTL
print(doc.to_dtl())
```
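The table-header line in the exported DTL follows the layout seen in the earlier string examples (`NAME|fields|row_count|S0|W0|C0`). A hypothetical sketch of that serialization, independent of the library:

```python
def table_header(name, schema, row_count,
                 security="S0", web3="W0", compression="C0"):
    # Mirrors the header layout from the examples above,
    # e.g. "USERS|id:s,name:s|2|S0|W0|C0"; not the library's serializer.
    fields = ",".join(f"{field}:{dtype}" for field, dtype in schema.items())
    return f"{name}|{fields}|{row_count}|{security}|{web3}|{compression}"

print(table_header("ORDERS", {"id": "s", "total": "f"}, 2))
# ORDERS|id:s,total:f|2|S0|W0|C0
```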
## 📤 Export Formats

### To JSON

```python
import json

# Table to JSON
json_data = table.to_json()
print(json.dumps(json_data, indent=2))

# Document to JSON
doc_json = doc.to_json()
```

### To CSV

```python
csv_data = table.to_csv()
print(csv_data)

# With a custom delimiter
tsv_data = table.to_csv(delimiter="\t")
```

### Back to DTL

```python
dtl_string = doc.to_dtl()
print(dtl_string)
```
## 📊 DTL Data Types

| Type | Code | Description | Example |
|---|---|---|---|
| String | `s` | Text value | `Hello World` |
| Integer | `i` | Whole number | `42` |
| Float | `f` | Decimal number | `3.14` |
| Boolean | `b` | `0` or `1` | `1` |
| Date | `D` | YYYY-MM-DD | `2025-01-15` |
| Timestamp | `T` | ISO 8601 | `2025-01-15T10:30:00Z` |
| UUID | `u` | Unique ID | `550e8400-e29b-...` |
| JSON | `j` | Embedded JSON | `{"key":"value"}` |
| Array | `a(x)` | List of type x | `a,b,c` |
| Enum | `e(...)` | Exact match | `e(low,medium,high)` |
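To make the table concrete, here is a hypothetical coercion helper mapping type codes to Python values; the parser's real behavior (for example for `j` and `T`) may differ:

```python
def coerce(value, type_code):
    # Illustrative mapping from DTL type codes (table above) to Python values.
    if type_code == "i":
        return int(value)
    if type_code == "f":
        return float(value)
    if type_code == "b":
        return value == "1"  # booleans are stored as 0/1
    if type_code.startswith("a(") and type_code.endswith(")"):
        inner = type_code[2:-1]
        return [coerce(item, inner) for item in value.split(",")]
    # s, D, T, u, j, and e(...) stay as strings in this sketch
    return value

print(coerce("42", "i"))        # 42
print(coerce("1", "b"))         # True
print(coerce("1,2,3", "a(i)"))  # [1, 2, 3]
```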
## 🔍 Fuzzy Matching

The `find_closest_match` function uses Levenshtein distance to suggest corrections:

```python
from dtl_parser import find_closest_match

options = ["pending", "processing", "shipped", "delivered"]

# Find closest match
suggestion = find_closest_match("pendng", options)
print(suggestion)  # "pending"

suggestion = find_closest_match("shiped", options)
print(suggestion)  # "shipped"
```
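Levenshtein distance counts the minimum number of single-character insertions, deletions, and substitutions needed to turn one string into another. A compact standalone version of the idea (not the library's internals):

```python
def levenshtein(a: str, b: str) -> int:
    # Dynamic programming over one rolling row of the edit-distance matrix.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(
                prev[j] + 1,               # deletion
                cur[j - 1] + 1,            # insertion
                prev[j - 1] + (ca != cb),  # substitution (free on match)
            ))
        prev = cur
    return prev[-1]

def closest(word, options):
    # Pick the option with the smallest edit distance to the input.
    return min(options, key=lambda option: levenshtein(word, option))

print(levenshtein("pendng", "pending"))  # 1
print(closest("shiped", ["pending", "processing", "shipped", "delivered"]))  # shipped
```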
## 🏥 Domain Examples

### Healthcare (dtHC)

```python
patients = create_table("PATIENTS", {
    "mrn": "s",
    "name": "s",
    "gender": "e(M,F,O)",
    "blood_type": "e(A+,A-,B+,B-,AB+,AB-,O+,O-)",
    "status": "e(active,discharged,deceased)"
}, security="S2")  # HIPAA compliant
```

### Finance (dtFN)

```python
transactions = create_table("TRANSACTIONS", {
    "id": "u",
    "type": "e(deposit,withdrawal,transfer)",
    "currency": "e(USD,EUR,GBP,AED)",
    "amount": "f",
    "status": "e(pending,completed,failed)"
}, web3="W1")  # With blockchain signature
```
## 📜 License

MIT License - see the LICENSE file.
## 👤 Author

**Padam Sundar Kafle**
Lead Architect, DTL & AlifZetta
## 🔗 Links
- Website: dtlaz.org
- Documentation: dtlaz.org/docs
- GitHub: github.com/AlifZetta/dtl-parser-python
🇦🇪 Product of Dubai, Made in Emirates
Born in Dubai, Built for the World
### File details: dtl_parser-1.3.1.tar.gz

- Size: 19.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.2

| Algorithm | Hash digest |
|---|---|
| SHA256 | `d64b7afdec5cf8e4a19aece02e234fa929ef834da730271e68a2e25665224afc` |
| MD5 | `7ed3042bc2b7b81ce908ca6c7964cc74` |
| BLAKE2b-256 | `63039bced27dce66572ab33c3888bbfcee9e25d9d35bf4170a2eae874a58fc8c` |
### File details: dtl_parser-1.3.1-py3-none-any.whl

- Size: 14.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.2

| Algorithm | Hash digest |
|---|---|
| SHA256 | `34129cad84383f22118c19de52e676e9b07e8e9e2bd0b5a95026b3ec79f9c57b` |
| MD5 | `66f25ea43ad12a5abdaecba19e55f73b` |
| BLAKE2b-256 | `5f7105f0ecd7e3bf18f445a58cbf04a57b40d3e2078868720e0f7ce6a5ebc553` |