
DealerTower Python Framework: reusable building blocks for DealerTower services


DealerTower Python Framework (dtpyfw)


DealerTower Python Framework (dtpyfw) is a comprehensive, production-ready framework providing reusable building blocks for microservices. It offers modular sub-packages for API development, database orchestration, caching, messaging, storage, task scheduling, and more—all with full type safety and consistent interfaces.

This library follows Python packaging standards including PEP 561 for type checking support, ensuring excellent IDE integration and compile-time type validation.


🚀 Installation

Requires Python 3.11 or newer.

Base Installation

pip install dtpyfw

The base installation includes:

  • Core utilities: Environment management, async bridging, validation, hashing, chunking, retry logic, and more
  • Logging system: Structured logging with multiple handlers and formatters

Development Installation

Using Poetry (recommended for contributors):

poetry install -E all

Querying Package Version

import dtpyfw
print(dtpyfw.__version__)  # e.g., "0.6.27"

Optional Extras

Install specific features as needed. Extras can be combined, e.g., pip install dtpyfw[api,db,redis]. In shells that expand square brackets (such as zsh), quote the requirement: pip install "dtpyfw[api,db,redis]".

  • api: FastAPI application framework (application wrapper, middleware, CORS, exception handling, routing). Install: pip install dtpyfw[api]
  • db: SQLAlchemy database orchestration (sync/async engines, connection pooling, health checks, search utilities, PostgreSQL support). Install: pip install dtpyfw[db]
  • db-mysql: MySQL database support (MySQL-specific drivers: PyMySQL, aiomysql). Install: pip install dtpyfw[db-mysql]
  • bucket: S3-compatible object storage (upload, download, duplicate, delete objects; MinIO & AWS S3 support). Install: pip install dtpyfw[bucket]
  • redis: Redis client & caching (connection management, health checks, function memoization, data compression). Install: pip install dtpyfw[redis]
  • redis_streamer: Redis Streams messaging (producer/consumer with sync & async support). Install: pip install dtpyfw[redis_streamer]
  • kafka: Kafka messaging (producer/consumer wrappers with error handling and logging). Install: pip install dtpyfw[kafka]
  • opensearch: OpenSearch/Elasticsearch integration (client wrapper, configuration management, health monitoring, cluster operations). Install: pip install dtpyfw[opensearch]
  • worker: Celery task management (task registry, periodic scheduling, worker configuration, Redis backend). Install: pip install dtpyfw[worker]
  • ftp: FTP/SFTP client (unified interface for FTP & SFTP operations with context manager support). Install: pip install dtpyfw[ftp]
  • encrypt: Cryptography utilities (JWT encryption/decryption; password hashing with bcrypt and argon2). Install: pip install dtpyfw[encrypt]
  • all: every feature above, the complete framework installation. Install: pip install dtpyfw[all]

Common Installation Profiles

# Typical API microservice (FastAPI + Database + Redis)
pip install dtpyfw[api,db,redis]

# Background worker service (Celery + Database + Redis)
pip install dtpyfw[worker,db,redis]

# Data processing service (S3 + Database + FTP)
pip install dtpyfw[bucket,db,ftp]

# Search-enabled service (API + Database + OpenSearch + Redis)
pip install dtpyfw[api,db,opensearch,redis]

# Full-featured microservice
pip install dtpyfw[all]

📚 Documentation

Available on GitHub

Comprehensive documentation is maintained in the GitHub repository:

  • how-to-use.md: AI-friendly quick reference guide with all import paths, configuration patterns, and usage examples
  • docs/: Detailed module documentation for every component

Accessing Documentation:

# Clone or view the repository
git clone https://github.com/datgate/dtpyfw.git

# Browse documentation
cd dtpyfw/docs

💡 Pro Tip for AI Assistants: Copy how-to-use.md from the repository to your project root so AI coding assistants can reference it directly. This guide contains all import paths, configuration patterns, and usage examples for optimal AI assistance.

Using help() for In-Code Documentation:

All modules have comprehensive docstrings accessible via Python's built-in help() function:

from dtpyfw.db import DatabaseConfig
help(DatabaseConfig)  # View complete documentation

from dtpyfw.redis import RedisInstance
help(RedisInstance)  # View Redis documentation

📦 Feature Overview

Core Utilities (Included in Base)

The dtpyfw.core module provides foundational utilities used across all other modules:

  • Async Bridge (async.py): Execute async functions from sync contexts
  • Chunking (chunking.py): Split iterables and files into manageable chunks
  • Environment (env.py): Type-safe environment variable access with validation
  • Exceptions (exception.py): Structured exception handling and serialization
  • File/Folder (file_folder.py): File system operations and path utilities
  • Hashing (hashing.py): Consistent hash generation for data structures
  • JSON Encoding (jsonable_encoder.py): Serialize complex Python objects to JSON
  • Request Utilities (request.py): HTTP request helpers and decorators
  • Retry Logic (retry.py): Configurable retry mechanisms with exponential backoff
  • Safe Access (safe_access.py): Safely access nested data structures
  • Singleton (singleton.py): Thread-safe singleton pattern implementation
  • Slug Generation (slug.py): Create URL-safe slugs from strings
  • URL Utilities (url.py): URL parsing and manipulation helpers
  • Validation (validation.py): Common validation functions for data integrity

📖 View Core Documentation

Logging System (Included in Base)

The dtpyfw.log module provides production-ready structured logging:

  • Centralized Configuration: Configure all logging through LogConfig class
  • Multiple Handlers: Console output, file rotation, API logging, Kafka streaming
  • Custom Formatters: JSON formatting, colored console output, structured data
  • Celery Integration: Specialized logging for Celery worker contexts
  • Request Footprinting: Track requests across distributed services
  • Performance Monitoring: Built-in timing and resource usage tracking
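The structured-JSON formatting idea can be sketched with the standard library alone; LogConfig and the real handler/formatter names belong to dtpyfw.log, so this is only an illustration of the pattern:

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON object per line."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("myservice")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("service started")  # emits {"level": "INFO", "logger": "myservice", "message": "service started"}
```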

📖 View Logging Documentation

API Development (dtpyfw.api)

Build production-ready FastAPI applications with pre-configured best practices:

  • Application Wrapper: Clean OOP interface for FastAPI configuration
  • Middleware Stack: Timer middleware, error handling, custom middleware support
  • CORS Management: Flexible CORS configuration with sensible defaults
  • Exception Handling: Standardized HTTP and validation error responses
  • Router Organization: Modular router registration with prefix support
  • Sub-Applications: Mount multiple FastAPI apps as microservice modules
  • Session Management: Optional session middleware integration
  • Compression: Automatic gzip compression for responses
  • OpenAPI Integration: Automatic Swagger UI and ReDoc documentation

📖 View API Documentation

Database Management (dtpyfw.db)

Comprehensive SQLAlchemy integration with sync/async support:

  • Connection Orchestration: Automatic engine and session management
  • Sync & Async Support: Seamless switching between synchronous and asynchronous operations
  • Read/Write Splitting: Separate connections for read and write operations
  • Connection Pooling: Configurable connection pools with health monitoring
  • Health Checks: Built-in database health check endpoints
  • Search Utilities: Advanced query builders for filtering, sorting, and pagination
  • SSL/TLS Support: Secure database connections with certificate validation
  • Context Managers: Safe session handling with automatic cleanup
  • FastAPI Integration: Dependency injection patterns for FastAPI routes

📖 View Database Documentation

S3-Compatible Storage (dtpyfw.bucket)

Simple interface for S3-compatible object storage:

  • Unified API: Works with AWS S3, MinIO, DigitalOcean Spaces, and other S3-compatible services
  • File Operations: Upload, download, duplicate, delete objects
  • Metadata Management: Get object info, check existence, list buckets
  • Stream Support: Handle large files with streaming uploads/downloads
  • Error Handling: Comprehensive error handling with detailed logging
  • Flexible Configuration: Support for custom endpoints and credentials

📖 View Bucket Documentation

Redis Integration (dtpyfw.redis)

High-performance Redis client with caching utilities:

  • Connection Management: Thread-safe Redis connection pools
  • Health Monitoring: Redis health checks for readiness probes
  • Function Caching: Automatic memoization with decorator pattern
  • Output Monitoring: Watch function outputs for changes to trigger webhooks
  • Data Compression: zlib compression to minimize memory usage
  • Conditional Caching: Cache based on specific argument conditions
  • TTL Support: Configurable expiration for cached values
  • Sync & Async: Support for both synchronous and asynchronous operations
  • Type Safety: Full type annotations for IDE integration

📖 View Redis Documentation

Kafka Messaging (dtpyfw.kafka)

Simplified Kafka producer and consumer wrappers:

  • Producer: High-level message production with automatic JSON encoding
  • Consumer: Simplified message consumption with error handling
  • Configuration: Clean configuration interface with connection management
  • Logging Integration: Built-in logging for all Kafka operations
  • Error Handling: Graceful error handling with retry support
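The "automatic JSON encoding" idea can be shown without a running broker. The sketch below injects a fake transport in place of a real Kafka client; the class and method names are hypothetical, not dtpyfw.kafka's actual API:

```python
import json
from typing import Any, Callable

class JsonProducer:
    """Wrap a raw send callable, serializing each payload to JSON bytes."""
    def __init__(self, send: Callable[[str, bytes], None]):
        self._send = send

    def produce(self, topic: str, message: Any) -> None:
        try:
            payload = json.dumps(message).encode("utf-8")
        except TypeError as exc:
            # surface serialization problems with context instead of failing deep in the client
            raise ValueError(f"message not JSON-serializable: {exc}") from exc
        self._send(topic, payload)

# Fake transport: just record what would have been sent to Kafka.
sent = []
producer = JsonProducer(lambda topic, data: sent.append((topic, data)))
producer.produce("orders", {"id": 42, "status": "new"})
print(sent[0][0])  # "orders"
```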

📖 View Kafka Documentation

OpenSearch Integration (dtpyfw.opensearch)

OpenSearch and Elasticsearch client wrapper with configuration management:

  • Client Wrapper: Managed OpenSearch client with automatic connection testing
  • Builder Configuration: Fluent interface for connection, authentication, and SSL settings
  • Health Monitoring: Built-in health checks and cluster connectivity verification
  • Authentication Support: Username/password and mutual TLS authentication
  • SSL Configuration: Flexible SSL/TLS settings with custom certificate support
  • Connection Pooling: Automatic connection management and retry logic
  • Cluster Operations: Access to full opensearch-py client capabilities
  • Error Handling: Comprehensive error handling with structured logging

📖 View OpenSearch Documentation

Celery Workers (dtpyfw.worker)

Streamlined Celery task management and scheduling:

  • Task Registry: Centralized task registration and routing
  • Queue Management: Flexible queue assignment for task distribution
  • Periodic Scheduling: Cron-style and interval-based task scheduling
  • Worker Builder: Simple worker configuration and initialization
  • Redis Backend: Integrated Redis support for result backend and broker
  • Beat Integration: RedBeat scheduler for dynamic schedule management

📖 View Worker Documentation

FTP/SFTP Client (dtpyfw.ftp)

Unified interface for FTP and SFTP operations:

  • Protocol Abstraction: Single API for both FTP and SFTP
  • Context Manager: Automatic connection management and cleanup
  • File Operations: Upload, download, list, delete, rename files
  • Directory Management: Create, remove, and navigate directories
  • Auto-Detection: Automatic protocol detection based on port
  • Timeout Control: Configurable connection timeouts
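The port-based auto-detection bullet can be sketched as below, using the conventional defaults (SFTP on 22, FTP on 21); the function name is illustrative and the real dtpyfw.ftp interface may differ:

```python
def detect_protocol(port: int) -> str:
    """Guess the transfer protocol from the conventional port number."""
    if port == 22:
        return "sftp"  # SSH File Transfer Protocol rides on the SSH port
    if port == 21:
        return "ftp"   # classic FTP control channel
    raise ValueError(f"cannot infer protocol from port {port}; specify it explicitly")

print(detect_protocol(22))  # "sftp"
print(detect_protocol(21))  # "ftp"
```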

📖 View FTP Documentation

Encryption & Security (dtpyfw.encrypt)

Authentication and cryptography utilities:

  • JWT Support: Create and validate JSON Web Tokens
  • Multiple Algorithms: HS256, HS384, HS512, RS256, RS384, RS512
  • Password Hashing: bcrypt and argon2 password hashing
  • Token Expiration: Configurable TTL for JWT tokens
  • Custom Claims: Support for custom JWT payload data
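To make the JWT bullets concrete, here is a stdlib-only sketch of the HS256 sign/verify round trip that JWT libraries wrap; it omits expiry handling and is not dtpyfw.encrypt's actual API:

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    """Base64url-encode without padding, per the JWT spec."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_hs256(payload: dict, secret: bytes) -> str:
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_hs256(token: str, secret: bytes) -> dict:
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):  # constant-time comparison
        raise ValueError("invalid signature")
    padded = body + "=" * (-len(body) % 4)  # restore base64 padding before decoding
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_hs256({"sub": "user-1"}, b"secret")
print(verify_hs256(token, b"secret"))  # {'sub': 'user-1'}
```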

📖 View Encryption Documentation


🎯 Quick Start Examples

Building a FastAPI Application

from dtpyfw.api import Application
from dtpyfw.api.routes import Router

# Create routers
router = Router()

@router.get("/health")
async def health_check():
    return {"status": "healthy"}

# Initialize application
app = Application(
    title="My Microservice",
    version="1.0.0",
    routers=[router],
    cors_settings={"allow_origins": ["*"]}
)

# Access the FastAPI app
fastapi_app = app.app

Database Operations

from dtpyfw.db import DatabaseConfig, DatabaseInstance

# Configure database
config = (
    DatabaseConfig()
    .set_db_backend("postgresql")
    .set_db_host("localhost")
    .set_db_port(5432)
    .set_db_name("mydb")
    .set_db_user("user")
    .set_db_password("password")
)

# Create instance
db = DatabaseInstance(config)

# Use context manager for sessions
# (SQLAlchemy 2.0 requires raw SQL to be wrapped in text())
from sqlalchemy import text

with db.get_session() as session:
    results = session.execute(text("SELECT * FROM users")).fetchall()

# Async support
async with db.get_async_session() as session:
    result = await session.execute(text("SELECT * FROM users"))

Redis Caching

from dtpyfw.redis.caching import cache_function
from dtpyfw.redis.connection import RedisInstance

# Initialize Redis
redis = RedisInstance(host="localhost", port=6379)

# Cache function results
@cache_function(redis_client=redis.client, expire_time=3600)
def expensive_computation(x: int, y: int) -> int:
    return x ** y

result = expensive_computation(2, 10)  # Computed and cached
result = expensive_computation(2, 10)  # Retrieved from cache

OpenSearch Operations

from dtpyfw.opensearch import OpenSearchConfig, OpenSearchClient

# Configure connection
config = (
    OpenSearchConfig()
    .set_url("https://localhost:9200")
    .set_username("admin")
    .set_password("admin")
    .set_verify_certs(False)
    .set_timeout(30)
)

# Create client
client = OpenSearchClient(config)

# Check health
if client.is_healthy():
    print("OpenSearch cluster is responsive")

# Get cluster info
info = client.get_cluster_info()
print(f"Cluster: {info['cluster_name']}")

# Use underlying client for search operations
opensearch = client.get_client()
result = opensearch.search(
    index="my-index", 
    body={"query": {"match_all": {}}}
)

Celery Task Management

from dtpyfw.worker import Task, Worker
from dtpyfw.redis import RedisInstance

# Register tasks
Task.register("myapp.tasks.process_data", queue="high_priority")
Task.register("myapp.tasks.send_email", queue="low_priority")

# Schedule periodic tasks
Task.schedule_periodic(
    "myapp.tasks.cleanup",
    schedule="cron",
    hour="0",
    minute="0"
)

# Build worker
redis = RedisInstance(host="localhost", port=6379)
worker = Worker.build(task=Task, redis=redis, app_name="my_worker")

S3 File Operations

from dtpyfw.bucket import Bucket

# Initialize bucket
bucket = Bucket(
    name="my-bucket",
    endpoint_url="https://s3.amazonaws.com",
    access_key="ACCESS_KEY",
    secret_key="SECRET_KEY"
)

# Upload file
bucket.upload_file("local_file.txt", "remote/path/file.txt")

# Download file
bucket.download_file("remote/path/file.txt", "downloaded_file.txt")

# Check existence
exists = bucket.file_exists("remote/path/file.txt")

📚 Module Documentation Reference

Comprehensive documentation for each module is available in the docs/ directory:

  • Core Utilities - Foundational utilities and helpers
  • Logging - Structured logging configuration
  • API - FastAPI application development
  • Database - SQLAlchemy integration and search utilities
  • Bucket - S3-compatible object storage
  • Redis - Redis caching and connection management
  • Redis Streamer - Redis Streams messaging
  • Kafka - Kafka producer and consumer
  • Worker - Celery task management
  • FTP - FTP/SFTP client operations
  • Encryption - JWT and password hashing

🤝 Contributing

We welcome contributions from authorized DealerTower employees and contractors! Please see CONTRIBUTING.md for guidelines on:

  • Development setup and environment configuration
  • Coding standards and style guide (PEP 8, Black formatting)
  • Type annotations and docstring conventions
  • Testing requirements and running tests
  • Pull request process and code review
  • Documentation standards

Development Workflow

# Install dependencies
poetry install -E all

# Run tests
pytest

# Format code
black .

# Type check
mypy dtpyfw

# Run linters
ruff check . --fix

📝 Version History

Current version: 0.6.27

A detailed changelog is maintained in the git commit history. View releases on the GitHub repository.


📄 License

DealerTower Python Framework is proprietary software. See LICENSE for complete terms and conditions.


🔗 Resources

  • Repository: github.com/datgate/dtpyfw
  • Issue Tracker: Report bugs and request features via GitHub Issues
  • Internal Documentation: Additional documentation available on DealerTower's internal wiki

💡 Philosophy

dtpyfw is designed with the following principles:

  • Modularity: Install only what you need
  • Type Safety: Full type annotations for better IDE support and fewer runtime errors
  • Production Ready: Battle-tested patterns and best practices
  • Developer Experience: Clean APIs with consistent interfaces
  • Documentation: Comprehensive docs for every module
  • Standards Compliance: Follows PEP 8, PEP 561, and Python packaging standards
