pydantic-all-in-one

All the pydantic projects I want to use together.

pip install pydantic-all-in-one
Overview
pydantic-all-in-one is a unified Python framework that seamlessly integrates multiple Pydantic-based libraries to provide a comprehensive solution for building scalable, efficient, and compliant applications. Over five years of development, it has evolved into a robust ecosystem used by Fortune 10 companies to ensure compliance with the EU AI Act through the innovative concept of Service Colonies.
Key features include:
- Intelligent Model Generation with DSLModel: Create dynamic models using templates and AI-assisted code generation.
- EU AI Act Compliance: Implement features to ensure AI applications meet EU regulatory standards.
- Service Colony Architecture: Develop autonomous and cooperative services that adapt and evolve.
- FastAPI Integration: Build high-performance RESTful APIs.
- SQLModel and GQLAlchemy: Manage relational and graph databases seamlessly.
- FastStream and Redis: Implement event-driven architectures.
- aioclock for Scheduling: Schedule and manage asynchronous tasks.
- LanceDB: Handle vector data for machine learning applications.
- Typer for CLI: Create powerful command-line interfaces.
- Extensive Testing and CI/CD: Ensure code quality with comprehensive testing and continuous integration pipelines.
Table of Contents
- pydantic-all-in-one
- Overview
- Table of Contents
- Installation
- Getting Started
- Defining Intelligent Models with DSLModel
- Implementing Service Colonies for EU AI Act Compliance
- Dynamic Class Generation
- Workflow Management and Scheduling
- Event-Driven Architecture with FastStream and Redis
- Graph and Relational Data Management
- Data Handling and Vector Management
- CLI Integration with Typer
- Architecture
- Development
- Contributing
- License
- Contact
Installation
Ensure you have Python 3.12 or higher installed. Then, install pydantic-all-in-one via pip:
pip install pydantic-all-in-one
Alternatively, install from source:
git clone https://github.com/seanchatmangpt/pydantic-all-in-one.git
cd pydantic-all-in-one
poetry install
Getting Started
Defining Intelligent Models with DSLModel
Leverage DSLModel to create dynamic, intelligent models using Jinja2 templates and AI-assisted code generation.
# models.py
from typing import List

from pydantic import Field

from dslmodel import DSLModel, init_lm

# Initialize the language model for AI-assisted generation
init_lm()  # Sets the language model to 'gpt-4o-mini'


class ComplianceRequirement(DSLModel):
    """Represents a compliance requirement under the EU AI Act."""
    requirement_id: str = Field(..., description="Unique identifier for the requirement.")
    description: str = Field(..., description="Description of the compliance requirement.")
    risk_level: str = Field(..., description="Risk level associated with the requirement.")


class AIComponent(DSLModel):
    """Represents an AI component within the system."""
    component_id: str = Field(..., description="Unique identifier for the AI component.")
    functionalities: List[str] = Field(..., description="List of functionalities.")
    compliance_status: str = Field(..., description="Compliance status with the EU AI Act.")
Generate models from templates:
# generate_models.py
from models import AIComponent
component_template = """
Component ID: {{ uuid4() }}
Functionalities:
{% for func in functionalities %}
- {{ func }}
{% endfor %}
Compliance Status: Pending
"""
functionalities = ["Data Processing", "Automated Decision Making", "User Interaction"]
# Create an AIComponent instance using the template
ai_component = AIComponent.from_prompt(component_template, functionalities=functionalities)
print(ai_component.model_dump_json(indent=2))  # Pydantic v2-style JSON serialization
Implementing Service Colonies for EU AI Act Compliance
Develop a Service Colony architecture where autonomous services (inhabitants) collaborate to ensure compliance with the EU AI Act.
# service_colony.py
from enum import Enum, auto

from dslmodel import FSMMixin, trigger

from models import AIComponent


class ComplianceState(Enum):
    INIT = auto()
    ASSESSING = auto()
    COMPLYING = auto()
    MONITORING = auto()


class ComplianceInhabitant(FSMMixin):
    def __init__(self, component: AIComponent):
        super().__init__()
        self.component = component
        self.setup_fsm(state_enum=ComplianceState, initial=ComplianceState.INIT)

    @trigger(source=ComplianceState.INIT, dest=ComplianceState.ASSESSING)
    def assess_risk(self):
        print(f"Assessing risk for {self.component.component_id}")
        # Perform risk assessment...

    @trigger(source=ComplianceState.ASSESSING, dest=ComplianceState.COMPLYING)
    def implement_compliance(self):
        print(f"Implementing compliance measures for {self.component.component_id}")
        # Implement compliance...

    @trigger(source=ComplianceState.COMPLYING, dest=ComplianceState.MONITORING)
    def start_monitoring(self):
        print(f"Starting monitoring for {self.component.component_id}")
        # Start monitoring...

    def forward(self, event: str):
        super().forward(event)
        print(f"Processing event: {event}")
Instantiate an inhabitant and simulate the compliance workflow:
# main.py
from models import AIComponent
from service_colony import ComplianceInhabitant

component = AIComponent(
    component_id="comp-123",
    functionalities=["Automated Decision Making"],
    compliance_status="Pending",
)

inhabitant = ComplianceInhabitant(component)
inhabitant.assess_risk()
inhabitant.implement_compliance()
inhabitant.start_monitoring()
Dynamic Class Generation
Use DSLClassGenerator for dynamic class creation based on prompts.
# class_generator.py
from pathlib import Path

from dslmodel.generators.gen_models import DSLClassGenerator

# Generate a new Pydantic model class from a natural-language prompt
prompt = """
Create a Pydantic model class named 'RiskAssessment' with fields:
- assessment_id: str
- component_id: str
- risk_level: str
- findings: List[str]
- recommendations: List[str]
"""

generator = DSLClassGenerator(
    model_prompt=prompt,
    file_path=Path('./generated_models.py'),
    append=True,
)
generator()
Generated class (generated_models.py):
from typing import List

from pydantic import BaseModel


class RiskAssessment(BaseModel):
    assessment_id: str
    component_id: str
    risk_level: str
    findings: List[str]
    recommendations: List[str]
Workflow Management and Scheduling
Define and execute complex workflows using Workflow, Job, and Action, and schedule tasks with aioclock.
# workflow.py
from aioclock import AioClock, Every
from dslmodel.workflow import Workflow, Job, Action

from main import inhabitant  # the ComplianceInhabitant built in the previous example

# Define actions
action_assess = Action(
    name="Assess Risk",
    code="inhabitant.assess_risk()",
)
action_implement = Action(
    name="Implement Compliance",
    code="inhabitant.implement_compliance()",
)
action_monitor = Action(
    name="Start Monitoring",
    code="inhabitant.start_monitoring()",
)

# Group the actions into a job
job = Job(
    name="Compliance Workflow",
    steps=[action_assess, action_implement, action_monitor],
)

# Define the workflow with the inhabitant available in its context
workflow = Workflow(
    name="EU AI Act Compliance Workflow",
    jobs=[job],
    context={"inhabitant": inhabitant},
)

# Schedule the workflow to run once a day
clock = AioClock()


@clock.task(trigger=Every(hours=24))
async def scheduled_workflow():
    workflow.execute()
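aioclock tasks only begin firing once the clock is served on an event loop. A minimal entrypoint sketch, assuming aioclock's serve() coroutine:
# run_clock.py
import asyncio

from workflow import clock

if __name__ == "__main__":
    # Serve the clock so the scheduled workflow starts firing
    asyncio.run(clock.serve())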
Event-Driven Architecture with FastStream and Redis
Implement an event-driven architecture using FastStream and Redis.
# event_stream.py
from faststream import FastStream
from faststream.redis import RedisBroker

from models import ComplianceRequirement

broker = RedisBroker("redis://localhost:6379")
app = FastStream(broker)


# Subscriber and publisher decorators are registered on the broker
@broker.subscriber("compliance/requirements")
async def handle_requirement(data: ComplianceRequirement):
    print(f"Received compliance requirement: {data.requirement_id}")
    # Process requirement...


async def publish_status(status: dict):
    await broker.publish(status, channel="compliance/status")
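FastStream ships a CLI runner, so the service can be started with (module path matching the example file name above):
faststream run event_stream:app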
Graph and Relational Data Management
Manage graph data with GQLAlchemy and relational data with SQLModel.
# data_management.py
from typing import Optional

from gqlalchemy import Memgraph, Node
from sqlmodel import SQLModel, Field, Session, create_engine

# Memgraph setup for graph data
memgraph = Memgraph()


class ComponentNode(Node):
    component_id: str
    compliance_status: str


# SQLModel setup for relational data
class ComplianceRecord(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    component_id: str
    status: str


engine = create_engine("sqlite:///compliance.db")
SQLModel.metadata.create_all(engine)
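A brief usage sketch for both stores (identifiers and field values are illustrative):
# data_usage.py
from sqlmodel import Session

from data_management import ComplianceRecord, ComponentNode, engine, memgraph

# Persist a relational compliance record
with Session(engine) as session:
    record = ComplianceRecord(component_id="comp-123", status="Pending")
    session.add(record)
    session.commit()

# Persist the corresponding graph node in Memgraph
node = ComponentNode(component_id="comp-123", compliance_status="Pending")
node.save(memgraph)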
Data Handling and Vector Management
Use DataReader and DataWriter for data handling, and LanceDB for vector data management.
# data_io.py
from dslmodel import DataReader, DataWriter
from lancedb.pydantic import pydantic_to_schema

from models import ComplianceRequirement

# Reading data
data_reader = DataReader(file_path="data/requirements.csv")
requirements = data_reader.forward()

# Writing data
data_writer = DataWriter(data=requirements, file_path="output/processed_requirements.csv")
data_writer.forward()

# Derive an Arrow schema for LanceDB from the Pydantic model
requirement_schema = pydantic_to_schema(ComplianceRequirement)
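From there, a LanceDB table can be created against that schema; a minimal sketch (the database path and table name are illustrative):
# vector_store.py
import lancedb

from data_io import requirement_schema

# Connect to a local LanceDB database and create a table for requirements
db = lancedb.connect("data/lancedb")
table = db.create_table("requirements", schema=requirement_schema)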
CLI Integration with Typer
Create powerful command-line interfaces using Typer.
# cli.py
import typer

app = typer.Typer()


@app.command()
def assess(component_id: str):
    """Assess compliance for a component."""
    # Perform assessment...
    typer.echo(f"Compliance assessment started for {component_id}")


@app.command()
def status(component_id: str):
    """Check compliance status of a component."""
    # Check status...
    typer.echo(f"Compliance status for {component_id}: Compliant")


if __name__ == "__main__":
    app()
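Example invocations:
python cli.py assess comp-123
python cli.py status comp-123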
Architecture
Core Components
- DSLModel: Core framework for intelligent model creation using templates and AI assistance.
- Service Colonies: Architectural style for developing autonomous and cooperative services.
- FSMMixin: Provides finite state machine functionality.
- Workflow Components: Workflow, Job, Action, Condition, and CronSchedule for orchestrating workflows.
- Data Handling Utilities: DataReader and DataWriter for data ingestion and output.
- Database Management: SQLModel and GQLAlchemy for relational and graph databases.
- Event Streaming: FastStream and RedisBroker for event-driven architectures.
- Scheduling: aioclock for scheduling asynchronous tasks.
- Vector Management: LanceDB for handling vector data.
Service Colony Architecture
The Service Colony consists of inhabitants (services) that:
- Autonomously adapt to changes.
- Collaborate to fulfill global objectives.
- Ensure compliance with regulations like the EU AI Act.
- Evolve over time through dynamic class generation and AI assistance.
Data Flow
User Inputs -> DSLModel Templates -> Generated Models -> Inhabitants (Services)
        |
        v
Event Streams (FastStream) <-> Inhabitants
        |
        v
Data Storage (SQLModel, GQLAlchemy, LanceDB)
        |
        v
Monitoring and Compliance Reporting
Development
Setup
1. Clone the Repository

   git clone https://github.com/seanchatmangpt/pydantic-all-in-one.git
   cd pydantic-all-in-one

2. Install Dependencies

   poetry install

3. Configure Environment Variables

   Create a .env file and add the necessary environment variables, such as REDIS_URL and MEMGRAPH_URL (see the sample after this list).

4. Start Docker Services

   docker-compose up -d

5. Run the Application

   poetry run poe api
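A minimal .env sample, assuming local defaults for Redis and Memgraph:
REDIS_URL=redis://localhost:6379
MEMGRAPH_URL=bolt://localhost:7687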
Testing
Run tests using pytest:
poetry run pytest
Ensure test coverage is at least 90%.
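To enforce that bar locally, the pytest coverage plugin can be used; a sketch, assuming the package import name (adjust --cov to the actual module):
poetry run pytest --cov=pydantic_all_in_one --cov-fail-under=90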
Continuous Integration and Deployment
pydantic-all-in-one utilizes GitHub Actions for CI/CD:
- Code Push: Triggers automated testing and linting.
- Testing: Runs unit and integration tests.
- Linting: Uses ruff for code quality checks.
- Deployment: Deploys successful builds to staging or production environments.
Contributing
Contributions are welcome! Please follow the contribution guidelines and adhere to the code of conduct.
1. Fork the Repository

2. Create a Feature Branch

   git checkout -b feature/YourFeature

3. Commit Your Changes

4. Push to Your Fork

   git push origin feature/YourFeature

5. Open a Pull Request
License
Distributed under the MIT License. See LICENSE for more information.
Contact
- Project Link: https://github.com/seanchatmangpt/pydantic-all-in-one
- Issues: https://github.com/seanchatmangpt/pydantic-all-in-one/issues
By following this guide, you can effectively utilize pydantic-all-in-one to build scalable, efficient, and compliant applications. The integration of DSLModel provides intelligence and initiative, enabling the development of dynamic models and services that adapt over time. The Service Colony architecture fosters collaboration among autonomous services, ensuring compliance with regulations like the EU AI Act.
This comprehensive README illustrates how all the components and classes work together, providing a cohesive narrative and practical examples to get you started.
Note: The project structure reflects five years of development, incorporating advanced features and compliance mechanisms used by Fortune 10 companies. The examples demonstrate how to build a production-ready system that leverages modern Python frameworks and AI assistance to meet stringent regulatory requirements.
Happy coding!