Python cloud abstraction SDK for databases and AI services on Azure and AWS

Agentic Composer SDK

A Python cloud abstraction SDK that provides a unified interface for databases and AI services across Azure and AWS. It lets you build cloud-agnostic microservices that can switch between cloud providers without changing your application code.

Features

๐Ÿ—„๏ธ Database Services

  • Azure: Cosmos DB (NoSQL & Table API), Blob Storage
  • AWS: DynamoDB, S3

🤖 AI Services

  • Text Analysis: Sentiment analysis, entity recognition, key phrase extraction
  • Chat Completion: GPT models via Azure OpenAI or AWS Bedrock
  • Embeddings: Text-to-vector conversion for similarity search
  • Search: Azure Cognitive Search or AWS OpenSearch

📨 Messaging Services

  • Azure: Service Bus
  • AWS: SQS

Installation

  1. Install the required tools:

python3 -m pip install keyring artifacts-keyring

  2. Configure Azure Artifacts authentication:

keyring set artifacts.dev.azure.com Consulting-DTT-AI-Integration-Services <your-PAT>

  3. Install the package:

python3 -m pip install cloudabstractor

# Or install from source
git clone <repository-url>
cd agentic-composer-sdk
pip install -e .

# Or install from PyPI (once published)
pip install agentic-composer-sdk

Quick Start

import asyncio
from agentic_composer_sdk import CloudProvider

async def main():
    # Initialize for Azure
    azure_config = {
        "cloud_provider": "azure",
        "azure_openai_api_key": "your-key",
        "azure_openai_endpoint": "your-endpoint",
        "azure_storage_connection_string": "your-connection-string"
    }

    cloud = CloudProvider(provider_type="azure", config=azure_config)

    # Use AI services
    sentiment = await cloud.analyze_text_sentiment("I love this SDK!")
    print(f"Sentiment: {sentiment}")

    # Store files
    await cloud.store_file("data/document.txt", b"Hello World!")

    # Chat with AI
    messages = [{"role": "user", "content": "What is cloud computing?"}]
    response = await cloud.chat_with_ai(messages)
    print(response['choices'][0]['message']['content'])

asyncio.run(main())

Architecture

The SDK uses an abstract interface pattern to provide cloud-agnostic services:

┌─────────────────┐    ┌─────────────────┐
│   Your App      │◄──►│  CloudProvider  │
└─────────────────┘    └─────────────────┘
                               │
                   ┌───────────┼───────────┐
                   │           │           │
             ┌─────▼────┐ ┌────▼─────┐ ┌──▼───────┐
             │ Azure    │ │ AWS      │ │ Future   │
             │ Services │ │ Services │ │ Providers│
             └──────────┘ └──────────┘ └──────────┘
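
The abstract interface pattern in the diagram can be sketched in a few lines. The class names below (`StorageService`, `make_storage`, and the fake URLs) are illustrative assumptions, not the SDK's actual internals:

```python
from abc import ABC, abstractmethod


class StorageService(ABC):
    """Cloud-agnostic storage contract; each provider implements it."""

    @abstractmethod
    def upload_file(self, path: str, data: bytes) -> str: ...


class AzureBlobStorage(StorageService):
    def upload_file(self, path: str, data: bytes) -> str:
        # Real code would call the Azure Blob Storage client here.
        return f"https://example.blob.core.windows.net/{path}"


class S3Storage(StorageService):
    def upload_file(self, path: str, data: bytes) -> str:
        # Real code would call boto3's S3 client here.
        return f"https://example.s3.amazonaws.com/{path}"


def make_storage(provider: str) -> StorageService:
    # The factory is the only place that knows about concrete providers.
    services = {"azure": AzureBlobStorage, "aws": S3Storage}
    return services[provider]()


# Application code depends only on the abstract interface:
url = make_storage("azure").upload_file("docs/a.txt", b"hi")
```

Switching providers then changes only the string passed to the factory (or the configuration that drives it), which is the property the SDK advertises.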

Configuration

Environment Variables

You can configure the SDK using environment variables:

# Azure Configuration
export CLOUD_SDK_CLOUD_PROVIDER=azure
export CLOUD_SDK_AZURE__AZURE_OPENAI_API_KEY=your-key
export CLOUD_SDK_AZURE__AZURE_OPENAI_ENDPOINT=your-endpoint
export CLOUD_SDK_AZURE__AZURE_STORAGE_CONNECTION_STRING=your-connection-string

# AWS Configuration
export CLOUD_SDK_CLOUD_PROVIDER=aws
export CLOUD_SDK_AWS__AWS_ACCESS_KEY_ID=your-key-id
export CLOUD_SDK_AWS__AWS_SECRET_ACCESS_KEY=your-secret-key
export CLOUD_SDK_AWS__AWS_REGION=us-east-1
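
The double underscore in the variable names suggests a nested settings scheme (prefix `CLOUD_SDK_`, section delimiter `__`), in the style of pydantic-settings. A minimal sketch of that mapping, assuming this convention rather than the SDK's actual loader:

```python
PREFIX = "CLOUD_SDK_"


def load_config(environ: dict) -> dict:
    """Fold CLOUD_SDK_<SECTION>__<KEY> variables into a nested dict."""
    config: dict = {}
    for name, value in environ.items():
        if not name.startswith(PREFIX):
            continue
        key = name[len(PREFIX):].lower()
        if "__" in key:
            # Nested setting: section before the delimiter, field after it.
            section, field = key.split("__", 1)
            config.setdefault(section, {})[field] = value
        else:
            # Top-level setting such as CLOUD_SDK_CLOUD_PROVIDER.
            config[key] = value
    return config


env = {
    "CLOUD_SDK_CLOUD_PROVIDER": "aws",
    "CLOUD_SDK_AWS__AWS_REGION": "us-east-1",
}
print(load_config(env))
# {'cloud_provider': 'aws', 'aws': {'aws_region': 'us-east-1'}}
```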

Configuration Dictionary

from agentic_composer_sdk import CloudProvider

# Azure configuration
azure_config = {
    "cloud_provider": "azure",
    "azure_storage_connection_string": "...",
    "azure_storage_container_name": "my-container",
    "azure_openai_api_key": "...",
    "azure_openai_endpoint": "...",
    "azure_cognitive_services_endpoint": "...",
    "azure_cognitive_services_key": "...",
    "azure_search_service_name": "...",
    "azure_search_admin_key": "..."
}

# AWS configuration
aws_config = {
    "cloud_provider": "aws",
    "aws_access_key_id": "...",
    "aws_secret_access_key": "...",
    "aws_region": "us-east-1",
    "aws_s3_bucket_name": "my-bucket",
    "aws_opensearch_endpoint": "..."
}

Service Examples

Storage Services

# Upload and download files
storage = cloud.get_storage_service()

# Upload
url = await storage.upload_file("path/to/file.txt", file_data)

# Download
data = await storage.download_file("path/to/file.txt")

# Get presigned URL
signed_url = await storage.get_file_url("path/to/file.txt")

# List files
files = await storage.list_files("path/prefix/")

Database Services

# NoSQL operations
db = cloud.get_db_nosql_service()

# Store document
await db.put_item("table_name", {
    "id": "doc_1",
    "title": "My Document",
    "content": "Document content..."
})

# Retrieve document
doc = await db.get_item("table_name", {"id": "doc_1"})

# Query documents
results = await db.query("table_name", {
    "query": "SELECT * FROM c WHERE c.title = @title",
    "parameters": [{"name": "@title", "value": "My Document"}]
})

AI Services

# Text Analysis
text_analyzer = cloud.get_text_analysis_service()

sentiment = await text_analyzer.analyze_sentiment("I love this product!")
entities = await text_analyzer.recognize_entities("John works at Microsoft")
key_phrases = await text_analyzer.extract_key_phrases("Cloud computing benefits")

# Chat Completion
chat = cloud.get_chat_service()

messages = [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Explain machine learning"}
]
response = await chat.chat_completion(messages, temperature=0.7)

# Streaming chat
async for chunk in chat.stream_chat_completion(messages):
    print(chunk['choices'][0]['delta']['content'], end='')

# Embeddings
embedding_service = cloud.get_embedding_service()

embeddings = await embedding_service.create_embeddings([
    "First document text",
    "Second document text"
])

# Search similar embeddings
similar = await embedding_service.similarity_search(
    query_embedding, all_embeddings, top_k=5
)
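
Conceptually, `similarity_search` ranks stored vectors by a similarity measure against the query, typically cosine similarity. A dependency-free sketch of that computation (the SDK may use a different metric or an approximate index):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Dot product divided by the product of the vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm


def top_k_similar(query, vectors, top_k=5):
    # Score every stored vector, then keep the top_k best matches.
    scored = [(i, cosine_similarity(query, v)) for i, v in enumerate(vectors)]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:top_k]


vectors = [[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]]
print(top_k_similar([1.0, 0.0], vectors, top_k=2))
```

Identical directions score 1.0 and orthogonal ones 0.0, so the first and third vectors rank ahead of the second here.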

Search Services

search = cloud.get_search_service()

# Index document
await search.index_document("my_index", "doc_1", {
    "id": "doc_1",
    "title": "Document Title",
    "content": "Document content...",
    "vector_field": embedding_vector
})

# Text search
results = await search.search_documents("my_index", "search query")

# Vector search
vector_results = await search.vector_search("my_index", query_vector)

Microservice Integration

Here's how to integrate the SDK into a microservice:

from fastapi import FastAPI
from agentic_composer_sdk import CloudProvider

app = FastAPI()

# Initialize cloud provider
cloud = CloudProvider(provider_type="azure", config=config)  # config built as in the Configuration section

@app.post("/process-document")
async def process_document(text: str):
    # Analyze sentiment
    sentiment = await cloud.analyze_text_sentiment(text)

    # Create embeddings
    embeddings = await cloud.create_text_embeddings(text)

    # Store in database
    doc_data = {
        "text": text,
        "sentiment": sentiment,
        "embedding": embeddings[0]
    }

    db = cloud.get_db_nosql_service()
    await db.put_item("documents", doc_data)

    return {"status": "processed", "sentiment": sentiment}

@app.get("/search")
async def search_documents(query: str):
    # Create query embedding
    query_embedding = await cloud.create_text_embeddings(query)

    # Vector search
    search_service = cloud.get_search_service()
    results = await search_service.vector_search(
        "documents", query_embedding[0], top_k=10
    )

    return {"results": results}

Provider-Specific Features

Azure-Specific

# Use specific Azure Cosmos DB API
cosmos_nosql = CosmosDBService.create(config, api_type="nosql")
cosmos_table = CosmosDBService.create(config, api_type="table")

# Azure OpenAI with specific deployment
chat_response = await azure_chat.chat_completion(
    messages,
    model="gpt-35-turbo-16k",  # Your deployment name
    temperature=0.7
)

AWS-Specific

# Use DynamoDB with specific query parameters
results = await dynamodb.query("table", {
    "KeyConditionExpression": Key("pk").eq("value"),
    "FilterExpression": Attr("status").eq("active")
})

# Use Bedrock with specific model
chat_response = await bedrock_chat.chat_completion(
    messages,
    model="anthropic.claude-3-sonnet-20240229-v1:0"
)

Testing

# Install test dependencies
pip install -e ".[test]"

# Run tests
pytest tests/

# Run with coverage
pytest --cov=src tests/

Development

# Install development dependencies
pip install -e ".[dev]"

# Format code
black src/ tests/
isort src/ tests/

# Type checking
mypy src/

# Linting
flake8 src/ tests/

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests
  5. Run the test suite
  6. Submit a pull request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

For support and questions:

  • Create an issue on GitHub
  • Check the examples directory for more usage patterns
  • Review the API documentation

Roadmap

  • Google Cloud Platform support
  • Additional AI services (speech, vision)
  • Caching layer
  • Monitoring and observability
  • Rate limiting and retry mechanisms
  • Configuration validation
  • More database providers (MongoDB, Redis)
