
A simple AI toolkit for text processing using OpenAI and Gemini APIs

Project description

AIWand 🪄

The simplest way to unify the OpenAI and Gemini APIs: a drop-in replacement for your existing AI code with automatic provider switching and structured-output handling.


🎯 Simple Migration - One Line Change

Before - Direct API calls with provider-specific code:

# OpenAI specific code
content = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    temperature=0.8,
    top_p=0.9,
    response_format={"type": "json_object"}
)
result = json.loads(content.choices[0].message.content)  # Manual parsing

# OR Gemini specific code
content = gemini_client.chat.completions.create(
    model="gemini-2.0-flash",
    messages=messages,
    temperature=0.8,
    top_p=0.9,
    response_format=SomeSchema
)
result = content.parsed  # Different response handling

After - Unified AIWand code that works with both:

import aiwand

# Same code works with OpenAI, Gemini, and their structured outputs!
content = aiwand.make_ai_request(
    model="gpt-4o",          # or "gemini-2.0-flash" 
    messages=messages,
    temperature=0.8,
    top_p=0.9,
    response_format=CarouselContent  # any Pydantic model you define - automatic parsing!
)
# 'content' is already your parsed Pydantic object - no post-processing needed! ✨

Why AIWand?

  • 🔄 Drop-in Replacement - Minimal code changes, maximum benefits
  • 🧠 Smart Provider Detection - Automatically uses OpenAI or Gemini based on model name
  • 🏗️ Structured Output Magic - Handles Pydantic models automatically for both providers
  • ⚡ No Post-Processing - Get parsed objects directly, skip manual JSON handling
  • 🎯 Unified API - Same code works across different AI providers
  • 🔑 Zero Configuration - Works with just environment variables
  • 📱 High-Level Functions - Built-in summarization, chat, and text generation
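
The "Smart Provider Detection" bullet above can be pictured with a short sketch. This is not AIWand's actual implementation, just a hypothetical illustration of name-prefix routing; the function name detect_provider and the exact prefixes are assumptions:

```python
# Hypothetical sketch of name-based provider routing (illustrative only,
# not AIWand's real code): "gemini-*" models go to Gemini, "gpt-*" and
# "o*" reasoning models go to OpenAI, anything else needs an explicit provider.
def detect_provider(model: str) -> str:
    name = model.lower()
    if name.startswith("gemini"):
        return "gemini"
    if name.startswith(("gpt", "o1", "o3", "o4")):
        return "openai"
    raise ValueError(f"Unknown model '{model}'; pass provider= explicitly")

print(detect_provider("gpt-4o"))            # openai
print(detect_provider("gemini-2.0-flash"))  # gemini
```

In AIWand itself this happens inside make_ai_request(), which is why the same call works across providers.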

🚀 Quick Start

Installation

pip install aiwand

Configuration

Set your API keys as environment variables:

# Option 1: OpenAI only
export OPENAI_API_KEY="your-openai-key"

# Option 2: Gemini only  
export GEMINI_API_KEY="your-gemini-key"

# Option 3: Both (set preference)
export OPENAI_API_KEY="your-openai-key"
export GEMINI_API_KEY="your-gemini-key"
export AI_DEFAULT_PROVIDER="openai"  # or "gemini"

Or create a .env file in your project:

OPENAI_API_KEY=your-openai-key
GEMINI_API_KEY=your-gemini-key
AI_DEFAULT_PROVIDER=openai

Core AI Functionality

The make_ai_request() function is the heart of AIWand, a single unified interface for all supported AI providers:

import aiwand
from pydantic import BaseModel

# Basic text generation
response = aiwand.make_ai_request(
    messages=[{"role": "user", "content": "Explain quantum computing"}],
    model="gpt-4o"  # Automatically uses OpenAI
)

# Switch providers seamlessly
response = aiwand.make_ai_request(
    messages=[{"role": "user", "content": "Explain quantum computing"}],
    model="gemini-2.0-flash"  # Automatically uses Gemini
)

# Structured output with Pydantic models
class BlogPost(BaseModel):
    title: str
    content: str
    tags: list[str]

blog_post = aiwand.make_ai_request(
    messages=[{"role": "user", "content": "Write a blog post about AI"}],
    model="gpt-4o",
    response_format=BlogPost  # Returns parsed BlogPost object!
)
print(blog_post.title)  # Direct access to structured data

# Custom/preview models with explicit provider
response = aiwand.make_ai_request(
    model="gemini-2.5-flash-preview-05-20",  # New model not in our registry
    provider="gemini",  # Explicit provider specification
    messages=[{"role": "user", "content": "Hello from the future!"}]
)

# Advanced parameters
response = aiwand.make_ai_request(
    messages=[
        {"role": "system", "content": "You are a helpful coding assistant"},
        {"role": "user", "content": "Write a Python function to sort a list"}
    ],
    model="gpt-4o",
    temperature=0.3,  # More focused
    max_tokens=500,
    top_p=0.9
)

High-Level Convenience Functions

For common tasks, use these simplified functions:

import aiwand

# Text summarization
summary = aiwand.summarize("Your long text here...")

# AI chat with conversation history
response = aiwand.chat("What is machine learning?")

# Text generation from prompts
story = aiwand.generate_text("Write a poem about coding")

# Customized summarization
summary = aiwand.summarize(
    text="Your long text...",
    style="bullet-points",  # "concise", "detailed", "bullet-points"
    max_length=50,
    model="gpt-4o"  # Optional: specify model
)

# Chat with conversation history
conversation = []
response1 = aiwand.chat("Hello!", conversation_history=conversation)
conversation.append({"role": "user", "content": "Hello!"})
conversation.append({"role": "assistant", "content": response1})

response2 = aiwand.chat("What did I just say?", conversation_history=conversation)

# Text generation with custom parameters
text = aiwand.generate_text(
    prompt="Write a technical explanation",
    max_tokens=300,
    temperature=0.3  # Lower = more focused, Higher = more creative
)

# Helper utilities for testing and development
random_num = aiwand.generate_random_number(8)  # 8-digit number
unique_id = aiwand.generate_uuid()  # UUID4

🎯 Smart Provider Features

Automatic Model Detection

# AIWand automatically detects the right provider:
response = aiwand.make_ai_request(model="gpt-4o", ...)        # → OpenAI
response = aiwand.make_ai_request(model="gemini-2.0-flash", ...)  # → Gemini
response = aiwand.make_ai_request(model="o3-mini", ...)       # → OpenAI

# Pattern-based detection for unknown models:
response = aiwand.make_ai_request(model="gemini-experimental-123", ...)  # → Gemini

Explicit Provider Control

# Force a specific provider for custom models:
response = aiwand.make_ai_request(
    model="my-custom-model",
    provider="gemini",  # or AIProvider.GEMINI
    messages=[...]
)

# Works with both string and enum:
from aiwand import AIProvider
response = aiwand.make_ai_request(
    model="any-model",
    provider=AIProvider.OPENAI,
    messages=[...]
)

Structured Output Support

from pydantic import BaseModel

class ProductReview(BaseModel):
    rating: int
    pros: list[str]
    cons: list[str]
    recommendation: bool

# Works identically with both providers:
review = aiwand.make_ai_request(
    model="gpt-4o",  # or "gemini-2.0-flash"
    messages=[{"role": "user", "content": "Review this product: ..."}],
    response_format=ProductReview
)
# No manual JSON parsing needed - returns ProductReview object directly!

Configuration Management

import aiwand

# Show current configuration
aiwand.show_current_config()

# Interactive setup (optional)
aiwand.setup_user_preferences()

Error Handling

import aiwand

try:
    summary = aiwand.summarize("Some text")
except aiwand.AIError as e:
    print(f"AI service error: {e}")
except ValueError as e:
    print(f"Input error: {e}")

🔧 CLI Usage (Optional)

# Direct prompts (easiest way!)
aiwand "Ten fun names for a pet pelican"
aiwand "Explain quantum computing in simple terms" 

# Specific commands
aiwand summarize "Your text here" --style bullet-points
aiwand chat "What is machine learning?"
aiwand generate "Write a story about AI"

# Helper utilities
aiwand helper random --length 8        # Generate 8-digit random number
aiwand helper uuid --uppercase         # Generate uppercase UUID

# Setup preferences
aiwand setup
aiwand config

📚 Documentation

🛠️ Contributing

We welcome contributions from both AI assistants and human developers! Please see our comprehensive contributing guide.

Whether you're an AI assistant helping users or a human developer, these guides ensure consistency and quality across all contributions.

🤝 Connect

📝 License

MIT License - see LICENSE file for details.


Made with ❤️ by Aman Kumar

Project details


Download files

Download the file for your platform.

Source Distribution

aiwand-0.4.6.tar.gz (47.9 kB)

Uploaded Source

Built Distribution


aiwand-0.4.6-py3-none-any.whl (24.3 kB)

Uploaded Python 3

File details

Details for the file aiwand-0.4.6.tar.gz.

File metadata

  • Download URL: aiwand-0.4.6.tar.gz
  • Upload date:
  • Size: 47.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.4

File hashes

Hashes for aiwand-0.4.6.tar.gz:

  • SHA256: 72807df839c9f771bf18680cc748b956c81085e5e8ac24eb868b7f5cfd5b0148
  • MD5: 25ddb3ea5d84b4c9e410dd37d5d1fdd0
  • BLAKE2b-256: faddce3873af62f4d01cd1983b226db3561991ba3db75797f6dcd7142fda1cc8


File details

Details for the file aiwand-0.4.6-py3-none-any.whl.

File metadata

  • Download URL: aiwand-0.4.6-py3-none-any.whl
  • Upload date:
  • Size: 24.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.11.4

File hashes

Hashes for aiwand-0.4.6-py3-none-any.whl:

  • SHA256: 9dbbd0f081eb129c26972853a66347d9d165510b89a5eb8954ad68d6b78e9fde
  • MD5: c1c8a48b192bcbfbea964d15c2b864f5
  • BLAKE2b-256: 4fdb737340312df30433ce4787b320eac53172374de729aa7d6bfebcf5442de2

