
A flexible, multi-LLM, agent-oriented Python library with self-reflection capabilities

shaheenai

ShaheenAI is a flexible, multi-LLM, agent-oriented Python library that supports multiple language model providers, such as OpenAI, Anthropic, Ollama, and Cohere, via a plugin/extras architecture. The library offers self-reflection, tool invocation, task chaining, and research planning capabilities, plus optional UI integrations built on Streamlit and Chainlit.

Features

  • Modular Agent Class: Supports multiple LLMs with self-reflection, tool invocation, and task chaining.
  • Research Planning & Management: Comprehensive tools for research project management, milestones, and bibliography.
  • Advanced Coding Assistant: Specialized support for Python and multi-language programming assistance.
  • Real API Integration: Built-in tools for weather (OpenWeatherMap), web search (Brave/SerpAPI/DuckDuckGo), and calculations.
  • Identity Awareness: Agents respond with "I am Shaheen AI developed by Engr. Hamza" when asked about identity.
  • MCP Server Interface: Tool integration via Model Context Protocol for extensibility.
  • Configurable via YAML or Code: Supports playbooks and programmatic configuration.
  • Wide LLM Provider Support: OpenAI, Anthropic, Ollama, Cohere, Google Gemini, etc.
  • Memory & Self-Reflection: Conversation context tracking and response improvement capabilities.
  • Async/Sync Operations: Both synchronous and asynchronous agent operations supported.
  • Streamlit and Chainlit Support: Build interactive and conversational UIs for agents.

Getting Started

Prerequisites

  • Python 3.10 or higher

Installation

To install ShaheenAI, use pip:

pip install shaheenai

Usage

1. Basic Agent Creation

from shaheenai import Agent

# Create a simple agent
agent = Agent(
    instructions="You are a helpful AI assistant specializing in Python programming.",
    llm="openai/gpt-3.5-turbo"
)

# Ask a question
response = agent.start("Explain list comprehensions in Python")
print(response)

2. Agent with Memory

from shaheenai import Agent

# Create an agent with conversation memory
agent = Agent(
    instructions="You are a knowledgeable tutor.",
    llm="openai/gpt-4",
    memory=True
)

# Have a conversation
print(agent.start("What is machine learning?"))
print(agent.start("Can you give me an example?"))  # Remembers previous context
print(agent.start("How does it relate to AI?"))   # Continues the conversation

3. Agent with Self-Reflection

from shaheenai import Agent

# Create an agent with self-reflection capabilities
agent = Agent(
    instructions="You are a research assistant that provides accurate information.",
    llm="anthropic/claude-3-sonnet",
    self_reflection=True,
    max_iterations=2
)

# The agent will reflect on and improve its initial response
response = agent.start("Explain quantum computing and its potential applications")
print(response)
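Conceptually, self-reflection is a critique-and-revise loop: the agent drafts an answer, asks the model to critique it, then revises, up to `max_iterations` times. The sketch below illustrates that control flow only; it is not ShaheenAI's internal implementation, and `stub_llm` is a stand-in for a real model call.

```python
def self_reflect(llm, prompt, max_iterations=2):
    """Schematic reflect-and-improve loop (illustration, not library internals)."""
    response = llm(prompt)
    for _ in range(max_iterations):
        critique = llm(f"Critique this answer:\n{response}")
        response = llm(f"Improve the answer using this critique:\n{critique}\n{response}")
    return response

# Stub "model" that tags each call, just to make the call sequence visible.
calls = []
def stub_llm(text):
    calls.append(text)
    return f"answer v{len(calls)}"

final = self_reflect(stub_llm, "Explain quantum computing", max_iterations=1)
print(final)  # answer v3  (draft + critique + revision = 3 model calls)
```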

4. Multi-LLM Provider Support

from shaheenai import Agent

# Different LLM providers
openai_agent = Agent(llm="openai/gpt-4")
anthropic_agent = Agent(llm="anthropic/claude-3-opus")
cohere_agent = Agent(llm="cohere/command-r-plus")
ollama_agent = Agent(llm="ollama/llama2")

# Use any agent
response = openai_agent.start("Hello, how are you?")
print(response)

5. Agent Identity Feature

from shaheenai import Agent

agent = Agent()

# Ask about identity
print(agent.start("Who are you?"))
# Output: "I am Shaheen AI developed by Engr. Hamza, an enthusiastic AI engineer."

print(agent.start("Who developed you?"))
# Output: "I am Shaheen AI developed by Engr. Hamza, an enthusiastic AI engineer."

6. Async Operations

import asyncio
from shaheenai import Agent

async def main():
    agent = Agent(
        instructions="You are a helpful assistant.",
        llm="openai/gpt-3.5-turbo"
    )
    
    # Use async method
    response = await agent.astart("What are the benefits of async programming?")
    print(response)

# Run async
asyncio.run(main())
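The main benefit of an async entry point is concurrency: several prompts can be awaited at once instead of sequentially. The sketch below shows the pattern with a stub coroutine standing in for `Agent.astart`, so it runs without the library installed.

```python
import asyncio

async def astart_stub(prompt):
    # Stand-in for Agent.astart; simulates an I/O-bound model call.
    await asyncio.sleep(0.01)
    return f"reply to: {prompt}"

async def main():
    # All three "requests" are in flight at the same time.
    return await asyncio.gather(
        astart_stub("q1"), astart_stub("q2"), astart_stub("q3")
    )

print(asyncio.run(main()))
```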

7. Using with Tools (MCP)

from shaheenai import Agent, MCP

# Define and run the MCP server
mcp = MCP()

@mcp.tool()
async def get_weather(location: str) -> str:
    """Get weather information for a location using the OpenWeatherMap API"""
    return "Weather information is now retrieved using OpenWeatherMap API."

@mcp.tool()
async def web_search(query: str, max_results: int = 5) -> str:
    """Search the internet for information using real search APIs"""
    return "Search results are now retrieved using Brave, SerpAPI, or DuckDuckGo."

@mcp.tool()
async def calculate_tip(bill_amount: float, tip_percentage: float = 15.0) -> str:
    """Calculate tip amount"""
    tip = bill_amount * (tip_percentage / 100)
    total = bill_amount + tip
    return f"Bill: ${bill_amount:.2f}, Tip ({tip_percentage}%): ${tip:.2f}, Total: ${total:.2f}"

# Create an agent with tools
agent = Agent(
    instructions="You can use tools to help users with weather, calculations, and web searches.",
    llm="openai/gpt-3.5-turbo",
    tools=["get_weather", "web_search", "calculate_tip"]
)

# Use the agent
response = agent.start("What's the weather in Tokyo?")
print(response)

response = agent.start("Search for Python programming tutorials")
print(response)

response = agent.start("Calculate tip for a $50 bill")
print(response)

API Configuration for Real Tools

ShaheenAI includes several built-in tools that require API keys for full functionality:

Weather Tool (get_weather)

Uses OpenWeatherMap API for real weather data:

# Windows PowerShell
$env:OPENWEATHER_API_KEY='your-openweathermap-api-key'

# Linux/Mac
export OPENWEATHER_API_KEY='your-openweathermap-api-key'
  • Get your API key from: OpenWeatherMap API
  • Features: Current weather, temperature, humidity, wind, pressure, visibility

Web Search Tool (web_search)

Supports multiple search providers (tries in order):

Option 1: Brave Search API (Recommended)

# Windows PowerShell
$env:BRAVE_API_KEY='your-brave-search-api-key'

# Linux/Mac
export BRAVE_API_KEY='your-brave-search-api-key'

Option 2: SerpAPI (Google Search)

# Windows PowerShell
$env:SERPAPI_KEY='your-serpapi-key'

# Linux/Mac
export SERPAPI_KEY='your-serpapi-key'

Option 3: DuckDuckGo (Free, No API Key)

  • Automatically used as fallback if no API keys are configured
  • Limited to instant answers and definitions
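The provider order described above (Brave first, then SerpAPI, then the keyless DuckDuckGo fallback) can be sketched as a simple environment check. This is an illustration of the selection logic, not the library's actual code:

```python
import os

def pick_search_provider(env=os.environ):
    """Choose a search backend in the documented fallback order."""
    if env.get("BRAVE_API_KEY"):
        return "brave"
    if env.get("SERPAPI_KEY"):
        return "serpapi"
    return "duckduckgo"  # free fallback, no API key required

print(pick_search_provider({}))  # duckduckgo
```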

Example with Real APIs

import os
from shaheenai import Agent

# Set up API keys
os.environ['OPENWEATHER_API_KEY'] = 'your-openweathermap-key'
os.environ['BRAVE_API_KEY'] = 'your-brave-search-key'

# Create agent with real tools
agent = Agent(
    instructions="I can help with weather, web searches, and calculations using real APIs.",
    llm="openai/gpt-3.5-turbo",
    tools=["get_weather", "web_search", "calculate"]
)

# Real weather data
weather = agent.start("What's the weather in New York?")
print(weather)
# Output: Detailed weather report with temperature, humidity, wind, etc.

# Real web search
search = agent.start("Search for latest AI developments")
print(search)
# Output: Real search results from Brave/Google/DuckDuckGo

# Built-in calculator
math = agent.start("Calculate 25 * 4 + 18")
print(math)
# Output: 118

CLI

ShaheenAI provides a command-line interface for running agents defined in YAML playbooks or via auto-mode.

Example:

shaheenai run agents.yaml
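The playbook schema is not documented here; the field names below are illustrative assumptions only. Consult the package's bundled examples/agents.yaml for the actual format:

```yaml
# agents.yaml — hypothetical playbook sketch (field names are guesses)
agents:
  - name: tutor
    instructions: You are a knowledgeable tutor.
    llm: openai/gpt-3.5-turbo
```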

Built-in Tools

ShaheenAI comes with several built-in tools that work with real APIs:

๐ŸŒค๏ธ Weather Tool

  • Function: get_weather(location)
  • API: OpenWeatherMap
  • Features: Temperature, humidity, wind, pressure, visibility
  • Usage: "What's the weather in London?"

๐Ÿ” Web Search Tool

  • Function: web_search(query, max_results=5)
  • APIs: Brave Search, SerpAPI (Google), DuckDuckGo
  • Features: Real-time web search results
  • Usage: "Search for Python tutorials"

🧮 Calculator Tool

  • Function: calculate(expression)
  • Features: Mathematical expression evaluation
  • Usage: "Calculate 25 * 4 + 18"
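A calculator tool like this typically parses the expression rather than calling `eval()` on untrusted input. The sketch below shows one safe approach using Python's `ast` module; it illustrates the idea only, and may differ from ShaheenAI's actual implementation:

```python
import ast
import operator

# Operators the evaluator is willing to apply.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def calculate(expression: str):
    """Evaluate a basic arithmetic expression without eval()."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError(f"Unsupported expression: {expression!r}")
    return _eval(ast.parse(expression, mode="eval"))

print(calculate("25 * 4 + 18"))  # 118
```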

Research Planning & Management

ShaheenAI includes comprehensive research planning and management capabilities:

📋 Research Project Management

from shaheenai.research import ResearchProject
from datetime import datetime

# Create a research project
project = ResearchProject(
    name="AI Code Generation Study",
    description="Research on automatic code generation using LLMs",
    start_date=datetime(2024, 2, 1),
    end_date=datetime(2024, 8, 31)
)

# Add milestones
project.add_milestone(
    "Literature Review",
    "Comprehensive review of existing techniques",
    datetime(2024, 3, 15)
)

# Track progress
project.complete_milestone("Literature Review")
print(f"Progress: {project.get_progress():.1f}%")

# Generate project report
report = project.generate_report()
print(report)
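Milestone-based progress of this kind reduces to "completed milestones over total milestones." The minimal sketch below shows that bookkeeping with hypothetical class names; it is not `ResearchProject`'s actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Milestone:
    name: str
    done: bool = False

@dataclass
class Project:
    milestones: list = field(default_factory=list)

    def complete(self, name: str) -> None:
        for m in self.milestones:
            if m.name == name:
                m.done = True

    def progress(self) -> float:
        if not self.milestones:
            return 0.0
        return 100 * sum(m.done for m in self.milestones) / len(self.milestones)

p = Project([Milestone("Literature Review"), Milestone("Data collection")])
p.complete("Literature Review")
print(f"Progress: {p.progress():.1f}%")  # Progress: 50.0%
```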

📚 Bibliography Management

from shaheenai.research import BibliographyManager

# Create bibliography manager
bib_manager = BibliographyManager()

# Add research papers
bib_manager.add_entry({
    "type": "article",
    "key": "chen2021evaluating",
    "title": "Evaluating Large Language Models Trained on Code",
    "author": "Chen, Mark and others",
    "year": "2021"
})

# Export to BibTeX
bib_manager.export_bibtex("references.bib")
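Serializing such an entry dict to BibTeX is mostly string formatting. The helper below is an illustrative sketch of that mapping, not necessarily how `BibliographyManager` formats its output:

```python
def to_bibtex(entry: dict) -> str:
    """Format one entry dict as a BibTeX record (illustrative sketch)."""
    fields = {k: v for k, v in entry.items() if k not in ("type", "key")}
    body = ",\n".join(f"  {k} = {{{v}}}" for k, v in fields.items())
    return f"@{entry['type']}{{{entry['key']},\n{body}\n}}"

record = to_bibtex({
    "type": "article",
    "key": "chen2021evaluating",
    "title": "Evaluating Large Language Models Trained on Code",
    "author": "Chen, Mark and others",
    "year": "2021",
})
print(record)
```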

📄 Research Templates

from shaheenai.research import ResearchTemplates

# Generate research proposal template
proposal = ResearchTemplates.generate_template("proposal")
print(proposal)

# Generate research report template
report = ResearchTemplates.generate_template("report")
print(report)

๐Ÿ—‚๏ธ Research Planning

from shaheenai.research import ResearchPlanner

# Create research planner
planner = ResearchPlanner()

# Add research tasks
planner.add_task("Literature review")
planner.add_task("Data collection")
planner.add_task("Methodology design")

# Set deadlines
planner.set_timeline("Literature review", "2024-03-15")

# View planned tasks
tasks = planner.view_tasks()
print(f"Tasks: {tasks}")

Directory Structure

shaheenai/
 ├── shaheenai/
 │    ├── __init__.py
 │    ├── agent.py          # Main Agent class with tool integration
 │    ├── mcp.py            # MCP server and built-in tools
 │    ├── llm_providers/    # LLM provider implementations
 │    │     ├── openai.py
 │    │     ├── google.py   # Google Gemini
 │    │     ├── cohere.py
 │    │     └── ...
 │    ├── tools/            # Tool base classes
 │    ├── ui/               # UI integrations
 │    │     ├── streamlit_ui.py
 │    │     └── chainlit_ui.py
 │    └── config.py
 ├── pyproject.toml
 ├── README.md
 ├── LICENSE
 └── examples/
      ├── comprehensive_test.py
      ├── test_chainlit_app.py
      └── agents.yaml

Contributing

Contributions are welcome! Please read the contribution guidelines first.

License

This project is licensed under the MIT License.

Acknowledgments

Inspired by PraisonAI for its modularity and multi-LLM support.
