
A dynamic and flexible AI agent framework for building intelligent, multi-modal AI agents


GRAMI-AI: Dynamic AI Agent Framework


Overview

GRAMI-AI is an async-first AI agent framework for building intelligent, collaborative agent systems. It gives developers a flexible foundation for creating context-aware agents that can use tools, share memory, and work together across diverse domains.

Key Features

  • Dynamic AI Agent Creation
  • Multi-LLM Support (Gemini, OpenAI, Anthropic, Ollama)
  • Extensible Tool Ecosystem
  • Multiple Communication Interfaces
  • Flexible Memory Management
  • Secure and Scalable Architecture

Installation

Using pip

pip install grami-ai

From Source

git clone https://github.com/YAFATEK/grami-ai.git
cd grami-ai
pip install -e .

Quick Start

Basic Agent Creation

import asyncio

from grami.agent import Agent
from grami.providers import GeminiProvider
# WebSearchTool and CalculatorTool are the sample tools from examples/tools.py;
# adjust this import to wherever the tool classes live in your project.
from examples.tools import WebSearchTool, CalculatorTool


async def main():
    # Initialize a Gemini-powered Agent
    agent = Agent(
        name="AssistantAI",
        role="Helpful Digital Assistant",
        llm_provider=GeminiProvider(api_key="YOUR_API_KEY"),
        tools=[WebSearchTool(), CalculatorTool()]
    )

    # send_message is a coroutine, so it must be awaited
    response = await agent.send_message("Help me plan a trip to Paris")
    print(response)


asyncio.run(main())

Examples

We provide a variety of example implementations to help you get started:

Basic Agents

  • examples/simple_agent_example.py: Basic mathematical calculation agent
  • examples/gemini_example.py: Multi-tool Gemini Agent with various capabilities

Advanced Scenarios

  • examples/content_creation_agent.py: AI-Powered Content Creation Agent

    • Generates blog posts
    • Conducts topic research
    • Creates supporting visuals
    • Tailors content to specific audiences
  • examples/web_research_agent.py: Advanced Web Research and Trend Analysis Agent

    • Performs comprehensive market research
    • Conducts web searches
    • Analyzes sentiment
    • Predicts industry trends
    • Generates detailed reports

Collaborative Agents

  • examples/agent_crew_example.py: Multi-Agent Collaboration
    • Demonstrates inter-agent communication
    • Showcases specialized agent roles
    • Enables complex task delegation

Tool Integration

  • examples/tools.py: Collection of custom tools
    • Web Search
    • Weather Information
    • Calculator
    • Sentiment Analysis
    • Image Generation

Environment Variables

API Key Management

GRAMI-AI uses environment variables to manage sensitive credentials securely. To set up your API keys:

  1. Create a .env file in the project root directory
  2. Add your API keys in the following format:
    GEMINI_API_KEY=your_gemini_api_key_here
    

Important: Never commit your .env file to version control. The .gitignore is already configured to prevent this.
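
If you prefer to load the key in code rather than exporting it in your shell, a minimal sketch looks like this. The python-dotenv package is an assumption of this example, not a GRAMI-AI requirement; any other way of populating os.environ works just as well.

# Illustrative sketch: read the Gemini key from a .env file at runtime instead
# of hard-coding it. python-dotenv is assumed here (pip install python-dotenv).
import os

from dotenv import load_dotenv

from grami.providers import GeminiProvider

load_dotenv()  # loads variables from a .env file in the working directory

api_key = os.getenv("GEMINI_API_KEY")
if not api_key:
    raise RuntimeError("GEMINI_API_KEY is not set; add it to your .env file")

provider = GeminiProvider(api_key=api_key)
# provider can now be passed to Agent(..., llm_provider=provider) as in the Quick Start.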

Development Checklist

Core Framework Design

  • Implement AsyncAgent base class with dynamic configuration
  • Create flexible system instruction definition mechanism
  • Design abstract LLM provider interface
  • Develop dynamic role and persona assignment system
  • Implement multi-modal agent capabilities (text, image, video)
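
As a rough illustration of the first items in this checklist, an async agent base class with dynamic role and persona configuration could look like the sketch below. The names and signatures are illustrative only and are not the shipped grami.agent API.

# Hypothetical sketch of an AsyncAgent-style base class; not the actual implementation.
from abc import ABC, abstractmethod
from typing import Any, Dict, List, Optional


class AsyncAgent(ABC):
    """Minimal async agent skeleton with dynamic role/persona configuration."""

    def __init__(
        self,
        name: str,
        role: str,
        system_instruction: Optional[str] = None,
        tools: Optional[List[Any]] = None,
        config: Optional[Dict[str, Any]] = None,
    ):
        self.name = name
        self.role = role
        # A default persona is derived from the name and role unless one is supplied.
        self.system_instruction = system_instruction or f"You are {name}, a {role}."
        self.tools = tools or []
        self.config = config or {}

    @abstractmethod
    async def send_message(self, message: str) -> str:
        """Route a user message through the configured LLM provider."""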

LLM Provider Abstraction

  • Unified interface for diverse LLM providers
    • Google Gemini integration (start_chat(), send_message())
    • OpenAI ChatGPT integration
    • Anthropic Claude integration
    • Ollama local LLM support
  • Standardize function/tool calling across providers
  • Dynamic prompt engineering support
  • Provider-specific configuration handling
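
As a sketch of that unified surface, an abstract provider could mirror Gemini's start_chat()/send_message() naming from the checklist above. The class below is illustrative only; the concrete classes in grami.providers may differ.

# Hypothetical provider interface sketch; the real grami.providers API may differ.
from abc import ABC, abstractmethod
from typing import Any, Dict, List, Optional


class BaseLLMProvider(ABC):
    """Uniform surface that Gemini, OpenAI, Anthropic, and Ollama backends would implement."""

    @abstractmethod
    def start_chat(self, history: Optional[List[Dict[str, Any]]] = None) -> Any:
        """Open a chat session, optionally seeded with prior messages."""

    @abstractmethod
    async def send_message(self, message: str, tools: Optional[List[Any]] = None) -> str:
        """Send one message (with optional tool/function definitions) and return the reply text."""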

Communication Interfaces

  • WebSocket real-time communication
  • REST API endpoint design
  • Kafka inter-agent communication
  • gRPC support
  • Event-driven agent notification system
  • Secure communication protocols
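
As one hedged illustration of the WebSocket item, an agent could be exposed with the third-party websockets package. Both the package and the handler shown here are assumptions of this sketch, not GRAMI-AI dependencies or APIs.

# Illustrative only: serving agent replies over WebSocket with the `websockets` package.
import asyncio

import websockets


async def handle_client(websocket):
    # Treat each incoming frame as a user message; in a real setup you would
    # replace the echo with something like `await agent.send_message(message)`.
    async for message in websocket:
        await websocket.send(f"echo: {message}")


async def main():
    # Single-argument handlers require websockets 11+; older releases also pass a path argument.
    async with websockets.serve(handle_client, "localhost", 8765):
        await asyncio.Future()  # run until the process is stopped


if __name__ == "__main__":
    asyncio.run(main())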

Memory and State Management

  • Pluggable memory providers
    • In-memory state storage
    • Redis distributed memory
    • DynamoDB scalable storage
    • S3 content storage
  • Conversation and task history tracking
  • Global state management for agent crews
  • Persistent task and interaction logs
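
A pluggable design along these lines might look like the following sketch, with one interface shared by in-memory, Redis, DynamoDB, and S3 backends. The names and method signatures are hypothetical, not the actual GRAMI-AI memory API.

# Hypothetical memory-provider sketch illustrating the pluggable design above.
from abc import ABC, abstractmethod
from typing import Any, Dict, List


class BaseMemoryProvider(ABC):
    """Common surface for in-memory, Redis, DynamoDB, or S3-backed storage."""

    @abstractmethod
    async def append(self, conversation_id: str, item: Any) -> None:
        """Persist one conversation turn or task record."""

    @abstractmethod
    async def history(self, conversation_id: str) -> List[Any]:
        """Return the stored history for a conversation."""


class InMemoryProvider(BaseMemoryProvider):
    """Simplest possible backend: a dict held in the current process."""

    def __init__(self) -> None:
        self._store: Dict[str, List[Any]] = {}

    async def append(self, conversation_id: str, item: Any) -> None:
        self._store.setdefault(conversation_id, []).append(item)

    async def history(self, conversation_id: str) -> List[Any]:
        return list(self._store.get(conversation_id, []))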

Tool and Function Ecosystem

  • Extensible tool integration framework
  • Default utility tools
    • Kafka message publisher
    • Web search utility
    • Content analysis tool
  • Provider-specific function calling support
  • Community tool marketplace
  • Easy custom tool development
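
Custom tool development could be as simple as a small class exposing a name, a description the LLM can read, and an async run method. This shape is an assumption for illustration; see examples/tools.py for the patterns the library actually uses.

# Hypothetical custom-tool sketch; the real tool base class and registration hooks may differ.
class WeatherTool:
    """Example tool shape: metadata for the LLM plus an async run method."""

    name = "get_weather"
    description = "Return a short weather summary for a city."

    async def run(self, city: str) -> str:
        # A real implementation would call a weather API; this stub keeps the sketch self-contained.
        return f"The weather in {city} is sunny, 24°C."


# A tool instance would then be passed to an agent, e.g.:
# agent = Agent(name="AssistantAI", role="Assistant", llm_provider=..., tools=[WeatherTool()])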

Agent Crew Collaboration

  • Inter-agent communication protocol
  • Workflow and task delegation mechanisms
  • Approval and review workflows
  • Notification and escalation systems
  • Dynamic team composition
  • Shared context and memory management
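
A minimal delegation flow between two agents, in the spirit of examples/agent_crew_example.py, might look like this sketch. Constructor arguments follow the Quick Start example; that tools can be omitted is an assumption made here for brevity.

# Illustrative two-agent delegation sketch; see examples/agent_crew_example.py for the full example.
import asyncio

from grami.agent import Agent
from grami.providers import GeminiProvider


async def main():
    researcher = Agent(
        name="Researcher",
        role="Web Research Specialist",
        llm_provider=GeminiProvider(api_key="YOUR_API_KEY"),
    )
    writer = Agent(
        name="Writer",
        role="Content Creator",
        llm_provider=GeminiProvider(api_key="YOUR_API_KEY"),
    )

    # Simple delegation: the researcher's output becomes the writer's input.
    findings = await researcher.send_message("Summarize this week's AI agent news")
    draft = await writer.send_message(f"Write a short blog post based on: {findings}")
    print(draft)


asyncio.run(main())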

Use Case Implementations

  • Digital Agency workflow template
    • Growth Manager agent
    • Content Creator agent
    • Trend Researcher agent
    • Media Creation agent
  • Customer interaction management
  • Approval and revision cycles

Security and Compliance

  • Secure credential management
  • Role-based access control
  • Audit logging
  • Compliance with data protection regulations

Performance and Scalability

  • Async-first design
  • Horizontal scaling support
  • Performance benchmarking
  • Resource optimization

Testing and Quality

  • Comprehensive unit testing
  • Integration testing for agent interactions
  • Mocking frameworks for LLM providers
  • Continuous integration setup

Documentation and Community

  • Detailed API documentation
  • Comprehensive developer guides
  • Example use case implementations
  • Contribution guidelines
  • Community tool submission process
  • Regular maintenance and updates

Future Roadmap

  • Payment integration solutions
  • Advanced context understanding
  • Multi-language support
  • Enterprise-grade security features
  • AI agent marketplace

Documentation

For detailed documentation, visit our Documentation Website.

Contributing

We welcome contributions! Please see our Contribution Guidelines.

License

MIT License - Empowering open-source innovation

About YAFATEK Solutions

Pioneering AI innovation through flexible, powerful frameworks.

Contact & Support


Star ⭐ the project if you believe in collaborative AI innovation!

