
A dynamic and flexible AI agent framework for building intelligent, multi-modal AI agents


GRAMI-AI: Dynamic AI Agent Framework


Overview

GRAMI-AI is a cutting-edge, async-first AI agent framework designed to solve complex computational challenges through intelligent, collaborative agent interactions. Built with unprecedented flexibility, this library empowers developers to create sophisticated, context-aware AI systems that can adapt, learn, and collaborate across diverse domains.

Key Features

  • Async AI Agent Creation
  • Multi-LLM Support (Gemini, OpenAI, Anthropic, Ollama)
  • Extensible Tool Ecosystem
  • Multiple Communication Interfaces
  • Flexible Memory Management
  • Secure and Scalable Architecture

Installation

Using pip

pip install grami-ai

From Source

git clone https://github.com/YAFATEK/grami-ai.git
cd grami-ai
pip install -e .

Quick Start

Basic Async Agent Creation

import asyncio
from grami.agent import AsyncAgent
from grami.providers.gemini_provider import GeminiProvider

async def main():
    # Initialize a Gemini-powered Async Agent
    agent = AsyncAgent(
        name="AssistantAI",
        llm=GeminiProvider(api_key="YOUR_API_KEY"),
        system_instructions="You are a helpful digital assistant."
    )

    # Send an async message
    response = await agent.send_message("Hello, how can you help me today?")
    print(response)

    # Stream a response
    async for token in agent.stream_message("Tell me a story"):
        print(token, end='', flush=True)

asyncio.run(main())

Example Configurations

1. Async Agent with Memory and Streaming

from grami.agent import AsyncAgent
from grami.providers.gemini_provider import GeminiProvider
from grami.memory.lru import LRUMemory

# Reuse a single provider instance across the configuration examples below
provider = GeminiProvider(api_key="YOUR_API_KEY")

agent = AsyncAgent(
    name="MemoryStreamingAgent",
    llm=provider,
    memory=LRUMemory(capacity=100),
    system_instructions="You are a storyteller."
)

2. Async Agent without Memory

agent = AsyncAgent(
    name="NoMemoryAgent",
    llm=provider,
    memory=None,
    system_instructions="You are a concise assistant."
)

3. Async Agent with Streaming Disabled

response = await agent.send_message("Tell me about AI")

4. Async Agent with Streaming Enabled

async for token in agent.stream_message("Explain quantum computing"):
    print(token, end='', flush=True)
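
5. Combining Memory and Streaming Across Turns

The configurations above can be combined in one agent. The sketch below uses only the classes and methods shown earlier; that LRUMemory carries context from the first turn into the second is an assumption about the library's behavior, not something documented here.

import asyncio
from grami.agent import AsyncAgent
from grami.providers.gemini_provider import GeminiProvider
from grami.memory.lru import LRUMemory

async def chat_demo():
    agent = AsyncAgent(
        name="ChatAgent",
        llm=GeminiProvider(api_key="YOUR_API_KEY"),
        memory=LRUMemory(capacity=100),
        system_instructions="You are a helpful assistant."
    )

    # First turn: plain request/response
    print(await agent.send_message("Summarize what an async agent is."))

    # Second turn: stream the follow-up, assuming memory supplies the prior context
    async for token in agent.stream_message("Now give a concrete example."):
        print(token, end='', flush=True)
    print()

asyncio.run(chat_demo())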

Roadmap and TODO

Core Framework

  • Async-first design
  • Multi-provider support
  • Dynamic agent creation
  • Flexible memory management

Memory and State Management

  • Pluggable memory providers
    • In-memory state storage
    • LRU Memory implementation
  • Async memory operations
  • Persistent memory storage (see the sketch after this list)
  • Advanced memory indexing
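
Persistent memory storage is still a roadmap item, but a custom provider will likely mirror the shape of LRUMemory. The sketch below is purely illustrative: the store/retrieve method names are assumptions, not the library's documented memory interface.

import json
from pathlib import Path

class JSONFileMemory:
    """Hypothetical persistent memory provider backed by a JSON file."""

    def __init__(self, path: str = "agent_memory.json"):
        self.path = Path(path)
        self.items = json.loads(self.path.read_text()) if self.path.exists() else {}

    async def store(self, key: str, value: str) -> None:
        # Assumed async interface; persist every write for durability
        self.items[key] = value
        self.path.write_text(json.dumps(self.items))

    async def retrieve(self, key: str, default=None):
        return self.items.get(key, default)

# Usage would presumably mirror LRUMemory: AsyncAgent(..., memory=JSONFileMemory())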

Provider Integrations

  • Gemini Provider
  • OpenAI Provider
  • Anthropic Provider
  • Ollama Provider
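
Because agents take the provider as a constructor argument, switching LLM backends should only change one line. The Gemini import path is the one documented above; the Ollama module path and constructor parameters below are assumptions and may differ in the actual package.

from grami.agent import AsyncAgent
from grami.providers.gemini_provider import GeminiProvider
# Hypothetical import path, assumed to mirror the Gemini provider module
# from grami.providers.ollama_provider import OllamaProvider

cloud_agent = AsyncAgent(
    name="CloudAgent",
    llm=GeminiProvider(api_key="YOUR_API_KEY"),
    system_instructions="You are a research assistant."
)

# local_agent = AsyncAgent(
#     name="LocalAgent",
#     llm=OllamaProvider(model="llama3"),  # parameters are an assumption
#     system_instructions="You are a research assistant."
# )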

Security and Performance

  • Enhanced encryption for API keys
  • Rate limiting mechanisms (see the sketch after this list)
  • Secure communication protocols
  • Performance optimization for large-scale deployments
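
Rate limiting is a roadmap item; until it lands in the framework, callers can throttle requests themselves. A minimal sketch using an asyncio semaphore around the documented send_message call (the cap of 5 concurrent requests is arbitrary):

import asyncio

# Cap concurrent LLM calls at an arbitrary limit of 5
semaphore = asyncio.Semaphore(5)

async def send_rate_limited(agent, prompt: str) -> str:
    async with semaphore:
        return await agent.send_message(prompt)

# Example: fan out many prompts without exceeding the cap
# results = await asyncio.gather(*(send_rate_limited(agent, p) for p in prompts))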

Contributing

Contributions are welcome! Please check our GitHub repository for guidelines.

Support


© 2024 YAFATEK. All Rights Reserved.

