
DAIE - Decentralized AI Ecosystem

A lightweight Python library for creating and managing AI agents with tools, featuring decentralized communication and memory management.

Features

🚀 Core Features

  • Lightweight Design: Minimal dependencies, optimized for speed and resource efficiency
  • Agent Management: Create, configure, and manage AI agents with unique identities
  • Tool System: Define and register reusable tools for agents to execute
  • Decentralized Communication: Agents communicate via NATS JetStream
  • Memory Management: Agent-specific memories with persistence support
  • LLM Integration: Centralized LLM management with Ollama integration (default: llama3)
  • CLI Interface: Command-line tools for system management

🤖 Agent Features

Each agent has:

  • Unique Identity: ID, name, role, goal, backstory, and system prompt
  • Local Tool Execution: Agents execute tools locally within their own context
  • Chat History: Individual memory stores with working, semantic, and episodic memory
  • Vector Database: Each agent has its own vector database for semantic search (in development)
  • LangGraph Workflow: Each agent has its own LangGraph workflow (in development)
  • LLM from Core: Agents fetch LLM instances from the centralized LLM manager
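The three memory tiers listed above can be pictured with a minimal sketch. This is illustrative only and not the daie API: it assumes a bounded working memory of recent turns, a semantic store of facts keyed by topic, and an append-only episodic log.

```python
from collections import deque

# Illustrative sketch only -- not the daie API. Shows the three memory
# tiers described above: working (short-lived, bounded), semantic
# (facts keyed by topic), and episodic (an append-only event log).
class AgentMemory:
    def __init__(self, working_size: int = 10):
        self.working = deque(maxlen=working_size)  # recent turns, oldest evicted
        self.semantic: dict[str, str] = {}         # topic -> fact
        self.episodic: list[dict] = []             # ordered event records

    def remember_turn(self, text: str) -> None:
        self.working.append(text)

    def store_fact(self, topic: str, fact: str) -> None:
        self.semantic[topic] = fact

    def log_event(self, event: dict) -> None:
        self.episodic.append(event)

memory = AgentMemory(working_size=2)
memory.remember_turn("user: hello")
memory.remember_turn("agent: hi there")
memory.remember_turn("user: what is DAIE?")  # evicts the oldest turn
memory.store_fact("daie", "A decentralized AI ecosystem library")
memory.log_event({"type": "tool_call", "tool": "greeting"})
```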

Installation

Prerequisites

  • Python 3.10+
  • Ollama (for LLM functionality)
  • NATS JetStream (for communication)

Install the Library

pip install daie

Install Ollama

  1. Download and install Ollama from ollama.com
  2. Pull the default model:
    ollama pull llama3
    

Quick Start

Example: Creating a Simple Agent

#!/usr/bin/env python3
import asyncio
import logging
from daie import Agent, AgentConfig, Tool, ToolRegistry
from daie.agents import AgentRole
from daie.tools import tool

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)


async def main():
    logger.info("=== DAIE - Decentralized AI Ecosystem Example ===")
    
    # Create a tool
    @tool(
        name="greeting",
        description="Generate a greeting message",
        category="general",
        version="1.0.0"
    )
    async def greeting_tool(name: str, language: str = "en") -> str:
        greetings = {
            "en": f"Hello, {name}! Welcome to DAIE!",
            "es": f"Hola, {name}! ¡Bienvenido a DAIE!",
            "fr": f"Bonjour, {name}! Bienvenue dans DAIE!",
            "de": f"Hallo, {name}! Willkommen bei DAIE!"
        }
        return greetings.get(language.lower(), greetings["en"])
    
    # Create agent configuration with new features
    config = AgentConfig(
        name="ResearchAgent",
        role=AgentRole.SPECIALIZED,
        goal="Research information on given topics",
        backstory="Created to assist with research and information gathering",
        system_prompt="You are a research assistant that helps users find and analyze information.",
        capabilities=["greeting"]
    )
    
    # Create agent
    agent = Agent(config=config)
    agent.add_tool(greeting_tool)
    
    # Test tool execution
    result = await greeting_tool.execute({"name": "Alice", "language": "es"})
    logger.info(f"✅ Tool executed successfully: {result}")
    
    logger.info("\n🎉 Example completed successfully!")


if __name__ == "__main__":
    try:
        asyncio.run(main())
    except Exception as e:
        logger.error(f"❌ Error: {e}")
        raise SystemExit(1)

CLI Usage

Agent Management

# List all agents
daie agent list

# Create a new agent
daie agent create --name "MyAgent" --role "general-purpose" --goal "Help users with questions"

# Start an agent
daie agent start <agent-id>

# Stop an agent
daie agent stop <agent-id>

# Get agent status
daie agent status <agent-id>

# Delete an agent
daie agent delete <agent-id>

Core System Management

# Initialize the system
daie core init

# Start the central core system
daie core start

# Stop the central core system
daie core stop

# Restart the central core system
daie core restart

# Get system status
daie core status

# View system logs
daie core logs

# Check system health
daie core health

LLM Configuration

Setting LLM Parameters

from daie import set_llm, get_llm_config, LLMType

# Using Ollama (default)
set_llm(ollama_llm="llama3")
set_llm(ollama_llm="mistral", temperature=0.3, max_tokens=1500)

# Using OpenAI
set_llm(
    llm_type=LLMType.OPENAI,
    model_name="gpt-3.5-turbo",
    api_key="your-api-key",
    temperature=0.5,
    max_tokens=2000
)

# Get current configuration
config = get_llm_config()
print(f"Current LLM: {config.llm_type.value}/{config.model_name}")
print(f"Temperature: {config.temperature}")
print(f"Max tokens: {config.max_tokens}")

Available LLM Models

Ollama Models:

  • llama3 (default)
  • llama3.2:latest
  • mistral
  • llama2
  • gemma

OpenAI Models:

  • gpt-4o
  • gpt-4o-mini
  • gpt-4-turbo
  • gpt-3.5-turbo

Configuration

Environment Variables

# System configuration
DAIE_LOG_LEVEL=INFO
DAIE_NATS_URL=nats://localhost:4222
DAIE_CENTRAL_CORE_URL=http://localhost:8000

# LLM configuration
DAIE_DEFAULT_LLM_MODEL=llama3
DAIE_LLM_TEMPERATURE=0.7
DAIE_LLM_MAX_TOKENS=1000

# Database configuration
DAIE_DATABASE_URL=sqlite:///:memory:
DAIE_REDIS_URL=redis://localhost:6379/0
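The variables above can be read with the standard library; the sketch below is a hypothetical helper (the library's own configuration loading may differ) that falls back to the documented defaults and casts the numeric values.

```python
import os

# Hypothetical helper -- the library's own config loading may differ.
# Reads the DAIE_* environment variables listed above, falling back to
# the documented defaults and casting numeric values.
def load_daie_config() -> dict:
    return {
        "log_level": os.getenv("DAIE_LOG_LEVEL", "INFO"),
        "nats_url": os.getenv("DAIE_NATS_URL", "nats://localhost:4222"),
        "central_core_url": os.getenv("DAIE_CENTRAL_CORE_URL", "http://localhost:8000"),
        "llm_model": os.getenv("DAIE_DEFAULT_LLM_MODEL", "llama3"),
        "llm_temperature": float(os.getenv("DAIE_LLM_TEMPERATURE", "0.7")),
        "llm_max_tokens": int(os.getenv("DAIE_LLM_MAX_TOKENS", "1000")),
    }

config = load_daie_config()
```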

Architecture

System Components

  1. Agent: Individual AI entity with specific capabilities
  2. Tool: Reusable functionality that agents can execute
  3. LLM Manager: Handles LLM integration with various providers
  4. Communication Manager: Facilitates agent communication via NATS
  5. Memory Manager: Manages agent memory storage and retrieval
  6. Tool Registry: Central repository for available tools
  7. Central Core System: Orchestrator for the entire ecosystem
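To make the Tool Registry component concrete, here is a deliberately simplified sketch. It is not the actual `daie.ToolRegistry` implementation, only an illustration of the idea: a central mapping from tool names to callables, with registration, lookup, and listing.

```python
from typing import Callable

# Simplified sketch of a tool registry -- illustrative only, not the
# actual daie.ToolRegistry implementation. A central mapping from tool
# names to callables, with registration, lookup, and listing.
class SimpleToolRegistry:
    def __init__(self):
        self._tools: dict[str, Callable] = {}

    def register(self, name: str, func: Callable) -> None:
        if name in self._tools:
            raise ValueError(f"tool already registered: {name}")
        self._tools[name] = func

    def get(self, name: str) -> Callable:
        return self._tools[name]

    def list_tools(self) -> list[str]:
        return sorted(self._tools)

registry = SimpleToolRegistry()
registry.register("greeting", lambda name: f"Hello, {name}!")
print(registry.list_tools())              # ['greeting']
print(registry.get("greeting")("Alice"))  # Hello, Alice!
```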

Communication Protocol

Agents communicate using NATS JetStream with the following message types:

  • Text Messages: Direct communication between agents
  • Tasks: Requests for tool execution
  • Responses: Results from task execution
  • Events: System and agent events
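The four message types above can be imagined as sharing a common envelope that is serialized before being published on a NATS subject. The field names below are assumptions for illustration, not the wire format daie actually uses.

```python
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

# Hypothetical message envelope -- field names are assumptions for
# illustration, not the wire format daie actually uses. Each of the
# four message types (text, task, response, event) shares this shape.
@dataclass
class AgentMessage:
    msg_type: str          # "text" | "task" | "response" | "event"
    sender: str            # sending agent's ID
    recipient: str         # receiving agent's ID (or a broadcast subject)
    payload: dict = field(default_factory=dict)
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def to_bytes(self) -> bytes:
        """Serialize for publishing on a NATS subject."""
        return json.dumps(asdict(self)).encode("utf-8")

msg = AgentMessage(
    msg_type="task",
    sender="agent-1",
    recipient="agent-2",
    payload={"tool": "greeting", "args": {"name": "Alice"}},
)
decoded = json.loads(msg.to_bytes())
```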

Development

Prerequisites

  • Python 3.10+
  • Docker (for running dependencies)
  • Poetry (for package management)

Setup

# Clone the repository
git clone https://github.com/decentralized-ai/decentralized-ai-ecosystem.git
cd decentralized-ai-ecosystem

# Install dependencies
poetry install

# Run tests
poetry run pytest tests/

# Run the CLI
poetry run daie --help

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

For questions or support, please contact Kanishk Kumar Singh at kanishkkumar2004@gmail.com.
