Cadence AI - Multi-agent AI orchestration system with plugin management

Cadence 🤖 Multi-Agent AI Framework

A plugin-based multi-agent conversational AI framework built on FastAPI, designed for building intelligent chatbot systems with extensible agent architectures.

Cadence Demo

🚀 Features

  • Multi-Agent Orchestration: Intelligent routing and coordination between AI agents
  • Plugin System: Extensible architecture for custom agents and tools
  • Multi-LLM Support: OpenAI, Anthropic, Google AI, and more
  • Flexible Storage: PostgreSQL, Redis, MongoDB, and in-memory backends
  • REST API: FastAPI-based API with automatic documentation
  • Streamlit UI: Built-in web interface for testing and management
  • Docker Support: Containerized deployment with Docker Compose

📦 Installation & Usage

🎯 For End Users (Quick Start)

Install the package:

pip install cadence-py

Verify installation:

# Check if cadence is available
python -m cadence --help

# Should show available commands and options

Run the application:

# Start the API server
python -m cadence start api

# Start with custom host/port
python -m cadence start api --host 0.0.0.0 --port 8000

# Start the Streamlit UI
python -m cadence start ui

# Start both API and UI
python -m cadence start all

Available commands:

# Show help
python -m cadence --help

# Show status
python -m cadence status

# Manage plugins
python -m cadence plugins

# Show configuration
python -m cadence config

# Health check
python -m cadence health

🛠️ For Developers (Build from Source)

If you want to contribute, develop plugins, or customize the framework:

Prerequisites

  • Python 3.13+
  • Poetry (for dependency management)
  • Docker (optional, for containerized deployment)

Development Setup

  1. Clone the repository

    git clone https://github.com/jonaskahn/cadence.git
    cd cadence
    
  2. Install dependencies

    poetry install
    poetry install --with local  # Include local SDK development
    
  3. Set up environment variables

    cp .env.example .env
    # Edit .env with your API keys and configuration
    
  4. Run the application

    poetry run python -m cadence start api
    

⚙️ Configuration

Environment Variables

All configuration is done through environment variables with the CADENCE_ prefix:

# LLM Provider Configuration
CADENCE_DEFAULT_LLM_PROVIDER=openai
CADENCE_OPENAI_API_KEY=your-openai-key
CADENCE_ANTHROPIC_API_KEY=your-claude-key
CADENCE_GOOGLE_API_KEY=your-gemini-key

# Storage Configuration
CADENCE_CONVERSATION_STORAGE_BACKEND=memory  # or postgresql
CADENCE_POSTGRES_URL=postgresql://user:pass@localhost/cadence

# Plugin Configuration
CADENCE_PLUGINS_DIR=["./plugins/src/cadence_plugins"]

# Server Configuration
CADENCE_API_HOST=0.0.0.0
CADENCE_API_PORT=8000
CADENCE_DEBUG=true

# Advanced Configuration
CADENCE_MAX_AGENT_HOPS=25
CADENCE_GRAPH_RECURSION_LIMIT=50

# Session Management
CADENCE_SESSION_TIMEOUT=3600
CADENCE_MAX_SESSION_HISTORY=100
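
As a rough sketch only (not Cadence's actual settings loader), prefixed variables like the ones above can be collected from the environment in a few lines:

```python
import os

def load_cadence_config(environ=os.environ):
    """Collect every CADENCE_-prefixed variable into a plain dict.

    Illustrative only: the framework itself presumably performs typed
    parsing and validation on top of raw string values like these.
    """
    prefix = "CADENCE_"
    return {
        key[len(prefix):].lower(): value
        for key, value in environ.items()
        if key.startswith(prefix)
    }

# Example with a fake environment; unrelated variables are ignored:
settings = load_cadence_config({"CADENCE_API_PORT": "8000", "PATH": "/usr/bin"})
# settings == {"api_port": "8000"}
```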

Configuration File

You can also use a .env file for local development:

# .env
CADENCE_DEFAULT_LLM_PROVIDER=openai
CADENCE_OPENAI_API_KEY=your_actual_openai_api_key_here
CADENCE_ANTHROPIC_API_KEY=your_actual_claude_api_key_here
CADENCE_GOOGLE_API_KEY=your_actual_gemini_api_key_here

CADENCE_APP_NAME="Cadence 🤖 Multi-agents AI Framework"
CADENCE_DEBUG=false

CADENCE_PLUGINS_DIR=./plugins/src/cadence_example_plugins

CADENCE_API_HOST=0.0.0.0
CADENCE_API_PORT=8000

# For production, you might want to use PostgreSQL
CADENCE_CONVERSATION_STORAGE_BACKEND=postgresql
CADENCE_POSTGRES_URL=postgresql://user:pass@localhost/cadence

# For development, you can use the built-in UI
CADENCE_UI_HOST=0.0.0.0
CADENCE_UI_PORT=8501

🚀 Usage

Command Line Interface

Cadence provides a comprehensive CLI for management tasks:

# Start the server
python -m cadence start api --host 0.0.0.0 --port 8000

# Show status
python -m cadence status

# Manage plugins
python -m cadence plugins

# Show configuration
python -m cadence config

# Health check
python -m cadence health

API Usage

The framework exposes a REST API for programmatic access:

import requests

# Send a message
response = requests.post("http://localhost:8000/api/v1/chat", json={
    "message": "Hello, how are you?",
    "user_id": "user123",
    "org_id": "org456"
})

print(response.json())
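
For repeated calls it can help to wrap the request in a small helper with a timeout and HTTP error checking. The endpoint path and JSON fields come from the example above; the helper functions themselves are hypothetical, not part of the Cadence API:

```python
import requests  # requires a running Cadence API server to actually send

API_URL = "http://localhost:8000/api/v1/chat"  # default host/port

def build_chat_payload(message: str, user_id: str, org_id: str) -> dict:
    """Assemble the JSON body the chat endpoint expects."""
    return {"message": message, "user_id": user_id, "org_id": org_id}

def send_message(message: str, user_id: str = "user123",
                 org_id: str = "org456", timeout: float = 30.0) -> dict:
    """POST a chat message and return the decoded JSON response.

    Raises requests.HTTPError on non-2xx responses instead of
    silently returning an error body.
    """
    response = requests.post(
        API_URL,
        json=build_chat_payload(message, user_id, org_id),
        timeout=timeout,
    )
    response.raise_for_status()
    return response.json()
```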

Plugin Development

Create custom agents and tools using the Cadence SDK:

from cadence_sdk import BaseAgent, BasePlugin, PluginMetadata, tool

class MyPlugin(BasePlugin):
    @staticmethod
    def get_metadata() -> PluginMetadata:
        return PluginMetadata(
            name="my_agent",
            version="1.0.0",
            description="My custom AI agent",
            capabilities=["custom_task"],
            agent_type="specialized",
            dependencies=["cadence_sdk>=1.0.2,<2.0.0"],
        )

    @staticmethod
    def create_agent() -> BaseAgent:
        return MyAgent(MyPlugin.get_metadata())

class MyAgent(BaseAgent):
    def __init__(self, metadata: PluginMetadata):
        super().__init__(metadata)

    def get_tools(self):
        from .tools import my_custom_tool
        return [my_custom_tool]

    def get_system_prompt(self) -> str:
        return "You are a helpful AI assistant."

# In tools.py (imported by MyAgent.get_tools() above)
@tool
def my_custom_tool(input_data: str) -> str:
    """A custom tool for specific operations."""
    return f"Processed: {input_data}"

🐳 Docker Deployment

Quick Start with Docker Compose

# Start all services
docker-compose -f docker/compose.yaml up -d

# View logs
docker-compose -f docker/compose.yaml logs -f

# Stop services
docker-compose -f docker/compose.yaml down

Custom Docker Build

# Build the image
./build.sh

# Run the container
docker run -p 8000:8000 ifelsedotone/cadence:latest

🧪 Testing

Run the test suite to ensure everything works correctly:

# Install test dependencies
poetry install --with dev

# Run tests
poetry run pytest

# Run with coverage
poetry run pytest --cov=src/cadence

# Run specific test categories
poetry run pytest -m "unit"
poetry run pytest -m "integration"

🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

Development Setup

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests
  5. Submit a pull request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


Made with ❤️ by the Cadence AI Team

Download files

Download the file for your platform.

Source Distribution

cadence_py-1.0.3.tar.gz (74.6 kB)

Uploaded Source

Built Distribution

cadence_py-1.0.3-py3-none-any.whl (105.2 kB)

Uploaded Python 3

File details

Details for the file cadence_py-1.0.3.tar.gz.

File metadata

  • Download URL: cadence_py-1.0.3.tar.gz
  • Size: 74.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for cadence_py-1.0.3.tar.gz:

  • SHA256: 1921196d9a638f5b88ccb14a4442281c0d88eb40b9a88904be14d63494255c61
  • MD5: ad0c24a2fc0fd918305b7814ec7bae26
  • BLAKE2b-256: 4af5947ae8cead8a8a8fa0740da9f4bc23429b2b2a426c6d32f8c47f4a04c3ee

File details

Details for the file cadence_py-1.0.3-py3-none-any.whl.

File metadata

  • Download URL: cadence_py-1.0.3-py3-none-any.whl
  • Size: 105.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for cadence_py-1.0.3-py3-none-any.whl:

  • SHA256: 9ec67fb680d049a0af100c816201af92d967086ded6128413dbafb7cad8eb6d7
  • MD5: 85814c173a704b6facd95c7061d02604
  • BLAKE2b-256: 35299c298a0146cb24a2058b01ac343cef640157a3ccffbd0b3a60ef155fdd0d
