
Cadence AI - Multi-agent AI orchestration system with plugin management

Project description

Cadence 🤖 Multi-agent AI Framework

A plugin-based multi-agent conversational AI framework built on FastAPI, designed for building intelligent chatbot systems with extensible agent architectures.

🚀 Features

  • Multi-Agent Orchestration: Intelligent routing and coordination between AI agents
  • Plugin System: Extensible architecture for custom agents and tools
  • Multi-LLM Support: OpenAI, Anthropic, Google AI, and more
  • Flexible Storage: PostgreSQL, Redis, MongoDB, and in-memory backends
  • REST API: FastAPI-based API with automatic documentation
  • Streamlit UI: Built-in web interface for testing and management
  • Docker Support: Containerized deployment with Docker Compose

📦 Installation & Usage

🎯 For End Users (Quick Start)

Install the package:

pip install cadence-py

Verify installation:

# Check if cadence is available
python -m cadence --help

# Should show available commands and options

Run the application:

# Start the API server
python -m cadence start api

# Start with custom host/port
python -m cadence start api --host 0.0.0.0 --port 8000

# Start the Streamlit UI
python -m cadence start ui

# Start both API and UI
python -m cadence start all

Available commands:

# Show help
python -m cadence --help

# Show status
python -m cadence status

# Manage plugins
python -m cadence plugins

# Show configuration
python -m cadence config

# Health check
python -m cadence health

🛠️ For Developers (Build from Source)

If you want to contribute, develop plugins, or customize the framework:

Prerequisites

  • Python 3.13+
  • Poetry (for dependency management)
  • Docker (optional, for containerized deployment)

Development Setup

  1. Clone the repository

    git clone https://github.com/jonaskahn/cadence.git
    cd cadence
    
  2. Install dependencies

    poetry install
    poetry install --with local  # Include local SDK development
    
  3. Set up environment variables

    cp .env.example .env
    # Edit .env with your API keys and configuration
    
  4. Run the application

    poetry run python -m cadence start api
    

⚙️ Configuration

Environment Variables

All configuration is done through environment variables with the CADENCE_ prefix:

# LLM Provider Configuration
CADENCE_DEFAULT_LLM_PROVIDER=openai
CADENCE_OPENAI_API_KEY=your-openai-key
CADENCE_ANTHROPIC_API_KEY=your-claude-key
CADENCE_GOOGLE_API_KEY=your-gemini-key

# Storage Configuration
CADENCE_CONVERSATION_STORAGE_BACKEND=memory  # or postgresql
CADENCE_POSTGRES_URL=postgresql://user:pass@localhost/cadence

# Plugin Configuration
CADENCE_PLUGINS_DIR=["./plugins/src/cadence_plugins"]

# Server Configuration
CADENCE_API_HOST=0.0.0.0
CADENCE_API_PORT=8000
CADENCE_DEBUG=true

# Advanced Configuration
CADENCE_MAX_AGENT_HOPS=25
CADENCE_MAX_TOOL_HOPS=50
CADENCE_GRAPH_RECURSION_LIMIT=50

# Session Management
CADENCE_SESSION_TIMEOUT=3600
CADENCE_MAX_SESSION_HISTORY=100
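
Since every setting shares the `CADENCE_` prefix, it is easy to inspect what the framework will see at startup. The helper below is an illustrative sketch only (not part of the Cadence API): it collects all prefixed variables into a plain dict with normalized keys.

```python
import os


def collect_cadence_settings(environ=None):
    """Collect all CADENCE_-prefixed variables into a plain dict.

    Illustrative helper only -- not how Cadence itself loads settings.
    Keys are lowercased with the prefix stripped.
    """
    environ = os.environ if environ is None else environ
    prefix = "CADENCE_"
    return {
        key[len(prefix):].lower(): value
        for key, value in environ.items()
        if key.startswith(prefix)
    }


settings = collect_cadence_settings({
    "CADENCE_API_PORT": "8000",
    "CADENCE_DEBUG": "true",
    "UNRELATED": "ignored",
})
print(settings)  # {'api_port': '8000', 'debug': 'true'}
```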

Configuration File

You can also use a .env file for local development:

# .env
CADENCE_DEFAULT_LLM_PROVIDER=openai
CADENCE_OPENAI_API_KEY=your_actual_openai_api_key_here
CADENCE_ANTHROPIC_API_KEY=your_actual_claude_api_key_here
CADENCE_GOOGLE_API_KEY=your_actual_gemini_api_key_here

CADENCE_APP_NAME="Cadence 🤖 Multi-agents AI Framework"
CADENCE_DEBUG=false

CADENCE_PLUGINS_DIR=./plugins/src/cadence_example_plugins

CADENCE_API_HOST=0.0.0.0
CADENCE_API_PORT=8000

# For production, you might want to use PostgreSQL
CADENCE_CONVERSATION_STORAGE_BACKEND=postgresql
CADENCE_POSTGRES_URL=postgresql://user:pass@localhost/cadence

# For development, you can use the built-in UI
CADENCE_UI_HOST=0.0.0.0
CADENCE_UI_PORT=8501
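
For reference, a `.env` file like the one above is just `KEY=VALUE` lines. The sketch below is a minimal stdlib-only parser for that shape (real projects typically rely on a library such as python-dotenv instead); it ignores blank lines and full-line `#` comments and strips surrounding quotes.

```python
from pathlib import Path


def parse_dotenv(path):
    """Parse simple KEY=VALUE lines from a .env file into a dict.

    Minimal illustration only; it does not handle inline comments,
    multiline values, or variable interpolation.
    """
    values = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip().strip('"').strip("'")
    return values
```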

🚀 Usage

Command Line Interface

Cadence provides a comprehensive CLI for management tasks:

# Start the server
python -m cadence start api --host 0.0.0.0 --port 8000

# Show status
python -m cadence status

# Manage plugins
python -m cadence plugins

# Show configuration
python -m cadence config

# Health check
python -m cadence health

API Usage

The framework exposes a REST API for programmatic access:

import requests

# Send a message
response = requests.post("http://localhost:8000/api/v1/chat", json={
    "message": "Hello, how are you?",
    "user_id": "user123",
    "org_id": "org456"
})

print(response.json())
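
In practice you may want a small wrapper around that call that validates input and surfaces HTTP errors. The sketch below assumes the same `/api/v1/chat` endpoint and payload fields as the example above; the helper names (`build_chat_payload`, `send_chat`) are illustrative, not part of the Cadence API.

```python
API_URL = "http://localhost:8000/api/v1/chat"  # endpoint from the example above


def build_chat_payload(message, user_id, org_id):
    """Build the JSON body for the chat endpoint (fields from the example)."""
    if not message:
        raise ValueError("message must be non-empty")
    return {"message": message, "user_id": user_id, "org_id": org_id}


def send_chat(message, user_id, org_id, timeout=30):
    """POST a chat message and return the parsed JSON response."""
    import requests  # imported here so the payload helper works without requests

    response = requests.post(
        API_URL,
        json=build_chat_payload(message, user_id, org_id),
        timeout=timeout,
    )
    response.raise_for_status()  # raise on 4xx/5xx instead of returning error bodies
    return response.json()
```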

Plugin Development

Create custom agents and tools using the Cadence SDK:

from cadence_sdk.base.agent import Agent
from cadence_sdk.base.tools import Tool

class MyAgent(Agent):
    name = "my_agent"
    description = "A custom agent for specific tasks"
    
    def process(self, message: str) -> str:
        return f"Processed: {message}"

class MyTool(Tool):
    name = "my_tool"
    description = "A custom tool for specific operations"
    
    def execute(self, **kwargs) -> str:
        return "Tool executed successfully"
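
To give a feel for how agents like these get used, here is a toy, plain-Python sketch of name-based agent registration and routing. The `Registry` and `EchoAgent` names are hypothetical illustrations; the real Cadence plugin system wires agents up differently through the SDK.

```python
class Registry:
    """Toy registry illustrating name-based agent lookup and routing.

    A sketch only -- not the Cadence orchestrator API.
    """

    def __init__(self):
        self._agents = {}

    def register(self, agent):
        # Agents expose a `name` attribute, as in the SDK example above.
        self._agents[agent.name] = agent

    def route(self, agent_name, message):
        agent = self._agents.get(agent_name)
        if agent is None:
            raise KeyError(f"no agent registered under {agent_name!r}")
        return agent.process(message)


class EchoAgent:
    name = "echo"

    def process(self, message):
        return f"Processed: {message}"


registry = Registry()
registry.register(EchoAgent())
print(registry.route("echo", "hello"))  # Processed: hello
```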

🐳 Docker Deployment

Quick Start with Docker Compose

# Start all services
docker-compose -f docker/compose.yaml up -d

# View logs
docker-compose -f docker/compose.yaml logs -f

# Stop services
docker-compose -f docker/compose.yaml down

Custom Docker Build

# Build the image
./build.sh

# Run the container
docker run -p 8000:8000 ifelsedotone/cadence:latest

🧪 Testing

Run the test suite to ensure everything works correctly:

# Install test dependencies
poetry install --with dev

# Run tests
poetry run pytest

# Run with coverage
poetry run pytest --cov=src/cadence

# Run specific test categories
poetry run pytest -m "unit"
poetry run pytest -m "integration"


🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

Development Setup

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests
  5. Submit a pull request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.



Made with ❤️ by the Cadence AI Team

Project details


Download files

Download the file for your platform.

Source Distribution

cadence_py-1.0.1.tar.gz (72.2 kB)

Uploaded Source

Built Distribution


cadence_py-1.0.1-py3-none-any.whl (102.9 kB)

Uploaded Python 3

File details

Details for the file cadence_py-1.0.1.tar.gz.

File metadata

  • Download URL: cadence_py-1.0.1.tar.gz
  • Upload date:
  • Size: 72.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for cadence_py-1.0.1.tar.gz

  • SHA256: 61be05af8fd11abbd831f7d9e9967cd0c43807a90492c26106f09756740bb9dd
  • MD5: a7839fe0cb142477b40cddea4912c8cb
  • BLAKE2b-256: 97c0388a1484e3aa81cd31429922cc4a97c3e3684f7c13f8fef653259ba98d0a


File details

Details for the file cadence_py-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: cadence_py-1.0.1-py3-none-any.whl
  • Upload date:
  • Size: 102.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for cadence_py-1.0.1-py3-none-any.whl

  • SHA256: e2376e890f1c58ea5c5858e941721456f5063aff03917b1700211c6871170a3f
  • MD5: 7fd6c49bdc74f39220158be4f99e5820
  • BLAKE2b-256: 37115361170a63a9daa4f34a84863ce058626193135f9b5ab5b559edc8fff522

