
OmniCoreAgent is a Python framework for building autonomous AI agents that think, reason, and execute complex tasks. It produces production-ready agents that use tools, manage memory, coordinate workflows, and handle real-world business logic.


🚀 OmniCoreAgent

The AI Agent Framework Built for Production
Switch memory backends at runtime. Manage context automatically. Deploy with confidence.


Quick Start • See It In Action • 📚 Cookbook • Features • Docs


🎬 See It In Action

```python
import asyncio
from omnicoreagent import OmniCoreAgent, MemoryRouter, ToolRegistry

# Create tools in seconds
tools = ToolRegistry()

@tools.register_tool("get_weather")
def get_weather(city: str) -> dict:
    """Get current weather for a city."""
    return {"city": city, "temp": "22°C", "condition": "Sunny"}

# Build a production-ready agent
agent = OmniCoreAgent(
    name="assistant",
    system_instruction="You are a helpful assistant with access to weather data.",
    model_config={"provider": "openai", "model": "gpt-4o"},
    local_tools=tools,
    memory_router=MemoryRouter("redis"),  # Start with Redis
    agent_config={
        "context_management": {"enabled": True},  # Auto-manage long conversations
        "guardrail_config": {"strict_mode": True},  # Block prompt injections
    }
)

async def main():
    # Run the agent
    result = await agent.run("What's the weather in Tokyo?")
    print(result["response"])

    # Switch to MongoDB at runtime, no restart needed
    await agent.switch_memory_store("mongodb")

    # Keep running with a different backend
    result = await agent.run("How about Paris?")
    print(result["response"])

asyncio.run(main())
```

What just happened?

  • ✅ Registered a custom tool with type hints
  • ✅ Built an agent with memory persistence
  • ✅ Enabled automatic context management
  • ✅ Switched from Redis to MongoDB while running
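The runtime switch works because all memory access goes through a router that owns the active backend. The sketch below is not the library's internals, just an illustration of the pattern, with a hypothetical `MemoryRouterSketch` and a toy in-memory store:

```python
class InMemoryStore:
    """Toy backend: keeps per-session message lists in a dict."""
    def __init__(self):
        self._sessions = {}

    def save(self, session_id, message):
        self._sessions.setdefault(session_id, []).append(message)

    def load(self, session_id):
        return list(self._sessions.get(session_id, []))


class MemoryRouterSketch:
    """Delegates to the active backend; switching migrates live sessions."""
    def __init__(self, backend):
        self.backend = backend
        self._session_ids = set()

    def save(self, session_id, message):
        self._session_ids.add(session_id)
        self.backend.save(session_id, message)

    def load(self, session_id):
        return self.backend.load(session_id)

    def switch(self, new_backend):
        # Copy each session's history so the conversation survives the swap.
        for sid in self._session_ids:
            for msg in self.backend.load(sid):
                new_backend.save(sid, msg)
        self.backend = new_backend
```

Because callers only ever talk to the router, swapping the backend object is invisible to the running agent.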

⚡ Quick Start

```bash
pip install omnicoreagent
echo "LLM_API_KEY=your_api_key" > .env
```

```python
import asyncio
from omnicoreagent import OmniCoreAgent

agent = OmniCoreAgent(
    name="my_agent",
    system_instruction="You are a helpful assistant.",
    model_config={"provider": "openai", "model": "gpt-4o"}
)

async def main():
    result = await agent.run("Hello!")
    print(result["response"])

asyncio.run(main())
```

That's it. You have an AI agent with session management, memory, and error handling.

📚 Want to learn more? Check out the Cookbook: progressive examples from "Hello World" to production deployments.


🎯 What Makes OmniCoreAgent Different?

| Feature | What It Means For You |
| --- | --- |
| Runtime Backend Switching | Switch Redis ↔ MongoDB ↔ PostgreSQL without restarting |
| Cloud Workspace Storage | Agent files persist in AWS S3 or Cloudflare R2 ⚡ NEW |
| Context Engineering | Session memory + agent loop context + tool offloading = no token exhaustion |
| Tool Response Offloading | Large tool outputs saved to files, 98% token savings |
| Built-in Guardrails | Prompt injection protection out of the box |
| MCP Native | Connect to any MCP server (stdio, SSE, HTTP with OAuth) |
| Background Agents | Schedule autonomous tasks that run on intervals |
| Workflow Orchestration | Sequential, Parallel, and Router agents for complex tasks |
| Production Observability | Metrics, tracing, and event streaming built in |

🎯 Core Features

📖 Full documentation: docs-omnicoreagent.omnirexfloralabs.com/docs

| # | Feature | Description | Docs |
| --- | --- | --- | --- |
| 1 | OmniCoreAgent | The heart of the framework: a production agent with all features | Overview → |
| 2 | Multi-Tier Memory | 5 backends (Redis, MongoDB, PostgreSQL, SQLite, in-memory) with runtime switching | Memory → |
| 3 | Context Engineering | Dual-layer system: agent loop context management + tool response offloading | Context → |
| 4 | Event System | Real-time event streaming with runtime switching | Events → |
| 5 | MCP Client | Connect to any MCP server (stdio, streamable_http, SSE) with OAuth | MCP → |
| 6 | DeepAgent | Multi-agent orchestration with automatic task decomposition | DeepAgent → |
| 7 | Local Tools | Register any Python function as an AI tool via ToolRegistry | Local Tools → |
| 8 | Community Tools | 100+ pre-built tools (search, AI, comms, databases, DevOps, finance) | Community Tools → |
| 9 | Agent Skills | Polyglot packaged capabilities (Python, Bash, Node.js) | Skills → |
| 10 | Workspace Memory | Persistent file storage with S3/R2/local backends | Workspace → |
| 11 | Sub-Agents | Delegate tasks to specialized agents | Sub-Agents → |
| 12 | Background Agents | Schedule autonomous tasks on intervals | Background → |
| 13 | Workflows | Sequential, Parallel, and Router agent orchestration | Workflows → |
| 14 | BM25 Tool Retrieval | Auto-discover relevant tools from 1000+ using BM25 search | Advanced Tools → |
| 15 | Guardrails | Prompt injection protection with configurable sensitivity | Guardrails → |
| 16 | Observability | Per-request metrics + Opik distributed tracing | Observability → |
| 17 | Universal Models | 9 providers via LiteLLM (OpenAI, Anthropic, Gemini, Groq, Ollama, etc.) | Models → |
| 18 | OmniServe | Turn any agent into a production REST/SSE API with one command | OmniServe → |
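BM25 tool retrieval scores each tool's description against the user query and surfaces only the best matches, so the model never sees all 1000+ tool schemas at once. Here is a minimal, self-contained BM25 ranker; the framework's actual tokenization and scoring details may differ:

```python
import math
from collections import Counter


def bm25_rank(query: str, tool_docs: dict, k1: float = 1.5, b: float = 0.75) -> list:
    """Rank tools best-first by BM25 relevance of their descriptions.

    tool_docs maps tool name -> plain-text description.
    """
    corpus = {name: desc.lower().split() for name, desc in tool_docs.items()}
    n = len(corpus)
    avgdl = sum(len(doc) for doc in corpus.values()) / n

    # Document frequency: how many descriptions contain each term.
    df = Counter()
    for doc in corpus.values():
        df.update(set(doc))

    scores = {}
    for name, doc in corpus.items():
        tf = Counter(doc)
        score = 0.0
        for term in query.lower().split():
            if term not in tf:
                continue
            idf = math.log(1 + (n - df[term] + 0.5) / (df[term] + 0.5))
            score += idf * tf[term] * (k1 + 1) / (
                tf[term] + k1 * (1 - b + b * len(doc) / avgdl))
        scores[name] = score
    return sorted(scores, key=scores.get, reverse=True)
```

The top few ranked tools are then the only schemas injected into the prompt, keeping context small regardless of registry size.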

📚 Examples & Cookbook

All examples are in the Cookbook, organized by use case with progressive learning paths.

| Category | What You'll Build | Location |
| --- | --- | --- |
| Getting Started | Your first agent, tools, memory, events | cookbook/getting_started |
| Workflows | Sequential, Parallel, Router agents | cookbook/workflows |
| Background Agents | Scheduled autonomous tasks | cookbook/background_agents |
| Production | Metrics, guardrails, observability | cookbook/production |
| 🏆 Showcase | Full production applications | cookbook/showcase |

๐Ÿ† Showcase: Full Production Applications

Application Description Features
OmniAudit Healthcare Claims Audit System Multi-agent pipeline, ERISA compliance
DevOps Copilot AI-Powered DevOps Automation Docker, Prometheus, Grafana
Deep Code Agent Code Analysis with Sandbox Sandbox execution, session management

โš™๏ธ Configuration

Environment Variables

# Required
LLM_API_KEY=your_api_key

# Optional: Memory backends
REDIS_URL=redis://localhost:6379/0
DATABASE_URL=postgresql://user:pass@localhost:5432/db
MONGODB_URI=mongodb://localhost:27017/omnicoreagent

# Optional: Observability
OPIK_API_KEY=your_opik_key
OPIK_WORKSPACE=your_workspace

Agent Configuration

```python
agent_config = {
    "max_steps": 15,                    # Max reasoning steps
    "tool_call_timeout": 30,            # Tool timeout (seconds)
    "request_limit": 0,                 # 0 = unlimited
    "total_tokens_limit": 0,            # 0 = unlimited
    "memory_config": {"mode": "sliding_window", "value": 10000},
    "enable_advanced_tool_use": True,   # BM25 tool retrieval
    "enable_agent_skills": True,        # Specialized packaged skills
    "memory_tool_backend": "local"      # Persistent working memory
}
```
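The `sliding_window` memory mode can be read as: keep the newest messages whose combined size fits the configured `value` budget. A rough sketch that counts characters in place of tokens (the framework's actual accounting may differ):

```python
def sliding_window(messages: list, value: int = 10000) -> list:
    """Drop the oldest messages until the remaining history fits the budget.

    'value' mirrors memory_config['value']; here it is a character budget
    standing in for a token budget.
    """
    kept, used = [], 0
    for msg in reversed(messages):  # walk newest-first
        size = len(msg["content"])
        if used + size > value:
            break                   # everything older is dropped
        kept.append(msg)
        used += size
    return list(reversed(kept))     # restore chronological order
```

The effect is that long conversations stay bounded: the agent always sees the most recent context, and old turns fall out of the window instead of exhausting the model's context length.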

📖 Full configuration reference: Configuration Guide →


🧪 Testing & Development

```bash
# Clone
git clone https://github.com/omnirexflora-labs/omnicoreagent.git
cd omnicoreagent

# Setup
uv venv && source .venv/bin/activate
uv sync --dev

# Test
pytest tests/ -v
pytest tests/ --cov=src --cov-report=term-missing
```

๐Ÿ” Troubleshooting

Error Fix
Invalid API key Check .env: LLM_API_KEY=your_key
ModuleNotFoundError pip install omnicoreagent
Redis connection failed Start Redis or use MemoryRouter("in_memory")
MCP connection refused Ensure MCP server is running
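The Redis fallback can also be automated with a quick port probe at startup, so the agent still boots on a machine without Redis. This is an illustrative helper, not part of the framework:

```python
import socket


def pick_memory_backend(redis_host: str = "localhost",
                        redis_port: int = 6379,
                        timeout: float = 0.5) -> str:
    """Return "redis" if the Redis port answers, else "in_memory"."""
    try:
        # A plain TCP connect is enough to tell whether anything is listening.
        with socket.create_connection((redis_host, redis_port), timeout=timeout):
            return "redis"
    except OSError:  # refused, unreachable, or timed out
        return "in_memory"
```

The returned string could then feed straight into `MemoryRouter(pick_memory_backend())`.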

📖 More troubleshooting: Basic Usage Guide →


๐Ÿ“ Changelog

See the full Changelog โ†’ for version history.


๐Ÿค Contributing

# Fork & clone
git clone https://github.com/omnirexflora-labs/omnicoreagent.git

# Setup
uv venv && source .venv/bin/activate
uv sync --dev
pre-commit install

# Submit PR

See CONTRIBUTING.md for guidelines.


📄 License

MIT License; see LICENSE


๐Ÿ‘จโ€๐Ÿ’ป Author & Credits

Created by Abiola Adeshina

🌟 The OmniRexFlora Ecosystem

| Project | Description |
| --- | --- |
| 🧠 OmniMemory | Self-evolving memory for autonomous agents |
| 🤖 OmniCoreAgent | Production-ready AI agent framework (this project) |
| ⚡ OmniDaemon | Event-driven runtime engine for AI agents |

๐Ÿ™ Acknowledgments

Built on: LiteLLM, FastAPI, Redis, Opik, Pydantic, APScheduler


Building the future of production-ready AI agent frameworks

โญ Star us on GitHub โ€ข ๐Ÿ› Report Bug โ€ข ๐Ÿ’ก Request Feature โ€ข ๐Ÿ“– Documentation

Download files

Source distribution: omnicoreagent-0.3.8.tar.gz (268.0 kB), uploaded via twine/6.2.0 on CPython/3.12.3.

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 3ab072eccbce8f4969e4f31c7c8efbba15d3822fd66ef6f8241d2679db2b87ad |
| MD5 | 173cadc77fd3d6450ffec224d2988bb0 |
| BLAKE2b-256 | 46903b4f3075c4279c9de96a51ceccfd0a79247083f6ab172320fbbce37d766d |

Built distribution: omnicoreagent-0.3.8-py3-none-any.whl (390.6 kB, Python 3), uploaded via twine/6.2.0 on CPython/3.12.3.

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 691d566f8ca28ad47b0fef737f3502e322e2770b5f112f2c9c93ce72b6a3cec1 |
| MD5 | f060743f1080da250b2fc57f2bfa865c |
| BLAKE2b-256 | d0a2b77438d240b9ab9070b0c27a33af54c90066d29c7e3f4aa473db703feb17 |
