
Ambivo Agents v2.0.0

A lightweight AI agent framework for research synthesis with quality-gated multi-source knowledge gathering.

What's Different

v2.0.0 is a focused rewrite. We removed 7 commodity agents and their heavy dependencies (Docker, Redis, Playwright) to focus on what's actually unique: quality-gated knowledge synthesis — iteratively consulting multiple sources, assessing response quality, and refining until a threshold is met.

| Metric | v1.x | v2.0.0 |
|---|---|---|
| Agents | 14 | 7 |
| Core deps | 12 (redis, docker, lz4, ...) | 9 (httpx, bs4, openai, ...) |
| Requires Redis | Yes | No (in-memory default) |
| Requires Docker | Yes (5 agents) | No |
| Requires Playwright | Yes (400 MB browsers) | No (API-based scraping) |

Quick Start

```bash
pip install ambivo-agents
```

```python
from ambivo_agents import AssistantAgent

agent = AssistantAgent.create_simple(user_id="demo")
response = await agent.chat("What is quantum computing?")  # run inside an async context
print(response)
await agent.cleanup_session()
```

No Redis. No Docker. No config files. Just works.

Available Agents

| Agent | Purpose |
|---|---|
| AssistantAgent | General conversation and explanations |
| ModeratorAgent | Intelligent query routing to specialized agents |
| KnowledgeSynthesisAgent | Multi-source research with quality assessment loops |
| WebSearchAgent | Web search via Brave/AVES APIs |
| WebScraperAgent | Content extraction via Jina Reader/Firecrawl/requests+bs4 |
| KnowledgeBaseAgent | Document ingestion and semantic search (Qdrant + LlamaIndex) |
| GatherAgent | Conversational form filling with conditional logic |

Core Differentiator: Quality-Gated Synthesis

```python
from ambivo_agents import KnowledgeSynthesisAgent

agent = KnowledgeSynthesisAgent.create_simple(user_id="researcher")
# Automatically: searches web -> scrapes pages -> queries KB -> assesses quality -> refines
response = await agent.chat("What are the latest advances in quantum error correction?")
```

The synthesis pipeline:

  1. Analyzes query to select search strategy
  2. Searches web (Brave API) + scrapes relevant pages (Jina Reader API)
  3. Queries knowledge base if available
  4. Assesses response quality (POOR/FAIR/GOOD/EXCELLENT)
  5. If below threshold, gathers more sources and refines
  6. Returns synthesized answer with confidence assessment
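The steps above amount to a quality-gated loop. The sketch below is purely illustrative, not the framework's actual internals: `assess_quality`, `synthesize`, and `fetch_source` are invented stand-ins that show how sources accumulate until a grade clears the threshold.

```python
# Illustrative quality-gate loop: gather one more source per round,
# grade the draft answer, stop once the grade meets the threshold.
QUALITY_LEVELS = ["POOR", "FAIR", "GOOD", "EXCELLENT"]

def assess_quality(draft: str, sources: list[str]) -> str:
    """Toy stand-in for an LLM-based grader: more sources -> higher grade."""
    return QUALITY_LEVELS[min(len(sources), 3)]

def synthesize(query: str, fetch_source, threshold: str = "GOOD",
               max_rounds: int = 5) -> tuple[str, str]:
    sources: list[str] = []
    draft, grade = "", "POOR"
    for _ in range(max_rounds):
        # Each round could be a web search, a page scrape, or a KB query.
        sources.append(fetch_source(query, round=len(sources)))
        draft = " ".join(sources)  # naive "synthesis" for illustration
        grade = assess_quality(draft, sources)
        if QUALITY_LEVELS.index(grade) >= QUALITY_LEVELS.index(threshold):
            break
    return draft, grade

answer, confidence = synthesize(
    "quantum error correction",
    fetch_source=lambda q, round: f"source-{round} on {q}",
)
print(confidence)  # the toy grader reaches "GOOD" after two sources
```

The returned grade doubles as the confidence assessment mentioned in step 6.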

Configuration

Minimal (environment variables only)

```bash
# Required: at least one LLM provider
export AMBIVO_AGENTS_OPENAI_API_KEY="sk-..."
# or
export AMBIVO_AGENTS_ANTHROPIC_API_KEY="sk-ant-..."

# Optional: web search
export AMBIVO_AGENTS_BRAVE_API_KEY="..."

# Optional: scraping APIs (the free Jina Reader tier works without a key)
export AMBIVO_AGENTS_JINA_API_KEY="..."
export AMBIVO_AGENTS_FIRECRAWL_API_KEY="..."
```
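The "at least one LLM provider" rule can be illustrated with a small startup check; `pick_llm_provider` and its error message are invented for this sketch and are not part of the framework's API:

```python
import os

def pick_llm_provider() -> str:
    """Illustrative check mirroring the rule above: at least one
    LLM provider key must be set before any agent can run."""
    if os.environ.get("AMBIVO_AGENTS_OPENAI_API_KEY"):
        return "openai"
    if os.environ.get("AMBIVO_AGENTS_ANTHROPIC_API_KEY"):
        return "anthropic"
    raise RuntimeError(
        "Set AMBIVO_AGENTS_OPENAI_API_KEY or AMBIVO_AGENTS_ANTHROPIC_API_KEY"
    )
```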

YAML config (agent_config.yaml)

```yaml
llm:
  preferred_provider: "anthropic"
  anthropic_api_key: "sk-ant-..."

agent_capabilities:
  enable_web_search: true
  enable_web_scraping: true
  enable_knowledge_base: false  # requires Qdrant

web_search:
  brave_api_key: "..."

web_scraping:
  scraping:
    provider: "jina"  # jina | firecrawl | requests
    jina_api_key: null  # optional, for higher rate limits
```

Memory System

Default: in-memory (zero infrastructure). For distributed deployments:

```bash
pip install ambivo-agents[redis]
export AMBIVO_AGENTS_REDIS_HOST="localhost"
```

The framework uses Redis automatically when it is available and falls back to in-memory storage otherwise.
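The "Redis if available, else in-memory" decision can be sketched with a detection pattern like the one below; `make_memory_backend` and its return labels are hypothetical, used only to show the fallback logic, not the framework's actual code:

```python
import importlib.util
import os

def make_memory_backend() -> str:
    """Hypothetical selector mirroring the documented behavior:
    use Redis when the client library is installed and a host is
    configured, otherwise fall back to in-memory storage."""
    redis_installed = importlib.util.find_spec("redis") is not None
    redis_host = os.environ.get("AMBIVO_AGENTS_REDIS_HOST")
    if redis_installed and redis_host:
        return f"redis://{redis_host}"
    return "in-memory"

print(make_memory_backend())  # "in-memory" unless redis and a host are configured
```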

Optional Extras

```bash
pip install ambivo-agents[redis]       # Distributed memory (Redis + lz4)
pip install ambivo-agents[aws]         # AWS Bedrock LLM support
pip install ambivo-agents[knowledge]   # Knowledge base (Qdrant + LlamaIndex, Python 3.11-3.12)
pip install ambivo-agents[documents]   # Document processing (PDF, DOCX, PPTX)
pip install ambivo-agents[async]       # Async utilities (aiohttp, aiofiles)
pip install ambivo-agents[full]        # All runtime extras
pip install ambivo-agents[all-ml]      # Everything including knowledge base
```

Agent Creation Patterns

```python
from ambivo_agents import AssistantAgent, ModeratorAgent

# Simple (recommended)
agent = AssistantAgent.create_simple(user_id="user123")
response = await agent.chat("Hello!")
await agent.cleanup_session()

# With explicit context
agent, context = AssistantAgent.create(user_id="user123")
print(f"Session: {context.session_id}")

# ModeratorAgent auto-routes to the right agent
moderator = ModeratorAgent.create_simple(user_id="user123")
response = await moderator.chat("Search for AI news")  # -> WebSearchAgent
response = await moderator.chat("Scrape https://example.com")  # -> WebScraperAgent
```

Web Scraping (API-Based)

No browsers, no Docker. Uses HTTP APIs:

| Provider | Cost | JS Support | Default |
|---|---|---|---|
| Jina Reader | Free tier | Yes | Yes |
| Firecrawl | Paid | Yes | No |
| requests+bs4 | Free | No | Fallback |

```python
from ambivo_agents import WebScraperAgent

scraper = WebScraperAgent.create_simple(user_id="demo")
response = await scraper.chat("scrape https://example.com")
```

Migration from v1.x

Removed agents (use alternatives)

| v1.x Agent | Alternative |
|---|---|
| CodeExecutorAgent | OpenAI code_interpreter, E2B |
| MediaEditorAgent | Direct FFmpeg, external service |
| YouTubeDownloadAgent | yt-dlp directly |
| DatabaseAgent | Direct DB drivers |
| AnalyticsAgent | pandas/DuckDB directly |
| APIAgent | LLM tool calling |
| WorkflowDeveloperAgent | Direct workflow API |

Breaking changes

  • redis and docker removed from core dependencies
  • create_redis_memory_manager() still works but create_memory_manager() is preferred
  • InMemoryMemoryManager is the new default when Redis is unavailable
  • WebScraperAgent uses HTTP APIs instead of Playwright/Docker
  • Many environment variables renamed/removed (see Configuration section)

Development

```bash
# Install dev dependencies
pip install -e ".[dev]"

# Run tests
pytest tests/ -v

# Format code
black ambivo_agents/ --line-length=100
isort ambivo_agents/ --profile black --line-length=100
```

License

MIT License - See LICENSE file for details.
