AgentFoundry: A modular autonomous AI agent framework
AIgent
AIgent is a modular, extensible AI framework designed to support the construction and orchestration of autonomous agents across a variety of complex tasks. The system is built in Python and leverages modern AI tooling to integrate large language models (LLMs), vector stores, rule-based decision logic, and dynamic tool discovery in secure and performance-conscious environments.
Features
- Modular agent architecture with support for specialization (e.g., memory agents, reactive agents, compliance agents)
- Cython-compiled backend for performance and IP protection
- Integration with popular frameworks such as LangChain, ChromaDB, and OpenAI
- Support for licensed or embedded deployments via license file verification or compiled-only distribution
- Configurable with runtime enforcement of execution licenses (RSA-signed, machine-bound)
Use Cases
AIgent is designed to serve as a core intelligence engine for:
- Secure enterprise AI platforms (e.g., QuantumDrive)
- Compliance monitoring and rule-based alerting systems
- Conversational interfaces with dynamic tool execution
- Embedded agents in SaaS and on-premise environments
Requirements
- Python 3.11+
- Cython
- Compatible dependencies (see `requirements.txt`)
Configuration
AgentFoundry supports two configuration paths. The recommended approach is explicit configuration with AgentConfig.
Explicit config (recommended)
```python
from agentfoundry.utils.agent_config import AgentConfig
from agentfoundry.registry.tool_registry import ToolRegistry
from agentfoundry.agents.orchestrator import Orchestrator

config = AgentConfig.from_dict({
    "AF_LLM_PROVIDER": "openai",
    "AF_OPENAI_API_KEY": "sk-...",
    "AF_OPENAI_MODEL": "gpt-4o",
    "AF_VECTORSTORE_PROVIDER": "chroma",
    "AF_CHROMA_URL": "http://localhost:8000",
})

registry = ToolRegistry(config=config)
registry.load_tools_from_directory()
orchestrator = Orchestrator(registry, config=config)
```
Legacy config (backward compatibility)
Set a config file explicitly and/or use environment variables:

```shell
export AGENTFOUNDRY_CONFIG_FILE="$HOME/.config/agentfoundry/agentfoundry.toml"
export OPENAI_API_KEY="sk-..."
```

```python
from agentfoundry.utils.agent_config import AgentConfig
config = AgentConfig.from_legacy_config()
```
See docs/Configuration_Guide.md for full key reference and precedence rules.
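The guide holds the authoritative precedence rules; as a rough mental model (an assumption here, not taken from the guide), explicit `AgentConfig` values typically win over environment variables, which win over config-file defaults. A self-contained sketch of that lookup order:

```python
import os

def resolve_key(key, explicit=None, file_defaults=None):
    """Illustrative lookup order: explicit config > environment > config file.

    This sketches a common precedence scheme, not AgentFoundry's actual
    resolver; consult docs/Configuration_Guide.md for the real rules.
    """
    if explicit and key in explicit:
        return explicit[key]
    if key in os.environ:
        return os.environ[key]
    return (file_defaults or {}).get(key)
```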
Provider notes
- LLM provider is selected by `LLM_PROVIDER` (`openai`, `ollama`, `grok`, `gemini`). OpenAI requires `OPENAI_API_KEY` when selected.
- Vector store provider is selected by `VECTORSTORE_PROVIDER` (`milvus`, `chroma`, `faiss`).
  - Milvus: set `MILVUS_URI` or `MILVUS_HOST` + `MILVUS_PORT`.
  - Chroma: set `CHROMA_URL` (remote) or `CHROMADB_PERSIST_DIR` (local).
  - FAISS: requires an existing index at `FAISS_INDEX_PATH`.
- ThreadMemory uses OpenAI embeddings by default but falls back to deterministic hash embeddings if `AF_DISABLE_OPENAI_EMBEDDINGS=1` or no API key is present.
- The DuckDB KGraph backend requires `duckdb` and its ADBC drivers.
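The deterministic hash-embedding fallback mentioned above can be pictured like this (a minimal sketch; AgentFoundry's actual implementation and vector dimension are not documented here):

```python
import hashlib
import math

def hash_embedding(text: str, dim: int = 64) -> list:
    # Derive a repeatable pseudo-embedding from the SHA-256 digest of the
    # text, then L2-normalize it. The same input always yields the same vector.
    digest = hashlib.sha256(text.encode("utf-8")).digest()
    raw = [digest[i % len(digest)] / 255.0 for i in range(dim)]
    norm = math.sqrt(sum(v * v for v in raw)) or 1.0
    return [v / norm for v in raw]
```

Such a fallback keeps vector search functional offline, at the cost of semantic quality: similar meanings no longer map to nearby vectors.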
Author
Christopher Steel
AI Practice Lead, AlphaSix Corporation
Founder, Syntheticore, Inc.
Email: csteel@syntheticore.com
Licensing and Legal Notice
© Syntheticore, Inc. All rights reserved.
This software is proprietary and confidential.
Any use, reproduction, modification, distribution, or commercial deployment of AIgent or any part thereof requires explicit written authorization from Syntheticore, Inc.
Unauthorized use is strictly prohibited and may result in legal action.
For licensing inquiries or permission to use this software, please contact:
📧 csteel@syntheticore.com
Gradio Chat Interface
A simple Gradio-based chat interface for interacting with the HybridOrchestrator agent.
Prerequisites
- Ensure you have credentials for your selected LLM provider. For OpenAI:
```shell
export OPENAI_API_KEY=<your_api_key>
```
Running the App
```shell
python gradio_app.py
```
The interface will be available at http://localhost:7860 by default.
API Server
Genie can be accessed programmatically via a FastAPI-based HTTP API. Two main endpoints are provided:
- POST /v1/chat: Send or continue a multi-turn conversation with Genie. Accepts a JSON payload with conversation history and returns the assistant reply and updated history.
- POST /v1/orchestrate: Discover APIs and execute a main task across all agents. Returns aggregated results.
- GET /health: Health check endpoint.
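The request schema is not spelled out here; a plausible /v1/chat payload, assuming history is a list of role/content messages (field names are illustrative — check the interactive docs at /docs for the real schema):

```json
{
  "message": "What does the compliance agent monitor?",
  "history": [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi! How can I help?"}
  ]
}
```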
Prerequisites
- Ensure you have credentials for your selected LLM provider. For OpenAI:
```shell
export OPENAI_API_KEY=<your_api_key>
```
- Install FastAPI and Uvicorn (if not already):
```shell
pip install fastapi "uvicorn[standard]"
```
Running the API
```shell
python api_server.py
# Or with auto-reload during development:
uvicorn api_server:app --reload --host 0.0.0.0 --port 8000
```
Interactive API docs will be available at http://localhost:8000/docs
- For Microsoft Graph access (entra_tool), forward the SPA's bearer token in the `Authorization: Bearer <token>` header; the API server injects it into the orchestrator config as `entra_user_assertion` for on-behalf-of token exchange.
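Forwarding that header from a Python client can be sketched with the standard library (the endpoint path comes from above; the JSON field names are assumptions, not a documented schema):

```python
import json
import urllib.request

def build_chat_request(token: str, history: list) -> urllib.request.Request:
    # Build a POST to /v1/chat carrying the SPA's bearer token so the server
    # can inject it as entra_user_assertion. Send with urllib.request.urlopen.
    body = json.dumps({"history": history}).encode("utf-8")
    return urllib.request.Request(
        "http://localhost:8000/v1/chat",
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
```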
Logging & Debugging
AgentFoundry uses standard Python logging. If the host application does not configure logging,
agentfoundry.utils.logger.get_logger() will create a default log file at ./logs/agentforge.log.
To explicitly control logging, call:
```python
from agentfoundry.utils.logger import setup_logging
setup_logging(level="INFO", logfile="agentfoundry.log")
```
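If the host application prefers to own logging configuration instead, standard library logging suffices; a minimal sketch (the format string and handler choice are ours, not the library's defaults):

```python
import logging

def configure_host_logging(level: str = "INFO", logfile: str = "agentfoundry.log"):
    # Configure the root logger once; library loggers (including
    # agentfoundry.*) then propagate records into these handlers.
    logging.basicConfig(
        level=getattr(logging, level),
        format="%(asctime)s %(name)s %(levelname)s: %(message)s",
        handlers=[logging.StreamHandler(), logging.FileHandler(logfile)],
        force=True,  # replace any handlers configured earlier
    )
```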
Quick Smoke Test (Chroma, local persistence)
This verifies vector search without external APIs:
```shell
export VECTORSTORE_PROVIDER=chroma
export CHROMADB_PERSIST_DIR="$(mktemp -d)"
python - <<'PY'
from agentfoundry.vectorstores.factory import VectorStoreFactory
vs = VectorStoreFactory.get_store(org_id='smoke')
vs.add_texts(["hello world"], metadatas=[{"org_id":"smoke"}], ids=["1"])
hits = vs.similarity_search("hello", k=1, filter={"org_id":"smoke"})
print("Hits:", [h.page_content for h in hits])
PY
```
Expected output: `Hits: ['hello world']`
Notes:
- ThreadMemory falls back to hash embeddings if OpenAI embeddings are unavailable.
- FAISS provider raises if `FAISS_INDEX_PATH` does not exist; initialize it with your ingestion tooling.