Open-source agentic AI framework with multi-provider support, workflow orchestration, and extensible tool system
Project description
Victor
Open-source agentic AI framework. Build, orchestrate, and evaluate AI agents across 22 providers.
Features
┌─────────────────────────────────────────────────────────────┐
│ VICTOR FRAMEWORK │
│ │
│ Agents ─── Teams ─── Workflows ─── Evaluation │
│ │ │ │ │ │
│ run() Sequential StateGraph SWE-bench │
│ stream() Parallel YAML DSL Harnesses │
│ chat() Hierarchical Checkpoints Code Quality │
│ Pipeline │
│ │
│ 22 Providers │ 34 Tool Modules │ 9 Verticals │ 4 Scopes │
└─────────────────────────────────────────────────────────────┘
- 22 LLM Providers — Cloud (Anthropic, OpenAI, Google, Azure, Bedrock, DeepSeek, Vertex) + local (Ollama, LM Studio, vLLM)
- 34 Tool Modules — File ops, git, shell, web, search, docker, testing, refactoring, analysis
- 9 Domain Verticals — Coding, DevOps, RAG, Data Analysis, Research, Security, IaC, Classification, Benchmark
- Multi-Agent Teams — 4 formations: sequential, parallel, hierarchical, pipeline
- Stateful Workflows — YAML DSL compiled to StateGraph with typed state and checkpointing
- Air-Gapped Mode — Full functionality with local models for secure, offline environments
- Built-in Resilience — Automatic retry with exponential backoff on rate limits, circuit breaker protection
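The retry behavior described above follows a standard pattern. A minimal sketch of exponential backoff with jitter and a delay cap, independent of Victor's actual internals (the `RateLimitError` name, delays, and attempt count here are illustrative, not Victor's real values):

```python
import random
import time


class RateLimitError(Exception):
    """Stand-in for a provider's rate-limit error."""


def with_retries(call, max_attempts=5, base_delay=0.5, cap=8.0):
    """Retry `call` on rate limits, doubling the delay each attempt (with jitter)."""
    for attempt in range(max_attempts):
        try:
            return call()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error
            delay = min(cap, base_delay * 2 ** attempt)
            time.sleep(delay * (0.5 + random.random() / 2))  # jittered backoff
```

Doubling the delay keeps retries cheap for transient limits while the cap bounds worst-case latency; jitter avoids synchronized retry storms across concurrent agents.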
Benchmark Results (March 2026)
Victor achieves a 100% success rate on a five-task benchmark across three providers:
| Provider | Model | 5-Task Success | Avg Time/Task | Input Cost (per 1M tokens) |
|---|---|---|---|---|
| Anthropic | Claude Haiku 4.5 | 100% | 16.9s | $0.80 |
| OpenAI | GPT-4o-mini | 100% | 14.7s | $0.15 |
| DeepSeek | DeepSeek-Chat V3 | 100% | 35.9s | $0.07 |
Tasks: code generation, research synthesis, file operations, security audit, and workflow orchestration. See the full results.
At a glance
┌─────────────────────────────────┐
│ Agent Orchestrator │
│ │
[You] ──▶ [CLI/TUI/API] ──▶ │ ProviderManager ──▶ 24 LLMs │ ──▶ [Response]
│ ToolPipeline ──▶ 34 Tools │
│ TeamCoordinator ──▶ Agents │
│ StateManager ──▶ 4 Scopes │
└─────────────────────────────────┘
Choose your path
| Persona | Start here | Typical goals |
|---|---|---|
| New user | Getting Started | Install, first run, local vs cloud |
| Daily user | User Guide | Commands, modes, profiles, workflows |
| Operator | Operations | Deployment, monitoring, security |
| Contributor | Development | Setup, testing, architecture, extending |
| Architect | Architecture | System overview, core components |
Quick start
| Path | Commands | Best for |
|---|---|---|
| Local model | `pipx install victor-ai`<br>`ollama pull qwen2.5-coder:7b`<br>`victor chat "Hello"` | Privacy, offline, free tier |
| Cloud model | `pipx install victor-ai`<br>`export ANTHROPIC_API_KEY=...`<br>`victor chat --provider anthropic` | Max capability |
| Docker | `docker pull ghcr.io/vjsingh1984/victor:latest`<br>`docker run -it -v ~/.victor:/root/.victor ghcr.io/vjsingh1984/victor:latest` | Isolated env |
Supported Providers
Victor supports 22 LLM providers — switch mid-conversation without losing context.
| Category | Providers |
|---|---|
| Frontier Cloud | Anthropic, OpenAI, Google Gemini, Azure OpenAI |
| Cloud Platforms | AWS Bedrock, Google Vertex |
| Specialized | xAI, DeepSeek, Mistral, Groq, Cerebras, Moonshot, ZAI |
| Aggregators | OpenRouter, Together AI, Fireworks AI, Replicate, Hugging Face |
| Local (air-gapped) | Ollama, LM Studio, vLLM, llama.cpp |
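Because conversation context is carried by Victor rather than by any one backend, a session can hop between providers in the table above. A sketch of what that looks like in an interactive chat, using the documented `/provider` command (the prompts and the confirmation line are illustrative, not actual Victor output):

```text
> Summarize the open TODOs in this repo.
  ...response from the current provider...
> /provider openai --model gpt-4
  (switched to openai / gpt-4; conversation context is kept)
> Continue from your last summary.
```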
Python API
Victor provides a clean Python API for programmatic use:
```python
from victor.framework import Agent, EventType, ToolSet

# Simple use case
agent = await Agent.create(provider="anthropic")
result = await agent.run("Explain this codebase structure")
print(result.content)

# Streaming responses
async for event in agent.stream("Refactor this function"):
    if event.type == EventType.CONTENT:
        print(event.content, end="")
    elif event.type == EventType.TOOL_CALL:
        print(f"\nUsing tool: {event.tool_name}")

# With tool configuration
agent = await Agent.create(
    provider="openai",
    model="gpt-4o",
    tools=ToolSet.default(),  # or ToolSet.minimal(), ToolSet.full()
)

# Multi-turn conversation
session = agent.chat()
await session.send("What files are in this project?")
await session.send("Now explain the main entry point")
```
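The snippets above use `await` at top level, which only works in an async-aware REPL or notebook; a regular script needs an async entry point. The generic pattern, shown here with a stand-in coroutine rather than actual Victor calls:

```python
import asyncio


async def main() -> str:
    # In a real script this would be: agent = await Agent.create(...), etc.
    await asyncio.sleep(0)  # stand-in for awaited agent calls
    return "done"


if __name__ == "__main__":
    print(asyncio.run(main()))
```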
StateGraph Workflows
```python
from typing import TypedDict

from victor.framework import StateGraph, END

class MyState(TypedDict):
    query: str
    result: str

# Stub node functions so the example runs end to end. The signature
# (state in, dict of updates out) is assumed here; adapt to your nodes.
async def research_fn(state: MyState) -> dict:
    return {"result": f"notes on {state['query']}"}

async def synthesize_fn(state: MyState) -> dict:
    return {"result": state["result"].upper()}

graph = StateGraph(MyState)
graph.add_node("research", research_fn)
graph.add_node("synthesize", synthesize_fn)
graph.add_edge("research", "synthesize")
graph.add_edge("synthesize", END)

compiled = graph.compile()
result = await compiled.invoke({"query": "AI trends 2025"})
```
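The same two-node graph can also be expressed in the YAML workflow DSL that Victor compiles to a StateGraph. The DSL schema isn't shown on this page, so the field names in this sketch are assumptions, kept parallel to the Python example above:

```yaml
# Hypothetical YAML form of the research -> synthesize graph
workflow:
  name: research_and_synthesize
  state:
    query: str
    result: str
  nodes:
    - name: research
      handler: research_fn
    - name: synthesize
      handler: synthesize_fn
  edges:
    - [research, synthesize]
    - [synthesize, END]
```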
Core capabilities
| Capability | What it means | Docs |
|---|---|---|
| Agent abstractions | run(), stream(), chat(), run_workflow(), run_team() | Framework |
| 22 Providers | Cloud + local LLMs; switch mid-thread without losing context | Providers |
| 34 Tool modules | File ops, git, shell, web, search, docker, testing, analysis | Tool catalog |
| Workflows | YAML DSL compiled to StateGraph with typed state + checkpointing | Workflows |
| Multi-agent teams | 4 formations: sequential, parallel, hierarchical, pipeline | Multi-agent |
| State management | 4 scopes: workflow, conversation, team, global | State |
| 9 Verticals | Domain-focused agents with tools, prompts, and workflows | Verticals |
| Evaluation | Agent harnesses, code quality analysis, SWE-bench integration | Evaluation |
Command quick reference
| Command | Purpose | Example |
|---|---|---|
| `victor` | TUI mode | `victor` |
| `victor chat` | CLI mode | `victor chat "refactor this"` |
| `victor chat --mode plan` | Plan-only analysis | `victor chat --mode plan` |
| `victor serve` | HTTP API | `victor serve --port 8080` |
| `victor mcp` | MCP server | `victor mcp --stdio` |
| `/provider` | Switch provider in chat | `/provider openai --model gpt-4` |
Screenshots
The Victor TUI provides an interactive terminal interface with syntax highlighting and tool status.
CLI mode for quick queries and script integration.
Documentation
Contributing
We welcome contributions. Start with CONTRIBUTING.md and CODE_OF_CONDUCT.md.
Community
Acknowledgments
Victor is built on the shoulders of excellent open-source projects:
- Pydantic - Data validation and settings management
- Tree-sitter - Incremental parsing for code analysis
- Textual - Modern TUI framework
- Typer - CLI interface with type hints
- Rich - Beautiful terminal formatting
- httpx - Async HTTP client
- Anthropic SDK - Claude API client
- OpenAI SDK - OpenAI API client
License
Apache License 2.0 - see LICENSE.
Download files
Source Distribution
Built Distribution
File details
Details for the file victor_ai-0.6.0.tar.gz.
File metadata
- Download URL: victor_ai-0.6.0.tar.gz
- Upload date:
- Size: 5.7 MB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `155eeaf6eaf84db208417be221bfdd0ea6ee7b238394f63824210160a481cbdf` |
| MD5 | `964a8683eae0b9daeeeca84f7cb0020c` |
| BLAKE2b-256 | `da31e5c76dca04fa9ae8626cde028b8ec303d510aeeb2771e23f769e454ca448` |
Provenance
The following attestation bundles were made for victor_ai-0.6.0.tar.gz:
Publisher: release.yml on vjsingh1984/victor
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: victor_ai-0.6.0.tar.gz
- Subject digest: 155eeaf6eaf84db208417be221bfdd0ea6ee7b238394f63824210160a481cbdf
- Sigstore transparency entry: 1220395963
- Permalink: vjsingh1984/victor@dffc4942f561a0c4acd25885eebb0f26f1632205
- Branch / Tag: refs/tags/v0.6.0
- Owner: https://github.com/vjsingh1984
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@dffc4942f561a0c4acd25885eebb0f26f1632205
- Trigger Event: push
File details
Details for the file victor_ai-0.6.0-py3-none-any.whl.
File metadata
- Download URL: victor_ai-0.6.0-py3-none-any.whl
- Upload date:
- Size: 6.2 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `9577f26c41952556f7c2335032482e5864d5f0c2860755c3da8d05478aa9f63b` |
| MD5 | `98008c44ab15f293ae19feab1025fcf8` |
| BLAKE2b-256 | `745b71b5244451e24d79b2e71154458f5d6c058e3f17645cdaa7ce13b04445b1` |
Provenance
The following attestation bundles were made for victor_ai-0.6.0-py3-none-any.whl:
Publisher: release.yml on vjsingh1984/victor
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: victor_ai-0.6.0-py3-none-any.whl
- Subject digest: 9577f26c41952556f7c2335032482e5864d5f0c2860755c3da8d05478aa9f63b
- Sigstore transparency entry: 1220396001
- Permalink: vjsingh1984/victor@dffc4942f561a0c4acd25885eebb0f26f1632205
- Branch / Tag: refs/tags/v0.6.0
- Owner: https://github.com/vjsingh1984
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@dffc4942f561a0c4acd25885eebb0f26f1632205
- Trigger Event: push