Quin Agent Scanner
Scan repositories to detect GenAI and Agentic AI applications, identify LLM usage, and analyze agent intent.
Nothing in your codebase is hidden from Quin.
Named after Bao Qingtian -- the incorruptible judge of the Song Dynasty who saw through every deception -- Quin is an open-source CLI tool by Gaincontrol that scans any codebase to detect AI agents, extract system prompts, classify risk, and produce compliance-ready reports.
Point Quin at a repo and get back: every AI agent, what it does, what tools it has, and what risks it carries -- in a single HTML report.
Learn more at gaincontrol.ai/quin.
Note: This is version 0.1.0b2 -- an early public release under active development. Scan results are provided as-is and may be incomplete, inaccurate, or contain false positives/negatives. Always review and validate findings independently before making security, compliance, or architectural decisions based on them. The authors and contributors of Quin accept no responsibility or liability for any actions taken, or not taken, based on the output of this tool. If you encounter any issues, please open an issue or email us at pixiedust@gaincontrol.ai -- your feedback helps us improve the scanner.
About Gaincontrol
Quin is built by Gaincontrol, headquartered in Singapore. We build the infrastructure enterprises need to run AI agents safely.
| Product | What it does |
|---|---|
| Quin | AI Agent Scanner -- see every agent, trust nothing blindly |
| Aegis | AI Identity Governance -- every agent operates only within its granted authority |
| Drona | Safe Execution Fabric -- deterministic execution for probabilistic AI |
- Website: gaincontrol.ai
- Contact: pixiedust@gaincontrol.ai
Quick Start
1. Install
pip install quin-scanner
Or with uv:
uv tool install quin-scanner
2. Set up your .env
Create a .env file in your working directory:
# Pick one LLM provider:
ANTHROPIC_API_KEY=sk-ant-...
# OPENAI_API_KEY=sk-...
# GOOGLE_API_KEY=...
# For scanning GitHub repos or orgs:
GITHUB_TOKEN=ghp_...
# Optional — vulnerability web search (OSV.dev is always on).
# Reuses the chosen provider's API key; set PERPLEXITY_API_KEY if you pick perplexity.
# VULN_SEARCH_PROVIDER=anthropic # perplexity | gemini | openai | anthropic | none
# PERPLEXITY_API_KEY=pplx-...
# Optional — custom base URL for OpenAI-compatible endpoints (vLLM, LiteLLM, Azure, Ollama).
# OPENAI_COMPATIBLE_URL=http://localhost:11434/v1
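These variables are read from .env at startup. As a rough sketch of what that loading involves (illustrative only, not Quin's actual loader — real libraries such as python-dotenv also handle quoting, export prefixes, and multiline values):

```python
def load_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments.
    Illustrative sketch of .env loading, not Quin's internal code."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, value = line.split("=", 1)
        env[key.strip()] = value.strip()
    return env

sample = """
# Pick one LLM provider:
ANTHROPIC_API_KEY=sk-ant-example
GITHUB_TOKEN=ghp_example
"""
print(load_env(sample))
```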
3. Set up scanner-config.yaml
curl -O https://raw.githubusercontent.com/Gaincontrol-Pte-Ltd/quin-agent-scanner/main/scanner-config.yaml
The default config uses Anthropic and outputs HTML:
llm:
  provider: anthropic                # openai | anthropic | google | ollama | openai-compatible
  model: claude-haiku-4-5-20251001
  # api_key_env: ANTHROPIC_API_KEY   # reads from your .env
output:
  format: html                       # html | json | yaml
vuln_check:
  enabled: true                      # OSV.dev lookup for detected framework+version
  search_provider: anthropic         # perplexity | gemini | openai | anthropic | none
  # search_model: sonar-pro          # optional override; provider defaults otherwise
  osv_timeout_seconds: 30
  web_timeout_seconds: 60
scanners:
  enabled:
    - dependency
    - config
    - code_pattern
    - file_structure
    - framework
    - prompt_discovery
    - dockerfile
    - jupyter
    - iac
    - ci
    - mcp
    - agent_instance
    - tool_definition
4. Run your first scan
# Scan a local repo -- generates an HTML report in ./report/
quin-scanner scan ./path/to/repo --config scanner-config.yaml
# Scan a GitHub repo
quin-scanner scan https://github.com/org/repo --config scanner-config.yaml
# Static-only scan (no LLM, no API key needed)
quin-scanner scan ./path/to/repo --config scanner-config.yaml --no-llm
# Skip the vulnerability lookup, or pick a different web-search provider
quin-scanner scan ./path/to/repo --config scanner-config.yaml --no-vuln-check
quin-scanner scan ./path/to/repo --config scanner-config.yaml --vuln-search-provider perplexity
Scan an Entire GitHub Org
quin-scanner scan-org my-github-org \
--config scanner-config.yaml \
--skip-archived \
--skip-forks
This discovers all repos in the org via the GitHub API, scans each one, and writes per-repo HTML reports to ./report/.
Options:
-o, --output [json|yaml|html] Output format (default: html)
--output-dir PATH Directory for reports (default: ./report/)
--skip-archived Skip archived repositories
--skip-forks Skip forked repositories
--no-llm Skip LLM analysis
--config PATH Path to scanner-config.yaml
Requires GITHUB_TOKEN with repo + read:org scopes. Create one at github.com/settings/tokens.
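Conceptually, --skip-archived and --skip-forks filter the repository list returned by the GitHub API before any scanning starts. A minimal sketch of that filtering step (the helper is hypothetical; the `archived` and `fork` fields match GitHub's REST API repository objects):

```python
def filter_repos(repos, skip_archived=True, skip_forks=True):
    """Drop archived and forked repos from a GitHub API repository listing.
    Illustrative helper, not Quin's internal API."""
    return [
        r["name"] for r in repos
        if not (skip_archived and r.get("archived"))
        and not (skip_forks and r.get("fork"))
    ]

repos = [
    {"name": "agents", "archived": False, "fork": False},
    {"name": "old-poc", "archived": True, "fork": False},
    {"name": "langchain", "archived": False, "fork": True},
]
print(filter_repos(repos))  # ['agents']
```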
Supported Frameworks
Quin detects AI usage across these frameworks and SDKs:
| Framework | Language |
|---|---|
| LangChain / LangGraph | Python, Node.js |
| CrewAI | Python |
| AutoGen | Python |
| MetaGPT | Python |
| OpenAI Agents SDK | Python |
| Anthropic Agent SDK | Python |
| Google ADK | Python |
| Databricks Agent Framework | Python |
| LlamaIndex | Python, Node.js |
| Haystack | Python |
| Semantic Kernel | Python, Node.js |
| Dify | Python |
| Flowise | Node.js |
| PromptFlow | Python |
| Vercel AI SDK | TypeScript |
| MCP (Model Context Protocol) | Any |
| OpenClaw | Node.js |
| PydanticAI | Python |
| DSPy / Guidance / Outlines | Python |
| OpenAI SDK | Python, Node.js, Go, Rust, Java |
| Anthropic SDK | Python, Node.js |
| Google Generative AI | Python, Node.js |
| Transformers / Diffusers | Python |
| LiteLLM | Python |
Also detects vector databases (ChromaDB, Pinecone, Qdrant, Weaviate, FAISS, pgvector, Milvus), embedding providers (Cohere, Sentence Transformers, Hugging Face), and voice/image AI packages (ElevenLabs, Whisper, Stable Diffusion).
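Detection of these frameworks typically starts from dependency manifests. A simplified sketch of matching requirements.txt entries against a known-package map (the map below is a small excerpt for illustration; the real scanner covers all frameworks in the table):

```python
import re

# Small excerpt of a package -> framework map; illustrative only.
KNOWN_PACKAGES = {
    "langchain": "LangChain / LangGraph",
    "crewai": "CrewAI",
    "llama-index": "LlamaIndex",
    "openai": "OpenAI SDK",
    "anthropic": "Anthropic SDK",
}

def detect_frameworks(requirements_text: str) -> dict:
    """Return {framework: pinned_version_or_None} for known AI packages."""
    found = {}
    for line in requirements_text.splitlines():
        line = line.split("#")[0].strip()  # drop inline comments
        m = re.match(r"([A-Za-z0-9_.-]+)\s*(?:==\s*([\w.]+))?", line)
        if not m:
            continue
        name = m.group(1).lower()
        if name in KNOWN_PACKAGES:
            found[KNOWN_PACKAGES[name]] = m.group(2)  # None if unpinned
    return found

reqs = "crewai==0.80.0\nrequests==2.32.0\nanthropic\n"
print(detect_frameworks(reqs))  # {'CrewAI': '0.80.0', 'Anthropic SDK': None}
```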
Installation
Prerequisites
- Python 3.11+ -- download
- An LLM API key -- OpenAI, Anthropic, Google, or a local Ollama model
- git -- for scanning GitHub repos
- GitHub token (optional) -- for private repos or org scanning
From PyPI
pip install quin-scanner
From source
git clone https://github.com/Gaincontrol-Pte-Ltd/quin-agent-scanner
cd quin-agent-scanner
curl -LsSf https://astral.sh/uv/install.sh | sh
uv sync --all-extras
cp .env.example .env # edit with your API keys
uv run quin-scanner scan ./path/to/repo --config scanner-config.yaml
LLM Providers
| Provider | Config value | Default model | API key env var |
|---|---|---|---|
| Anthropic | anthropic | claude-haiku-4-5-20251001 | ANTHROPIC_API_KEY |
| OpenAI | openai | gpt-4o-mini | OPENAI_API_KEY |
| Google | google | gemini-2.0-flash | GOOGLE_API_KEY |
| Ollama (local) | ollama | llama3.2 | -- |
| OpenAI-compatible | openai-compatible | (set with --llm-model) | OPENAI_API_KEY |
How It Works
Quin runs 13 scanner plugins in parallel, looks up known CVEs for the detected framework, then uses a two-pass LLM pipeline:
Repo --> 13 Scanners (parallel) --> Vulnerability Lookup --> Pass 1: Classification --> Pass 2: Synthesis --> Report
- 13 scanners detect dependencies, code patterns, configs, prompts, frameworks, tools, agents, MCP servers, Dockerfiles, notebooks, CI pipelines, and infrastructure-as-code
- Vulnerability lookup -- once the agentic framework and its base version are identified (e.g. CrewAI 0.80.0), the scanner queries OSV.dev and optionally an LLM with web search for recent advisories. Critical/high findings are promoted into risk signals
- Pass 1 (Classification) -- an LLM classifies the system type (standard_ai, agentic_ai, mcp_enabled, multi_agent) and identifies relevant threats from a taxonomy sourced from OWASP LLM Top 10, OWASP Agentic Top 10, OWASP MCP Top 10, MAESTRO, and Databricks DASF
- Pass 2 (Synthesis) -- a second LLM call profiles each agent with taxonomy-grounded risk indicators, maps tool usage to service categories, and generates a narrative summary
Use --no-llm to skip both LLM passes and run scanners only. Use --no-vuln-check to skip the vulnerability lookup.
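The fan-out stage can be pictured as running independent scanner plugins over the same repo concurrently and merging their findings. A hedged sketch using Python's concurrent.futures (the scanner names are real; the plugin interface shown here is invented for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative stand-ins for two of the 13 real scanner plugins.
def dependency_scanner(repo):
    return {"scanner": "dependency", "findings": ["crewai==0.80.0"]}

def prompt_discovery_scanner(repo):
    return {"scanner": "prompt_discovery", "findings": ["prompts/system.md"]}

SCANNERS = [dependency_scanner, prompt_discovery_scanner]

def run_scanners(repo):
    """Run all scanner plugins concurrently and collect results in order."""
    with ThreadPoolExecutor(max_workers=len(SCANNERS)) as pool:
        futures = [pool.submit(scanner, repo) for scanner in SCANNERS]
        return [f.result() for f in futures]

results = run_scanners("./path/to/repo")
print([r["scanner"] for r in results])  # ['dependency', 'prompt_discovery']
```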
Vulnerability Lookup
When a framework and its base version are detected, Quin checks for known CVEs and promotes critical/high-severity findings into the report's risk signals. All findings are listed under vulnerabilities in the report.
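The OSV.dev side of this lookup maps onto OSV's public query API (POST to https://api.osv.dev/v1/query). A sketch of building the request body for a detected framework and version — payload construction only, no network call; the payload shape follows OSV's documented API, while the helper itself is illustrative:

```python
import json

def build_osv_query(package: str, version: str, ecosystem: str = "PyPI") -> str:
    """Build the JSON body for OSV.dev's /v1/query endpoint."""
    return json.dumps({
        "package": {"name": package, "ecosystem": ecosystem},
        "version": version,
    })

body = build_osv_query("crewai", "0.80.0")
print(body)
```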
| Source | When it runs | Auth |
|---|---|---|
| OSV.dev | Always, when vuln_check.enabled: true | None |
| LLM web search | Optional, when search_provider is set | Reuses the chosen provider's API key |
Supported web-search providers (each reuses its own existing API key env var):
| Provider | Env var | Default model |
|---|---|---|
| perplexity | PERPLEXITY_API_KEY | sonar-pro |
| gemini | GOOGLE_API_KEY | gemini-2.0-flash |
| openai | OPENAI_API_KEY | gpt-4o-mini |
| anthropic | ANTHROPIC_API_KEY | claude-haiku-4-5-20251001 |
| none | -- | disables web search (OSV still runs) |
Precedence: --vuln-search-provider CLI flag > VULN_SEARCH_PROVIDER env var > vuln_check.search_provider in YAML.
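That precedence rule can be expressed as a small resolver — a sketch of the documented behavior, not Quin's actual code:

```python
import os

def resolve_search_provider(cli_flag=None, yaml_value=None, environ=None):
    """Resolve the provider per the documented precedence:
    --vuln-search-provider flag > VULN_SEARCH_PROVIDER env var > YAML value."""
    environ = os.environ if environ is None else environ
    if cli_flag:
        return cli_flag
    if environ.get("VULN_SEARCH_PROVIDER"):
        return environ["VULN_SEARCH_PROVIDER"]
    return yaml_value

# YAML value alone
print(resolve_search_provider(yaml_value="anthropic", environ={}))
# Env var overrides YAML
print(resolve_search_provider(yaml_value="anthropic",
                              environ={"VULN_SEARCH_PROVIDER": "gemini"}))
# CLI flag overrides both
print(resolve_search_provider(cli_flag="perplexity", yaml_value="anthropic",
                              environ={"VULN_SEARCH_PROVIDER": "gemini"}))
```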
Contributing
See CONTRIBUTING.md.
License
Apache-2.0 -- see LICENSE.
Project details
Download files
File details
Details for the file quin_scanner-0.1.0b2.tar.gz.

File metadata
- Download URL: quin_scanner-0.1.0b2.tar.gz
- Size: 4.0 MB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 14abd28fa33018003f651f4343eae387d27ec6bf2ef2ca1735be041aca15659a |
| MD5 | e28a887bb7190b68a7a6ad2c7895e9fa |
| BLAKE2b-256 | 2f15a585c9e79e8cd17c1e59d3cd3bad564d9ca1976a5f59fa46e4f7f76268df |
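The digests above can be checked locally after downloading the archive. A generic verification snippet using the standard library's hashlib (the expected digest in the comment is this release's published SHA256):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage after downloading the sdist:
# sha256_of("quin_scanner-0.1.0b2.tar.gz") should equal
# "14abd28fa33018003f651f4343eae387d27ec6bf2ef2ca1735be041aca15659a"
```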
Provenance
The following attestation bundles were made for quin_scanner-0.1.0b2.tar.gz:

Publisher: publish.yml on Gaincontrol-Pte-Ltd/quin-agent-scanner

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: quin_scanner-0.1.0b2.tar.gz
- Subject digest: 14abd28fa33018003f651f4343eae387d27ec6bf2ef2ca1735be041aca15659a
- Sigstore transparency entry: 1410239341
- Permalink: Gaincontrol-Pte-Ltd/quin-agent-scanner@d6d09024b211f3a42b7d6ce8e35ff64dbe7f603d
- Branch / Tag: refs/tags/v0.1.0b2
- Owner: https://github.com/Gaincontrol-Pte-Ltd
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@d6d09024b211f3a42b7d6ce8e35ff64dbe7f603d
- Trigger Event: push
File details
Details for the file quin_scanner-0.1.0b2-py3-none-any.whl.

File metadata
- Download URL: quin_scanner-0.1.0b2-py3-none-any.whl
- Size: 139.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | f289f0634f594b508771dd85550972053da0e5203ff896c518f40d742e055a48 |
| MD5 | 257af806b903c4186075e61051ede9db |
| BLAKE2b-256 | a96701de49f97192d8366a97fc645d029054b2e28361ac152b030bc025d3c972 |
Provenance
The following attestation bundles were made for quin_scanner-0.1.0b2-py3-none-any.whl:

Publisher: publish.yml on Gaincontrol-Pte-Ltd/quin-agent-scanner

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: quin_scanner-0.1.0b2-py3-none-any.whl
- Subject digest: f289f0634f594b508771dd85550972053da0e5203ff896c518f40d742e055a48
- Sigstore transparency entry: 1410239425
- Permalink: Gaincontrol-Pte-Ltd/quin-agent-scanner@d6d09024b211f3a42b7d6ce8e35ff64dbe7f603d
- Branch / Tag: refs/tags/v0.1.0b2
- Owner: https://github.com/Gaincontrol-Pte-Ltd
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@d6d09024b211f3a42b7d6ce8e35ff64dbe7f603d
- Trigger Event: push