A tool to generate an AI BOM from source code.
AI BOM
The AI BOM tool scans codebases and container images to inventory AI framework components (models, agents, tools, prompts, and more). It currently parses Python source code, resolves fully qualified symbols, and matches them against a DuckDB catalog to produce an AI bill of materials (AI BOM). Optional LLM enrichment extracts model names, and a workflow pass annotates components with call-path context.
Table of Contents
- Features
- Repository Layout
- Installation
- Knowledge Base Configuration
- Usage
- Testing
- Output Formats
- UI Mode
- Technical Details
- Troubleshooting
Features
- Static Python analysis: Uses `libcst` to capture assignments, decorators, type annotations, and context managers.
- Container image scanning: Extracts `/app` from Docker images when available, otherwise scans `site-packages`.
- DuckDB catalog matching: Maps fully qualified symbols to curated component categories.
- Workflow context: Builds a lightweight call graph to show which workflows reach each component.
- Derived relationships: Infers `USES_TOOL` and `USES_LLM` links from agent arguments.
- Optional LLM enrichment: Uses `litellm` to extract model/embedding names from code snippets.
- Multiple outputs: Plaintext, JSON, or a FastAPI UI server.
- Report submission: Optional POST of the JSON report with retries.
Repository Layout
aibom/ # Python analyzer package + CLI
ui/ # React UI for exploring results
docs/ # UI/API documentation
Installation
Prerequisites
- Python 3.11+
- uv (Python package manager, recommended)
- Docker (optional, for container image analysis)
- Node.js 22+ (optional, for the React UI)
- LLM provider API key (optional, for model extraction)
Installing as a CLI tool
# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
# or: brew install uv
uv tool install --python 3.13 cisco-aibom
# Verify installation
cisco-aibom --help
Alternatively, install from source:
uv tool install --python 3.13 --from git+https://github.com/cisco-ai-defense/aibom cisco-aibom
# Verify installation
cisco-aibom --help
Installing for local development
git clone https://github.com/cisco-ai-defense/aibom.git
cd aibom/aibom
# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh
# or: brew install uv
uv sync
# Activate virtual environment
source .venv/bin/activate # Linux/macOS
# .venv\Scripts\activate # Windows
# Verify installation
cisco-aibom --help
When working from source, you can also run the CLI with `uv run cisco-aibom ...` or `uv run python -m aibom ...`.
Knowledge Base Configuration
The analyzer uses a local DuckDB catalog described by manifest.json.
The DuckDB file is a prebuilt, versioned knowledge-catalog artifact of AI frameworks. It is used as a read-only lookup dataset, with checksum verification for compatibility and integrity.
For users running the packaged CLI (for example via uv tool install or pip), the packaged manifest provides a default checksum and default catalog location (~/.aibom/catalogs/aibom_catalog-<version>.duckdb). You can still override with AIBOM_DB_PATH and AIBOM_DB_SHA256.
When running from source, execute from the aibom/ directory or set AIBOM_MANIFEST_PATH to point at aibom/src/aibom/manifest.json.
Download the DuckDB artifact from GitHub Releases
# Set this to the release tag that matches your catalog artifact (example: 0.2.2)
VERSION="<version>"
mkdir -p "${HOME}/.aibom/catalogs"
# Option 1: GitHub CLI
gh release download "${VERSION}" \
--repo cisco-ai-defense/aibom \
--pattern "aibom_catalog-${VERSION}.duckdb" \
--dir "${HOME}/.aibom/catalogs"
# Option 2: direct download URL
curl -fL \
-o "${HOME}/.aibom/catalogs/aibom_catalog-${VERSION}.duckdb" \
"https://github.com/cisco-ai-defense/aibom/releases/download/${VERSION}/aibom_catalog-${VERSION}.duckdb"
Provide the DuckDB path to the analyzer
export AIBOM_DB_PATH="${HOME}/.aibom/catalogs/aibom_catalog-${VERSION}.duckdb"
# Set only if your file is different from the manifest default (for example,
# custom path/version) or if you see a checksum mismatch error:
# export AIBOM_DB_SHA256="<sha256-of-${AIBOM_DB_PATH}>"
Compute SHA-256 when needed:
# macOS
shasum -a 256 "${AIBOM_DB_PATH}"
# Linux
sha256sum "${AIBOM_DB_PATH}"
Use only the hash value (first column) as AIBOM_DB_SHA256.
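The checksum step can also be done in Python. The helper below is an illustrative sketch, not part of the package; it streams the file so large catalogs are not loaded into memory at once:

```python
import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    """Compute the hex SHA-256 digest of a file, reading it in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()
```

The returned hex string is what you would export as `AIBOM_DB_SHA256`.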
Override settings with environment variables:
- `AIBOM_DB_PATH`: local DuckDB file path
- `AIBOM_DB_SHA256`: SHA-256 checksum for the DuckDB file
AIBOM_DB_PATH may be absolute or relative. Relative env-var values are resolved from the current working directory; relative duckdb_file values in manifest.json are resolved from the manifest directory.
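These resolution rules can be illustrated with a short sketch. The function name is hypothetical and the analyzer's internals may differ; the behavior shown matches the rules stated above:

```python
from pathlib import Path

def resolve_duckdb_path(value, manifest_dir=None):
    # Env-var values (AIBOM_DB_PATH) resolve against the current working
    # directory; relative duckdb_file values from manifest.json resolve
    # against the directory containing the manifest.
    p = Path(value).expanduser()
    if p.is_absolute():
        return p
    base = Path(manifest_dir) if manifest_dir is not None else Path.cwd()
    return (base / p).resolve()
```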
Usage
Analyze sources
# Local directory (JSON output)
cisco-aibom analyze /path/to/project --output-format json --output-file report.json
# Container image (JSON output)
cisco-aibom analyze langchain-app:latest --output-format json --output-file report.json
# Multiple images from a JSON list
cisco-aibom analyze --images-file images.json --output-format plaintext --output-file report.txt
--output-file is required for plaintext and json output formats.
Render a JSON report
cisco-aibom report report.json --raw-json
Optional LLM enrichment
cisco-aibom analyze /path/to/project \
--output-format json \
--output-file report.json \
--llm-model gpt-3.5-turbo \
--llm-api-base https://api.openai.com/v1 \
--llm-api-key $OPENAI_API_KEY
Local LLM example:
cisco-aibom analyze /path/to/project \
--output-format json \
--output-file report.json \
--llm-model ollama_chat/gemma3:12b \
--llm-api-base http://localhost:11434
Optional report submission
cisco-aibom analyze /path/to/project \
--output-format json \
--output-file report.json \
--post-url https://api.security.cisco.com/api/ai-defense/v1/aibom/analysis \
--ai-defense-api-key $AI_DEFENSE_API_KEY
You can also set AIBOM_POST_URL instead of --post-url and AI_DEFENSE_API_KEY instead of --ai-defense-api-key.
The API key is sent as the `x-cisco-ai-defense-tenant-api-key` header. Use the same path in every region: `/api/ai-defense/v1/aibom/analysis`.
Choose the base domain for your Cisco AI Defense organization's region:
- US: https://api.security.cisco.com/api/ai-defense/v1/aibom/analysis
- APJ: https://api.apj.security.cisco.com/api/ai-defense/v1/aibom/analysis
- EU: https://api.eu.security.cisco.com/api/ai-defense/v1/aibom/analysis
- UAE: https://api.uae.security.cisco.com/api/ai-defense/v1/aibom/analysis
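The submission step (a JSON POST carrying the tenant API key header, retried on failure) can be sketched with the standard library. Only the header name comes from the docs above; the retry count and backoff here are illustrative, not the tool's actual policy:

```python
import json
import time
import urllib.error
import urllib.request

def post_report(url, api_key, report, retries=3, backoff=1.0):
    """POST a JSON report, retrying transient failures with exponential backoff."""
    data = json.dumps(report).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "x-cisco-ai-defense-tenant-api-key": api_key,
    }
    for attempt in range(retries):
        req = urllib.request.Request(url, data=data, headers=headers, method="POST")
        try:
            with urllib.request.urlopen(req, timeout=30) as resp:
                return resp.status
        except urllib.error.URLError:
            if attempt == retries - 1:
                raise
            time.sleep(backoff * (2 ** attempt))  # back off before retrying
```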
Testing
cd aibom
uv run pytest tests -v
Output Formats
Plaintext output
--- AI BOM Analysis Report ---
--- Results for source: langchain-app:latest ---
[+] Found 4 MODEL:
- Name: langchain_community.llms.openai.OpenAI
Model: gpt-3.5-turbo-instruct
Source: /app/comprehensive_langchain_app.py:32
...
--- End of Report: Found 42 total components across all sources. ---
JSON output
{
"aibom_analysis": {
"metadata": {
"run_id": "...",
"analyzer_version": "<analyzer-version>",
"started_at": "2025-01-01T00:00:00Z",
"completed_at": "2025-01-01T00:00:10Z"
},
"sources": {
"langchain-app:latest": {
"components": {
"model": [
{
"name": "langchain_community.llms.openai.OpenAI",
"file_path": "/app/app.py",
"line_number": 32,
"category": "model",
"model_name": "gpt-3.5-turbo",
"workflows": []
}
]
},
"relationships": [
{
"source_instance_id": "...",
"target_instance_id": "...",
"label": "USES_LLM",
"source_name": "...",
"target_name": "...",
"source_category": "agent",
"target_category": "model"
}
],
"workflows": [
{
"id": "...",
"function": "module.flow",
"file_path": "/app/app.py",
"line": 10,
"distance": 0
}
],
"total_components": 42,
"total_workflows": 7,
"summary": {
"status": "completed",
"source_kind": "container"
}
}
},
"summary": {
"total_sources": 1,
"total_components": 42,
"total_relationships": 3,
"total_workflows": 7,
"categories": {
"model": 4,
"tool": 8
}
},
"errors": []
}
}
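A report produced with `--output-format json` can be post-processed with ordinary JSON tooling. The helper below is hypothetical (not part of the CLI) and tallies components per source using the field names from the sample above:

```python
import json

def components_per_source(report_path):
    """Count components per analyzed source in an AI BOM JSON report."""
    with open(report_path) as f:
        doc = json.load(f)
    sources = doc["aibom_analysis"]["sources"]
    return {
        name: sum(len(items) for items in data["components"].values())
        for name, data in sources.items()
    }
```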
UI Mode
--output-format ui starts a FastAPI server that serves the analyzed components:
cisco-aibom analyze /path/to/project --output-format ui
Endpoints:
- GET /api/components
- GET /api/components/types
- GET /api/components/{id}
- GET /health
The React UI in ui/ can connect to this server. See docs/UI_README.md and docs/API_SERVER_README.md for details.
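Once the server is up, the endpoints can be queried with any HTTP client. A minimal example follows; the host and port are assumptions, so check the server's startup output for the actual address:

```python
import json
import urllib.request

def fetch_components(base_url="http://127.0.0.1:8000"):
    # GET /api/components is one of the endpoints listed above; the
    # response shape is whatever JSON the server returns.
    with urllib.request.urlopen(f"{base_url}/api/components") as resp:
        return json.loads(resp.read())
```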
Technical Details
- Parsing: `libcst` extracts fully qualified names for calls, decorators, type annotations, and context managers.
- Catalog matching: Symbols are matched against the DuckDB `component_catalog` table using their fully qualified IDs.
- Workflow analysis: The AST-based workflow analyzer associates components with the functions that call into them.
- Relationships: Agent arguments are inspected for tool/LLM references to derive `USES_TOOL` and `USES_LLM` links.
- LLM enrichment: `litellm` is used only when `--llm-model` is supplied.
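The parsing step can be approximated with the standard library's `ast` module. The real analyzer uses `libcst` and resolves imports to fully qualified names; this sketch only reconstructs the dotted call path as it appears in the source:

```python
import ast

def dotted_call_names(source):
    """Collect dotted call targets (e.g. 'langchain.llms.OpenAI') from source code."""
    names = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            parts = []
            func = node.func
            # Unwind attribute chains like langchain.llms.OpenAI(...).
            while isinstance(func, ast.Attribute):
                parts.append(func.attr)
                func = func.value
            if isinstance(func, ast.Name):
                parts.append(func.id)
                names.append(".".join(reversed(parts)))
    return names
```

Each collected name would then be looked up in the catalog to decide whether it denotes a model, agent, tool, or other component.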
Troubleshooting
- DuckDB catalog errors: Ensure the catalog file exists at `AIBOM_DB_PATH` (or `duckdb_file` in the manifest) and that `AIBOM_DB_SHA256` (or `duckdb_sha256` in the manifest) matches the file checksum. When running from source, execute from `aibom/` or set `AIBOM_MANIFEST_PATH`.
- Docker issues: Container analysis requires a working Docker CLI and daemon.
- LLM configuration errors: `--llm-api-base` is required whenever `--llm-model` is set.
- UI server does not start: If no components are found, the UI server exits early. Verify the target includes AI framework usage.
- Missing output files: `--output-file` is mandatory for `plaintext` and `json` formats.
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file cisco_aibom-0.2.2.tar.gz.
File metadata
- Download URL: cisco_aibom-0.2.2.tar.gz
- Upload date:
- Size: 50.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 5b57027a51b08e8593d4e75e79198b48a8298fa1d079e952e62a74e0cee08f82 |
| MD5 | 18617db2fb86feb71df0573f974f7633 |
| BLAKE2b-256 | d15fbf157d76e23275462fceee45e08c9da7a605b1f4c2247e315eb5d980b387 |
Provenance
The following attestation bundles were made for cisco_aibom-0.2.2.tar.gz:
Publisher: publish-to-pypi.yml on cisco-ai-defense/aibom
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: cisco_aibom-0.2.2.tar.gz
- Subject digest: 5b57027a51b08e8593d4e75e79198b48a8298fa1d079e952e62a74e0cee08f82
- Sigstore transparency entry: 942899141
- Sigstore integration time:
- Permalink: cisco-ai-defense/aibom@59031d76385be7b23b71b9b0459caa671343deb2
- Branch / Tag: refs/tags/0.2.2
- Owner: https://github.com/cisco-ai-defense
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-to-pypi.yml@59031d76385be7b23b71b9b0459caa671343deb2
- Trigger Event: push
File details
Details for the file cisco_aibom-0.2.2-py3-none-any.whl.
File metadata
- Download URL: cisco_aibom-0.2.2-py3-none-any.whl
- Upload date:
- Size: 46.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2127e8255280f8a5572fe04edce4e0861d3cdf3333f5f30e282e946cdf62daf2 |
| MD5 | 10104254b9bb87cc36e499aa3679f6d9 |
| BLAKE2b-256 | 6f3bad54354ed8a2f8585a5989ec4c822a79fe647a2e9308a48eb52d522e0569 |
Provenance
The following attestation bundles were made for cisco_aibom-0.2.2-py3-none-any.whl:
Publisher: publish-to-pypi.yml on cisco-ai-defense/aibom
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: cisco_aibom-0.2.2-py3-none-any.whl
- Subject digest: 2127e8255280f8a5572fe04edce4e0861d3cdf3333f5f30e282e946cdf62daf2
- Sigstore transparency entry: 942899151
- Sigstore integration time:
- Permalink: cisco-ai-defense/aibom@59031d76385be7b23b71b9b0459caa671343deb2
- Branch / Tag: refs/tags/0.2.2
- Owner: https://github.com/cisco-ai-defense
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-to-pypi.yml@59031d76385be7b23b71b9b0459caa671343deb2
- Trigger Event: push