Source code quality evaluation platform powered by AI

Project description

Quodeq

AI-powered code quality and security scanner

v1.0.5

Watch the 2-min demo · Website · Blog · Releases


AI models can now autonomously find and exploit zero-day vulnerabilities across operating systems, browsers, and web applications. Thousands of previously unknown flaws uncovered in weeks, not years.

The code you ship today will be read by models that can spot what humans miss. But the tools to prepare for this are locked behind enterprise contracts and partner programs.

Quodeq exists to change that.

Open source. MIT license. Runs locally. No telemetry. No account. No servers.

Scans any codebase with AI across six quality dimensions from ISO 25010: Security, Reliability, Maintainability, Performance, Flexibility, and Usability.

Every finding maps to a CWE identifier. You get grades, violations with line numbers, and a fix plan. Cloud providers (Claude, Gemini, Codex) for speed. Local models via Ollama for privacy.


What It Finds

CRITICAL  src/db.py:15        SQL Injection via string concatenation     CWE-89
          query = f"SELECT * FROM users WHERE id = {user_id}"

HIGH      src/auth.py:42      Hardcoded credentials in source code       CWE-798
          credentials = {"user": "admin", "pass": "secret123"}

MEDIUM    src/api.py:88       Missing rate limiting on login endpoint    CWE-307
          @app.route("/login", methods=["POST"])

MINOR     src/utils.py:23     Bare except clause hides errors            CWE-396
          except: pass
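Fix plans generally steer toward changes like the following minimal sketch. The helper names and defaults here are illustrative, not Quodeq output, and the CWE-307 fix is omitted because rate limiting is framework-specific:

```python
import os
import sqlite3

def get_user(conn, user_id):
    # CWE-89 fix: parameterized query instead of an f-string
    return conn.execute(
        "SELECT * FROM users WHERE id = ?", (user_id,)
    ).fetchone()

def load_credentials():
    # CWE-798 fix: secrets come from the environment, not source code
    return {"user": os.environ["APP_USER"], "pass": os.environ["APP_PASS"]}

def parse_port(raw, default=8080):
    # CWE-396 fix: catch the specific exception; never a bare `except: pass`
    try:
        return int(raw)
    except ValueError:
        return default
```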

Each finding includes a reason, the offending code, and a fix plan. Results are stored as JSON on your machine.


Getting Started

pipx install quodeq    # Install quodeq
quodeq                 # Launch the dashboard

Running quodeq opens the dashboard, where you can point to any project and run evaluations from the UI. Also available via pip install quodeq.

Requirements: Python 3.12+ and Node.js 18+ (for the dashboard UI).

macOS App (beta)

Download the .dmg from Releases, open it, and drag Quodeq.app to Applications. On first launch:

xattr -cr /Applications/Quodeq.app    # Required for unsigned apps

Or right-click the app, select Open, then click Open in the dialog.


Dashboard

Quodeq Dashboard


  • Grades and scores per dimension with A-F letter grades, numeric scores, and trends across runs
  • Violations explorer to drill into findings by file, principle, or CWE classification
  • Code map showing a visual heatmap of where issues concentrate in your codebase
  • Custom standards to create your own evaluation dimensions or import from the library

Click any dimension, file, or principle to explore the details. Dismiss false positives directly from the UI.

Running quodeq is equivalent to quodeq dashboard. Both open the same UI.

CLI

quodeq evaluate /path/to/project
quodeq evaluate /path/to/project --scope src/api    # Scoped to a subdirectory
quodeq evaluate /path/to/project -d security        # Single dimension

AI Providers

Choose what fits your workflow. Configure in Settings from the dashboard.

Provider      Type    Notes
Ollama        Local   Free, private, code never leaves your machine
Claude Code   Cloud   Best balance of speed, quality, and cost
Codex CLI     Cloud   OpenAI models
Gemini CLI    Cloud   Google models

For local analysis we recommend Gemma 4 (gemma4:26b). Reducing the context window to 32k still gives good results and allows running multiple subagents in parallel.


How It Works

  1. Detect languages, frameworks, and project structure
  2. Analyze with AI agents that read the code using read-only tools
  3. Collect findings as structured JSONL via tool calls
  4. Score against ISO 25010 principles with CWE classifications
  5. Report per-dimension grades, violations, compliance, and fix plans

Results are stored in ~/.quodeq/evaluations/ and persist across sessions. Works with any language. The AI analysis engine reads and understands code regardless of the tech stack.
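Since results are plain JSON, they are easy to post-process. A sketch, with the caveat that the finding schema shown is assumed from the report columns above, not taken from Quodeq's documented format:

```python
import json
from collections import Counter
from pathlib import Path

# Hypothetical records mirroring the report columns above; the real
# on-disk schema under ~/.quodeq/evaluations/ may differ.
findings = [
    {"severity": "CRITICAL", "file": "src/db.py", "line": 15, "cwe": "CWE-89"},
    {"severity": "HIGH", "file": "src/auth.py", "line": 42, "cwe": "CWE-798"},
    {"severity": "MEDIUM", "file": "src/api.py", "line": 88, "cwe": "CWE-307"},
]

def summarize(findings):
    """Count findings per severity, worst first."""
    order = ["CRITICAL", "HIGH", "MEDIUM", "MINOR"]
    counts = Counter(f["severity"] for f in findings)
    return {sev: counts[sev] for sev in order if counts[sev]}

# To fold in real runs, load each stored evaluation first:
# for path in (Path.home() / ".quodeq" / "evaluations").glob("*.json"):
#     findings.extend(json.loads(path.read_text()))

print(summarize(findings))
```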

Quodeq scores each principle on a 0 to 10 scale using four independent constraints. Full details in the scoring formula documentation.

Standards

By default, Quodeq evaluates the six ISO 25010 dimensions. It also ships with Clean Architecture and Domain-Driven Design standards. You can create your own from the dashboard, or ask any AI to generate one as a .json file and import it.
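A custom standard file might look something like this. The schema below is a guess for illustration only (field names such as `name` and `principles` are not taken from Quodeq's documentation), so check an exported standard from the library for the real format:

```json
{
  "name": "Team API Guidelines",
  "description": "House rules for service endpoints",
  "principles": [
    {
      "id": "no-unversioned-endpoints",
      "description": "Every public route must carry a version prefix such as /v1/"
    },
    {
      "id": "explicit-timeouts",
      "description": "All outbound HTTP calls must set an explicit timeout"
    }
  ]
}
```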


Development

git clone https://github.com/quodeq/quodeq.git && cd quodeq
uv sync
uv run pytest

Changelog

See CHANGELOG.md for release history.

License

MIT. See LICENSE.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

quodeq-1.0.5.tar.gz (26.3 MB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

quodeq-1.0.5-py3-none-any.whl (30.1 MB)

Uploaded Python 3

File details

Details for the file quodeq-1.0.5.tar.gz.

File metadata

  • Download URL: quodeq-1.0.5.tar.gz
  • Upload date:
  • Size: 26.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for quodeq-1.0.5.tar.gz

Algorithm    Hash digest
SHA256       d3765c3174df14de366576d2e6a7fba2818741ca3df651bdd8590152104472ef
MD5          626a95e68ce9061c57b5f61ee7090ef0
BLAKE2b-256  4279c0b7c94b6849b8c1580436cc4b1ab45fd43160e6028e19460e72c4eac092

See the PyPI documentation for more details on using file hashes.
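One way to check a download against the digests above is to hash it locally. A minimal sketch using Python's standard hashlib:

```python
import hashlib

def sha256_hex(path, chunk_size=1 << 16):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest published above, e.g.:
# sha256_hex("quodeq-1.0.5.tar.gz") == "d3765c3174df14de366576d2e6a7fba2818741ca3df651bdd8590152104472ef"
```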

Provenance

The following attestation bundles were made for quodeq-1.0.5.tar.gz:

Publisher: publish.yml on quodeq/quodeq

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file quodeq-1.0.5-py3-none-any.whl.

File metadata

  • Download URL: quodeq-1.0.5-py3-none-any.whl
  • Upload date:
  • Size: 30.1 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for quodeq-1.0.5-py3-none-any.whl

Algorithm    Hash digest
SHA256       384624807976582d46ca6479b66ba85a8f509e524fdc6af93df2097297447b7f
MD5          606aca1a9c845e4f32db20818578f30f
BLAKE2b-256  d1a0be262e5b58b743ef5635f2abb204cf26b70751decd11b869e2a4e474e3ab

See the PyPI documentation for more details on using file hashes.

Provenance

The following attestation bundles were made for quodeq-1.0.5-py3-none-any.whl:

Publisher: publish.yml on quodeq/quodeq

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
