
Source code quality evaluation platform powered by AI


Quodeq

AI-powered code quality and security scanner

v1.0.6


Watch the 2-min demo · Website · Blog · Releases


AI models can now autonomously find and exploit zero-day vulnerabilities across operating systems, browsers, and web applications. Thousands of previously unknown flaws uncovered in weeks, not years.

The code you ship today will be read by models that can spot what humans miss. But the tools to prepare for this are locked behind enterprise contracts and partner programs.

Quodeq exists to change that.

Open source. MIT license. Runs locally. No telemetry. No account. No servers.

Scans any codebase with AI across six quality dimensions from ISO 25010: Security, Reliability, Maintainability, Performance, Flexibility, and Usability.

Every finding maps to a CWE identifier. You get grades, violations with line numbers, and a fix plan. Cloud providers (Claude, Gemini, Codex) for speed. Local models via Ollama for privacy.


What It Finds

CRITICAL  src/db.py:15        SQL Injection via string concatenation     CWE-89
          query = f"SELECT * FROM users WHERE id = {user_id}"

HIGH      src/auth.py:42      Hardcoded credentials in source code       CWE-798
          credentials = {"user": "admin", "pass": "secret123"}

MEDIUM    src/api.py:88       Missing rate limiting on login endpoint    CWE-307
          @app.route("/login", methods=["POST"])

MINOR     src/utils.py:23     Bare except clause hides errors            CWE-396
          except: pass

Each finding includes a reason, the offending code, and a fix plan. Results are stored as JSON on your machine.
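Since results land as JSON on disk, they are easy to post-process. As an illustration, here is a few lines of Python filtering findings by severity; the field names (severity, file, line, title, cwe) are assumptions for this sketch, not Quodeq's documented schema:

```python
import json

# Hypothetical finding records in JSONL form. The actual schema Quodeq
# writes is not reproduced here; these keys are illustrative assumptions.
raw = """\
{"severity": "CRITICAL", "file": "src/db.py", "line": 15, "title": "SQL Injection via string concatenation", "cwe": "CWE-89"}
{"severity": "MINOR", "file": "src/utils.py", "line": 23, "title": "Bare except clause hides errors", "cwe": "CWE-396"}
"""

findings = [json.loads(line) for line in raw.splitlines()]

# Keep only the most severe findings for triage.
critical = [f for f in findings if f["severity"] == "CRITICAL"]
for f in critical:
    print(f'{f["file"]}:{f["line"]}  {f["title"]}  ({f["cwe"]})')
```

The same pattern works for grouping by CWE or by file when building a triage report.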


Getting Started

pipx install quodeq    # Install quodeq
quodeq                 # Launch the dashboard

Running quodeq opens the dashboard, where you can point to any project and run evaluations from the UI. Also available via pip install quodeq.

Requirements: Python 3.12+ and Node.js 18+ (for the dashboard UI).

macOS App (beta)

Download the .dmg from Releases, open it, and drag Quodeq.app to Applications. On first launch:

xattr -cr /Applications/Quodeq.app    # Required for unsigned apps

Or right-click the app, select Open, then click Open in the dialog.


Dashboard

Quodeq Dashboard


  • Per-dimension grades and scores: A-F letter grades, numeric scores, and trends across runs
  • Violations explorer to drill into findings by file, principle, or CWE classification
  • Code map showing a visual heatmap of where issues concentrate in your codebase
  • Custom standards to create your own evaluation dimensions or import from the library

Click any dimension, file, or principle to explore the details. Dismiss false positives directly from the UI.

Running quodeq is equivalent to quodeq dashboard. Both open the same UI.

CLI

quodeq evaluate /path/to/project
quodeq evaluate /path/to/project --scope src/api    # Scoped to a subdirectory
quodeq evaluate /path/to/project -d security        # Single dimension
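The same commands can be scripted, for example from CI, with a subprocess call. The flags mirror the CLI usage above; the project path is a placeholder:

```python
import subprocess

# Build the command shown in the CLI section; "-d security" limits the
# run to a single dimension. The project path is a placeholder.
cmd = ["quodeq", "evaluate", "/path/to/project", "-d", "security"]

# Uncomment on a machine where quodeq is installed:
# subprocess.run(cmd, check=True)
print(" ".join(cmd))
```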

AI Providers

Choose what fits your workflow. Configure in Settings from the dashboard.

Provider      Type    Notes
Ollama        Local   Free, private, code never leaves your machine
Claude Code   Cloud   Best balance of speed, quality, and cost
Codex CLI     Cloud   OpenAI models
Gemini CLI    Cloud   Google models

For local analysis we recommend Gemma 4 (gemma4:26b). Reducing the context window to 32k still gives good results and allows running multiple subagents in parallel.


How It Works

  1. Detect languages, frameworks, and project structure
  2. Analyze with AI agents that read the code using read-only tools
  3. Collect findings as structured JSONL via tool calls
  4. Score against ISO 25010 principles with CWE classifications
  5. Report per-dimension grades, violations, compliance, and fix plans

Results are stored in ~/.quodeq/evaluations/ and persist across sessions. Works with any language. The AI analysis engine reads and understands code regardless of the tech stack.

Quodeq scores each principle on a 0 to 10 scale using four independent constraints. Full details in the scoring formula documentation.
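As a hedged sketch of the last step, here is one way a 0-10 principle score could map to the A-F letter grades shown in the dashboard. The cutoffs below are illustrative assumptions, not Quodeq's actual thresholds; those live in the scoring formula documentation:

```python
def letter_grade(score: float) -> str:
    """Map a 0-10 score to an A-F letter grade.

    The thresholds here are illustrative assumptions -- Quodeq's real
    cutoffs are defined by its scoring formula documentation.
    """
    if not 0 <= score <= 10:
        raise ValueError("score must be between 0 and 10")
    for cutoff, grade in [(9, "A"), (8, "B"), (7, "C"), (6, "D")]:
        if score >= cutoff:
            return grade
    return "F"

print(letter_grade(9.2))  # A
print(letter_grade(5.0))  # F
```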

Standards

By default, Quodeq evaluates the six ISO 25010 dimensions. It also ships with Clean Architecture and Domain-Driven Design standards. You can create your own from the dashboard, or ask any AI to generate one as a .json file and import it.
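For instance, a custom standard could start life as a small JSON file like the one below. The keys (name, description, principles) are illustrative guesses for this sketch; the real import schema is whatever the dashboard's custom-standards feature expects:

```python
import json

# A minimal custom-standard file, sketched under assumed field names.
standard = {
    "name": "Team API Guidelines",
    "description": "House rules for our HTTP services",
    "principles": [
        {"id": "no-bare-except", "title": "Never use bare except clauses"},
        {"id": "rate-limit-auth", "title": "Rate-limit authentication endpoints"},
    ],
}

# Write it out for import from the dashboard.
with open("team-api-guidelines.json", "w") as fh:
    json.dump(standard, fh, indent=2)
```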


Development

git clone https://github.com/quodeq/quodeq.git && cd quodeq
uv sync
uv run pytest

Changelog

See CHANGELOG.md for release history.

License

MIT. See LICENSE.



Download files

Download the file for your platform.

Source Distribution

quodeq-1.0.6.tar.gz (26.3 MB)


Built Distribution


quodeq-1.0.6-py3-none-any.whl (861.4 kB)


File details

Details for the file quodeq-1.0.6.tar.gz.

File metadata

  • Download URL: quodeq-1.0.6.tar.gz
  • Upload date:
  • Size: 26.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for quodeq-1.0.6.tar.gz
Algorithm    Hash digest
SHA256       662ec29335b30486bb96b24a6c5e7755e04aef3ee0ad58024142675960b90b7f
MD5          c11c69635d9afa16e51c005ce72f0bbe
BLAKE2b-256  caf5c16d95614755bd75fb3013929fc2f1af53e76dff46b6690d4e8348155e8d


Provenance

The following attestation bundles were made for quodeq-1.0.6.tar.gz:

Publisher: publish.yml on quodeq/quodeq

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file quodeq-1.0.6-py3-none-any.whl.

File metadata

  • Download URL: quodeq-1.0.6-py3-none-any.whl
  • Upload date:
  • Size: 861.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for quodeq-1.0.6-py3-none-any.whl
Algorithm    Hash digest
SHA256       e0217778b2bd0906221aec2fb8f93912ac50ec49e5b8e4b39bd3d76ac8344561
MD5          d6ebe26463ef92025ce5abd05f154d78
BLAKE2b-256  8d5222613f41454cc48e074bdcaa33715c736c2c656c8a865184b88fd3c9ae3e


Provenance

The following attestation bundles were made for quodeq-1.0.6-py3-none-any.whl:

Publisher: publish.yml on quodeq/quodeq

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
