Remote SQL database agent skills (Postgres + MySQL + SQLite) for Concinno — SQLAlchemy + psycopg + pymysql. DuckDbQuery (local files) in concinno core remains separate.

Project description

concinno-skills-sql

Remote SQL database skills for Concinno. Postgres, MySQL, and local SQLite via SQLAlchemy — no ORM overhead, parameterised :name bind placeholders, DDL firewall, readonly-URL toggle.

Status

MVP (0.1.0) — three tools covering the canonical "connect to a real DB" agent need. Concinno core already ships DuckDbQuery for in-process analytical SQL over local CSV / Parquet / JSON files; this sub-package is the orthogonal "talk to a running DB server" slot.

| Tool | Library | Purpose |
| --- | --- | --- |
| PostgresQuery | psycopg[binary] + sqlalchemy | SELECT / INSERT / UPDATE / DELETE against remote Postgres |
| MysqlQuery | pymysql + sqlalchemy | Same, against MySQL / MariaDB |
| SqliteQuery | stdlib sqlite3 + sqlalchemy | Same, against a local .db file or :memory: |

Install

pip install concinno-skills-sql

All three drivers come in as hard dependencies. If you know you only need one, follow the optional-install path below.

License notes — psycopg[binary] is LGPL-3.0

This package itself is Apache-2.0. However psycopg (the Postgres driver) is licensed under LGPL-3.0. We depend on it as a pip dependency — a dynamic link — so the re-link rights the LGPL requires are preserved for any downstream consumer that pip-installs SQL skills the normal way.

If you statically bundle this package into a single-file binary (e.g. PyInstaller --onefile) the bundle's redistribution terms must satisfy LGPL-3.0 §6 for the psycopg code path specifically. The upstream psycopg project lives at https://github.com/psycopg/psycopg.

Consumers who need a pure-MIT install path can skip the Postgres driver entirely:

pip install --no-deps concinno-skills-sql
pip install 'sqlalchemy>=2.0' 'pymysql>=1.1' 'concinno>=2.15.1'
# Only SqliteQuery and MysqlQuery work; PostgresQuery.call returns a
# friendly "psycopg not installed" error.

Safety

Every tool routes through a shared _safety module before touching the database:

  1. Connection URL scheme allow-list — only postgresql[+driver], mysql[+driver], and sqlite are accepted. ftp:// / file:// / bare strings / foreign dialects are rejected up-front.
  2. SQLite path guard — sqlite:///... paths must resolve under Path.home() or Path.cwd() and may not live under system roots (/etc, /Windows, /System, /Program Files, …). :memory: is allowed unconditionally.
  3. DDL firewall — DROP, TRUNCATE, ALTER, CREATE, GRANT, REVOKE are regex-rejected in both query and exec paths. Use your DB's migration tool (Alembic / Flyway / Liquibase / raw psql) for schema changes — agent tooling is the wrong surface.
  4. Readonly toggle — append ?readonly=1 to the connection URL to make exec refuse DML, even if the connecting DB user has write privileges. Defense-in-depth.
  5. Row limit — query caps at limit=100 by default, hard max 10_000. Larger pagination must use OFFSET / LIMIT inside the SQL statement.
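As a sketch of what the keyword rejection in step 3 could look like (illustrative only; the package's actual regex is not shown here):

```python
import re

# Hypothetical DDL firewall: reject statements containing schema-changing
# keywords. The package's real pattern may be stricter than this sketch.
_DDL_RE = re.compile(r"\b(DROP|TRUNCATE|ALTER|CREATE|GRANT|REVOKE)\b", re.IGNORECASE)

def check_sql(sql: str):
    """Return an error dict for DDL statements, None for SELECT/DML."""
    match = _DDL_RE.search(sql)
    if match:
        return {"error": f"DDL keyword {match.group(1).upper()!r} rejected; use a migration tool"}
    return None
```

Note the `\b` word boundaries: a column named `created_at` does not trip the `CREATE` check, but `create table` in any case does.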

Parameter binding (:name placeholders with a params dict) is the real defense against SQL injection. The guards above are belt-and-braces on top of that, catching the common LLM mistake of f-stringing user input into the SQL body.
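Why binding matters is easy to demonstrate with the stdlib sqlite3 module, which supports the same :name placeholder style (this demo uses raw sqlite3 directly, not the package's tools):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

# Hostile input that would break out of an f-stringed query.
user_input = "x' OR '1'='1"

# Safe: the driver binds :name as data, never as SQL text.
rows = conn.execute(
    "SELECT id FROM users WHERE name = :name", {"name": user_input}
).fetchall()
assert rows == []  # no match; the quote characters are inert

# Unsafe: f-stringing the input makes the OR clause part of the SQL.
rows = conn.execute(f"SELECT id FROM users WHERE name = '{user_input}'").fetchall()
assert rows == [(1,)]  # injection succeeded: every row matches
```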

Credentials

No CredentialStore integration inside this package — connection URLs are supplied per call, which is the realistic shape for agents that juggle multiple databases in a single session.

Callers who prefer a credential indirection:

import os
from concinno_skills_sql import PostgresQuery

PostgresQuery().call(
    action="query",
    connection_url=os.environ["DATABASE_URL"],
    sql="SELECT * FROM users WHERE id = :id",
    params={"id": 42},
)

Or, via Concinno's CredentialStore:

from concinno.core.credentials import CredentialStore

cs = CredentialStore()
url = cs.resolve({"$ref": "env:DATABASE_URL"})
PostgresQuery().call(action="query", connection_url=url, sql="...")

Keeping URL resolution outside the tool means the audit trail of "which call touched which DB" lives in the caller's log, not hidden inside the Tool.
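If that caller-side audit log records connection URLs, the password should be redacted first. A minimal sketch using stdlib urllib.parse (the redact_url helper is ours, not part of this package):

```python
from urllib.parse import urlsplit, urlunsplit

def redact_url(url: str) -> str:
    """Replace the password in a DB connection URL with *** for logging."""
    parts = urlsplit(url)
    if not parts.password:
        return url  # nothing to redact (e.g. sqlite:/// paths)
    netloc = parts.netloc.replace(f":{parts.password}@", ":***@", 1)
    return urlunsplit(parts._replace(netloc=netloc))
```

The caller logs `redact_url(url)` alongside the SQL text, keeping the "which call touched which DB" trail without leaking credentials.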

Usage via Concinno ToolRegistry

When the consumer sets CONCINNO_LOAD_PLUGINS=1, the default registry auto-mounts all three tools:

import os
os.environ["CONCINNO_LOAD_PLUGINS"] = "1"

from concinno.tools.registry import get_default_registry

reg = get_default_registry()
for name in ("PostgresQuery", "MysqlQuery", "SqliteQuery"):
    assert name in reg.list_deferred()

Direct Python usage

from concinno_skills_sql import PostgresQuery, MysqlQuery, SqliteQuery

# Postgres — SELECT with a bind parameter.
PostgresQuery().call(
    action="query",
    connection_url="postgresql+psycopg://user:pw@db.internal:5432/app",
    sql="SELECT id, name, email FROM users WHERE signup_date > :since",
    params={"since": "2026-04-01"},
    limit=500,
)
# → {"rows": [...], "count": ..., "columns": ["id", "name", "email"]}

# MySQL — INSERT with bind parameters (a ?readonly=1 URL would refuse this DML).
MysqlQuery().call(
    action="exec",
    connection_url="mysql+pymysql://u:pw@db:3306/app",
    sql="INSERT INTO audit_log (user_id, event) VALUES (:uid, :ev)",
    params={"uid": 42, "ev": "login"},
)
# → {"ok": True, "rowcount": 1}

# SQLite — a local application DB under ~/myapp/state.db, opened via a readonly URL.
SqliteQuery().call(
    action="query",
    connection_url="sqlite:///~/myapp/state.db?readonly=1",
    sql="SELECT key, value FROM kv WHERE key LIKE :pat",
    params={"pat": "user:%"},
)

All tools return either {"ok": True, ...} / {"rows": ..., "count": ..., "columns": ...} on success or {"error": "..."} on any validation or library failure — matching the shape of other Concinno built-in tools.
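Because every outcome is a plain dict, a caller can branch on the presence of the "error" key. A small illustrative helper (`unwrap` is our name, not a package API):

```python
def unwrap(result: dict) -> dict:
    """Raise on the {"error": ...} shape; pass success shapes through unchanged."""
    if "error" in result:
        raise RuntimeError(f"SQL tool failed: {result['error']}")
    return result

# Success shapes flow through; the error shape becomes an exception.
ok = unwrap({"rows": [(1, "alice")], "count": 1, "columns": ["id", "name"]})
```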

What this package is NOT

  • Not a replacement for concinno.DuckDbQuery — that one targets analytical queries over CSV / Parquet / JSON files via read_csv_auto-style pseudo-tables. Use DuckDbQuery when the data is in files; use these tools when the data is in a running DB server.
  • Not a migration tool — DDL is rejected by design. Use Alembic / Flyway / Liquibase / hand-written psql for schema changes.
  • Not a full ORM surface — intentionally SQL-in, rows-out. Agents needing a mapped model should call SQLAlchemy Core / ORM directly in a hand-written Tool.
  • Not a secrets store — connection_url is supplied per call.

License

Apache-2.0. See LICENSE in the Concinno monorepo.

Download files

Source Distribution

concinno_skills_sql-0.1.0.tar.gz (42.0 kB)

Built Distribution

concinno_skills_sql-0.1.0-py3-none-any.whl (19.4 kB)

File details

Details for the file concinno_skills_sql-0.1.0.tar.gz.

File metadata

  • Size: 42.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.9

File hashes

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | b5e139d0877508547728f72edb216600db59ecc2c3f0f815ef3ce29d77bf522c |
| MD5 | 8a5086ce8fb15dd46fc3b2ea8fefa73e |
| BLAKE2b-256 | 2ce90427c143843ca58d38aa310f1badfa031e5e66489b505fac6548df892447 |

File details

Details for the file concinno_skills_sql-0.1.0-py3-none-any.whl.

File hashes

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 9339d598b3242f1bd50158d1d3e988e3c0c62c69e0a4f2a9a9eab4f6347f9272 |
| MD5 | 07cbc649d87094ec29240f882d0df83e |
| BLAKE2b-256 | 9d4e45240315cd778e3665c46bc97ab012db16d9e94654055006cab9ab5d169d |
