datasette-llm-sql-writer
Generate Datasette SQL queries using plain language and an LLM.
This plugin adds an "LLM SQL Writer" panel to Datasette pages that helps you write read‑only SQL (SELECT/CTE) from a natural‑language prompt. It keeps a per‑database chat history, lets you copy or re‑run previously generated queries, and respects Datasette’s built‑in SQL editor when present.
Installation
Install this plugin in the same environment as Datasette.
datasette install datasette-llm-sql-writer
Usage
- Start Datasette with a database
Provide any SQLite database file when starting Datasette. Navigate to a table page to see the LLM panel above the table. On non‑table pages the panel is inserted below the first page header so you can still generate and run queries.
datasette path/to/your.db -p 8001
# Then visit http://127.0.0.1:8001/your/tablename
- Configure the LLM model (optional but recommended)
By default, the plugin targets OpenAI and uses gpt-5-mini unless you override it. You can configure a model via Datasette metadata or an environment variable (see below):
metadata.json:
{
  "plugins": {
    "datasette-llm-sql-writer": {
      "model": "gpt-5-mini"
    }
  }
}
Then start Datasette with --metadata:
datasette path/to/your.db --metadata metadata.json -p 8001
- Use the panel
- Enter a natural‑language prompt in the panel.
- Click "Generate Only" to request SQL from the LLM and preview it in the panel. If the SQL editor is visible, the latest SQL is also inserted there.
- Click "Generate & Run" to generate (if needed) and immediately execute the SQL.
- Each assistant SQL card includes icon buttons:
- Copy (clipboard icon): copies the SQL to your clipboard.
- Run (triangle icon): runs that specific SQL—either by inserting it into the editor and submitting, or by redirecting to the query page if the editor is hidden.
Notes:
- The backend enforces that generated SQL is read‑only. See is_select_only() in datasette_llm_sql_writer/generator.py.
- The panel tracks whether the current prompt/SQL has already been run. If you change the prompt or the SQL in the editor, "Generate & Run" will regenerate before running.
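The read‑only guarantee boils down to accepting only SELECT/CTE statements. The plugin's real check lives in is_select_only(); as a rough sketch of the idea (not the plugin's actual code), such a gate might look like:

```python
import re

# Accept statements that begin with SELECT or a CTE (WITH ...).
READONLY_RE = re.compile(r"^\s*(SELECT|WITH)\b", re.IGNORECASE)

# Reject anything containing write/DDL keywords anywhere in the text.
FORBIDDEN_RE = re.compile(
    r"\b(INSERT|UPDATE|DELETE|DROP|ALTER|CREATE|REPLACE|ATTACH|PRAGMA|VACUUM)\b",
    re.IGNORECASE,
)

def is_select_only(sql: str) -> bool:
    """Return True only for single read-only statements.
    Illustrative sketch: a keyword filter like this can be overly strict
    (e.g. a string literal containing 'delete' would be rejected)."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # reject multi-statement input outright
        return False
    if not READONLY_RE.match(stripped):
        return False
    return not FORBIDDEN_RE.search(stripped)
```

A keyword blocklist is a blunt instrument; the safer production approach is pairing a check like this with SQLite-level protections such as a read-only connection.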
CAUTION CAUTION CAUTION - THIS GRANTS OTHERS USE OF YOUR LLM ACCOUNT
This plugin uses your API key to generate SQL queries. If you're using this plugin locally, that's exactly what you want. If you're using this plugin to share a dataset with others, be aware that you're letting them run up LLM bills on your behalf.
(In practice, these prompts will be effectively free, but it's still a good idea to be aware of this; someone sufficiently determined could probably find other uses for free API calls.)
Authentication and API Keys
This plugin relies on the llm package for model execution and authentication. To minimize friction and keep secrets management simple, you have two supported options:
- Use llm's built-in key store (best for local development):
  llm keys set openai
  This stores your key securely for your user account. The plugin will just work if llm can access the model you specify.
- Provide an environment variable (best for containers/CI):
  - Default env var: OPENAI_API_KEY
  - You can change which env var is read via metadata.json under this plugin's config:
{
  "plugins": {
    "datasette-llm-sql-writer": {
      "model": "anthropic/claude-sonnet-4-0",
      "env_api_key_var": "ANTHROPIC_API_KEY"
    }
  }
}
- No configuration file is needed to use the default model, gpt-5-mini.
Quick check:
export OPENAI_API_KEY="sk-..."
datasette path/to/your.db -p 8001
If you see authentication errors, see the Diagnostics section below.
Model selection
The model id is resolved with this precedence:
- metadata.json plugin config: plugins.datasette-llm-sql-writer.model
- Environment variable: LLM_SQL_WRITER_MODEL
- Default: gpt-5-mini (OpenAI)
Examples:
{
  "plugins": {
    "datasette-llm-sql-writer": {
      "model": "gpt-5-mini"
    }
  }
}
or
export LLM_SQL_WRITER_MODEL=gpt-5-mini
Configuration
- You need the llm package configured with an API key for your chosen provider. Install the model/provider plugin you want to use (e.g., OpenAI) per the llm docs.
- The default model id can be set in metadata.json as shown above; otherwise the plugin uses a placeholder and will error if the model does not exist in your llm setup.
Front‑end state and persistence
The panel persists small UI state and chat history in localStorage, scoped per database:
- State key: llm_sql_writer:state:v1:db:{db}
  - panelCollapsed (bool): whether the panel is collapsed.
  - lastPrompt (str): last prompt used to generate SQL.
  - lastSql (str): last SQL returned by the LLM.
  - lastRanSql (str): last SQL that was executed.
- History key: llm_sql_writer:history:v1:db:{db}
  - The chat history associated with that database (trimmed to a bounded length).
- Changes propagate across tabs via the storage event.
No secrets are stored in localStorage—API keys should be configured with the llm package on the server side.
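As an illustration (all values invented), the state object stored under llm_sql_writer:state:v1:db:mydb for a hypothetical database named mydb might look like:

```json
{
  "panelCollapsed": false,
  "lastPrompt": "count rows per category",
  "lastSql": "SELECT category, COUNT(*) AS n FROM items GROUP BY category",
  "lastRanSql": "SELECT category, COUNT(*) AS n FROM items GROUP BY category"
}
```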
Development
To set up this plugin locally, clone the repo and use uv to manage the environment and dependencies:
git clone https://github.com/etjones/datasette-llm-sql-writer
cd datasette-llm-sql-writer
uv venv
source .venv/bin/activate
uv sync --all-extras # installs runtime and test dependencies from pyproject.toml
Run the tests:
pytest -q
Launch Datasette against an example DB to try the panel:
datasette path/to/your.db -p 8001
# Visit http://127.0.0.1:8001/{db}/{table}
How it works
- Backend route: /-/llm-sql-writer/generate accepts {db, table, prompt, history} and returns {sql}.
- The LLM prompt includes the schema context (table/column names) and optional chat history.
- Returned SQL is validated to be read‑only before being emitted to the UI.
- On table pages the panel uses the Datasette 1.x JavaScript plugin panel API; on non‑table pages it is inserted under the main header.
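Given the request/response shape described above, a minimal client call could be sketched as follows. The endpoint path and field names ({db, table, prompt, history} in, {sql} out) are taken from this description; build_payload and generate_sql are illustrative helpers, not part of the plugin, and error/auth handling is omitted:

```python
import json
from urllib import request

def build_payload(db: str, table: str, prompt: str, history=None) -> bytes:
    """JSON body for the generate endpoint, per the shape described above."""
    return json.dumps({
        "db": db,
        "table": table,
        "prompt": prompt,
        "history": history or [],
    }).encode("utf-8")

def generate_sql(base_url: str, db: str, table: str, prompt: str, history=None) -> str:
    """POST to /-/llm-sql-writer/generate and return the generated SQL string."""
    req = request.Request(
        f"{base_url}/-/llm-sql-writer/generate",
        data=build_payload(db, table, prompt, history),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["sql"]
```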
Diagnostics
Visit http://localhost:8001/-/llm-sql-writer/diagnostics to quickly check whether:
- llm is installed
- The resolved model_id is available
- Your environment variable for the API key is present
The endpoint returns JSON with helpful hints and links to docs.
FAQ
- Can I put my API key in metadata.json?
  Nope. Keep secrets in llm's key store or environment variables.
- I get an error about an unknown or unavailable model
  Install the appropriate llm provider plugin and confirm the model/alias exists. For Claude:
  uv add llm-anthropic
  llm keys set anthropic
  See the llm docs for more info on provider plugins.
- I set a key but it still fails
  Ensure Datasette is running as the same user and virtual environment where you configured llm. If you use an env var, make sure it is set in the same process environment as the Datasette server.
- Local usage notes and privacy
  This plugin is commonly used locally. Be mindful that prompts and schema context are sent to the model provider. Use provider-side controls and review your data handling policies if you work with sensitive data.