Project description
datasette-extract
Import unstructured data (text and images) into structured tables
Installation
Install this plugin in the same environment as Datasette.
datasette install datasette-extract
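The datasette install command is a thin wrapper that runs pip install inside the environment Datasette itself runs in. If you manage that environment directly (for example inside a virtualenv), the equivalent manual install is:
# Install into the same environment that runs Datasette
pip install datasette-extract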
Configuration
This plugin uses the LLM library and works with any LLM provider that supports:
- Async models
- JSON schema-based structured output
The plugin automatically discovers available models and their required API keys. Only models with configured API keys will be shown to users.
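To check which of your installed models advertise schema support (and so are candidates for this plugin), recent versions of the LLM CLI can filter the model list. The --schemas option below comes from LLM's schema support documentation as I recall it; if your llm version predates it, plain llm models still lists every installed model:
# List installed models that support JSON schemas
llm models --schemas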
Setting up API Keys
You can configure API keys in two ways:
Option 1: Using environment variables
Set the appropriate environment variable before starting Datasette:
# For OpenAI
export DATASETTE_SECRETS_OPENAI_API_KEY="sk-..."
# For Anthropic
export DATASETTE_SECRETS_ANTHROPIC_API_KEY="sk-ant-..."
# For Gemini
export DATASETTE_SECRETS_GEMINI_API_KEY="..."
Option 2: Using the datasette-secrets UI
The plugin integrates with datasette-secrets to let users configure their own API keys through the web interface. Any schema-capable async model will automatically have its required API key registered as a configurable secret.
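If you use this option, datasette-secrets itself needs to be installed and configured, including a database in which to store encrypted secrets. The snippet below is a minimal sketch using the setting name I recall from the datasette-secrets README; confirm the details (including how to generate and set the encryption key) against that plugin's own documentation:
plugins:
  datasette-secrets:
    # Database used to store encrypted secrets (setting name per datasette-secrets docs)
    database: data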
Installing Model Providers
First install the LLM plugin for your chosen provider:
OpenAI (GPT-4o, GPT-4, etc.):
llm install llm-openai-plugin
Anthropic Claude:
llm install llm-anthropic
Google Gemini:
llm install llm-gemini
Other providers: See the LLM plugins directory for more options.
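Whichever provider you pick, you can confirm that LLM has registered the plugin and its models before starting Datasette:
# Show installed LLM plugins
llm plugins
# Show the models those plugins provide
llm models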
Starting Datasette
Once you've installed at least one LLM plugin and configured its API key, start Datasette:
DATASETTE_SECRETS_OPENAI_API_KEY="sk-..." datasette data.db --root --create
# Now click or command-click the URL containing .../-/auth-token?token=...
- The --root flag causes Datasette to output a link that will sign you in as root
- The --create flag will create the data.db SQLite database file if it does not exist
Restricting Available Models
By default, all schema-capable async models with configured API keys will be available. You can restrict this to specific models using the models setting:
plugins:
  datasette-extract:
    models:
    - gpt-4o-mini
    - claude-3-5-sonnet-latest
    - gemini-2.0-flash-exp
If you only list a single model, users will not see a model selector in the UI.
Usage
This plugin provides the following features:
- In the action cog menu for a database, select "Create table with extracted data" to create a new table populated with data extracted from text or an image
- In the action cog menu for a table, select "Extract data into this table" to extract data into an existing table
When creating a table you can specify the column names and types, and provide an optional hint (such as "YYYY-MM-DD" for dates) to influence how the data should be extracted.
When populating an existing table you can provide hints and select which columns should be populated.
Text input can be pasted directly into the textarea.
Drag and drop a PDF or text file onto the textarea to populate it with the contents of that file. PDF files will have their text extracted, but only if the file contains text as opposed to scanned images.
Drag and drop a single image onto the textarea - or select it with the image file input box - to process an image.
Permissions
Users must have the datasette-extract permission to use this tool.
To create tables they also need the create-table permission.
To insert rows into an existing table they need the insert-row permission.
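On Datasette 1.0 alphas these permissions can be granted in configuration using the top-level permissions block. A minimal sketch, assuming a hypothetical actor whose id is editor:
permissions:
  datasette-extract:
    id: editor
  create-table:
    id: editor
  insert-row:
    id: editor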
Development
The recommended way to develop this plugin uses uv. To run the tests:
cd datasette-extract
uv run pytest
To run a development server with an OpenAI API key (pulled from the LLM key store):
DATASETTE_SECRETS_OPENAI_API_KEY="$(llm keys get openai)" \
uv run datasette data.db --create --root --secret 1 \
-s plugins.datasette-extract.models '["gpt-4o-mini"]' \
--internal internal.db --reload
File details
Details for the file datasette_extract-0.2a1.tar.gz.
File metadata
- Download URL: datasette_extract-0.2a1.tar.gz
- Upload date:
- Size: 360.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | b79e5640e5bcbc19e40a464889a8d8586c7456089e4dedde55dd45c9fad081af |
| MD5 | cd5bf3293aa5696cf968936f9c30d4c2 |
| BLAKE2b-256 | bbabc3b8ac5dd38cc611c935b149fd04031411d3d9d0ca86afc5d440302ef3d0 |
Provenance
The following attestation bundles were made for datasette_extract-0.2a1.tar.gz:

Publisher: publish.yml on datasette/datasette-extract

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: datasette_extract-0.2a1.tar.gz
- Subject digest: b79e5640e5bcbc19e40a464889a8d8586c7456089e4dedde55dd45c9fad081af
- Sigstore transparency entry: 737674454
- Sigstore integration time:
- Permalink: datasette/datasette-extract@bf7cb3b8cbb3dffb1bee29ae97664d531b5634a7
- Branch / Tag: refs/tags/0.2a1
- Owner: https://github.com/datasette
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@bf7cb3b8cbb3dffb1bee29ae97664d531b5634a7
- Trigger Event: release
File details
Details for the file datasette_extract-0.2a1-py3-none-any.whl.
File metadata
- Download URL: datasette_extract-0.2a1-py3-none-any.whl
- Upload date:
- Size: 363.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | dca0011e82f07794b20f6d62277de389fd6c607a7242eef9fe536eebe2a59cef |
| MD5 | 846f193875db93fcf98e98b7dea32f9c |
| BLAKE2b-256 | 480813bb02136f486a723ce89215233fd5d2dfff3a7e60f25e2eefd1a2cb3b9c |
Provenance
The following attestation bundles were made for datasette_extract-0.2a1-py3-none-any.whl:

Publisher: publish.yml on datasette/datasette-extract

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: datasette_extract-0.2a1-py3-none-any.whl
- Subject digest: dca0011e82f07794b20f6d62277de389fd6c607a7242eef9fe536eebe2a59cef
- Sigstore transparency entry: 737674467
- Sigstore integration time:
- Permalink: datasette/datasette-extract@bf7cb3b8cbb3dffb1bee29ae97664d531b5634a7
- Branch / Tag: refs/tags/0.2a1
- Owner: https://github.com/datasette
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@bf7cb3b8cbb3dffb1bee29ae97664d531b5634a7
- Trigger Event: release