
datasette-extract


Import unstructured data (text and images) into structured tables

Installation

Install this plugin in the same environment as Datasette.

datasette install datasette-extract

Configuration

This plugin uses the LLM library and works with any LLM provider that supports:

  • Async models
  • JSON schema-based structured output

The plugin automatically discovers available models and their required API keys. Only models with configured API keys will be shown to users.

Setting up API Keys

You can configure API keys in two ways:

Option 1: Using environment variables

Set the appropriate environment variable before starting Datasette:

# For OpenAI
export DATASETTE_SECRETS_OPENAI_API_KEY="sk-..."

# For Anthropic
export DATASETTE_SECRETS_ANTHROPIC_API_KEY="sk-ant-..."

# For Gemini
export DATASETTE_SECRETS_GEMINI_API_KEY="..."

Option 2: Using the datasette-secrets UI

The plugin integrates with datasette-secrets to let users configure their own API keys through the web interface. Any schema-capable async model will automatically have its required API key registered as a configurable secret.
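A minimal sketch of granting access to that UI, assuming the `manage-secrets` permission name from the datasette-secrets documentation and a hypothetical actor id of `admin` (adjust both to your setup); datasette-secrets stores keys in Datasette's internal database, so start Datasette with `--internal internal.db`:

```yaml
permissions:
  manage-secrets:
    id: admin
```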

Installing Model Providers

First install the LLM plugin for your chosen provider:

OpenAI (GPT-4o, GPT-4, etc.):

llm install llm-openai-plugin

Anthropic Claude:

llm install llm-anthropic

Google Gemini:

llm install llm-gemini

Other providers: See the LLM plugins directory for more options.

Starting Datasette

Once you've installed at least one LLM plugin and configured its API key, start Datasette:

DATASETTE_SECRETS_OPENAI_API_KEY="sk-..." datasette data.db --root --create
# Now click or command-click the URL containing .../-/auth-token?token=...
  • The --root flag causes Datasette to output a link that will sign you in as root
  • The --create flag will create the data.db SQLite database file if it does not exist

Restricting Available Models

By default, all schema-capable async models with configured API keys will be available. You can restrict this to specific models using the models setting:

plugins:
  datasette-extract:
    models:
      - gpt-4o-mini
      - claude-3-5-sonnet-latest
      - gemini-2.0-flash-exp

If you only list a single model, users will not see a model selector in the UI.

Usage

This plugin provides the following features:

  • In the action cog menu for a database, select "Create table with extracted data" to create a new table populated with data extracted from text or an image
  • In the action cog menu for a table, select "Extract data into this table" to extract data into that existing table

When creating a table you can specify the column names and types, and provide an optional hint (like "YYYY-MM-DD" for dates) to influence how the data should be extracted.
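Conceptually, those column names, types and hints become a JSON schema for the model's structured output. This sketch is illustrative only, not the plugin's actual code; the function name and schema shape are assumptions:

```python
import json


def build_extraction_schema(columns):
    """Build a JSON schema for extracting rows with the given columns.

    columns: list of (name, sqlite_type, hint) tuples; hint may be None.
    A hint becomes the property's description, nudging the model toward
    the desired format (e.g. "YYYY-MM-DD" for a date column).
    """
    type_map = {"text": "string", "integer": "integer", "float": "number"}
    properties = {}
    for name, coltype, hint in columns:
        prop = {"type": type_map.get(coltype, "string")}
        if hint:
            prop["description"] = hint
        properties[name] = prop
    # Ask the model for an array of row objects matching the columns
    return {
        "type": "object",
        "properties": {
            "rows": {
                "type": "array",
                "items": {"type": "object", "properties": properties},
            }
        },
    }


schema = build_extraction_schema(
    [("name", "text", None), ("date", "text", "YYYY-MM-DD")]
)
print(json.dumps(schema, indent=2))
```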

When populating an existing table you can provide hints and select which columns should be populated.

Text input can be pasted directly into the textarea.

Drag and drop a PDF or text file onto the textarea to populate it with that file's contents. Text is extracted from PDFs only if they contain embedded text, as opposed to scanned images.

Drag and drop a single image onto the textarea - or select it with the image file input box - to process an image.

Permissions

Users must have the datasette-extract permission to use this tool.

To create new tables they also need the create-table permission, and to insert rows into an existing table they need insert-row.
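One way to grant all three permissions to a specific user, sketched using Datasette's `permissions:` configuration block with a hypothetical actor id of `editor`:

```yaml
permissions:
  datasette-extract:
    id: editor
  create-table:
    id: editor
  insert-row:
    id: editor
```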

Development

The recommended way to develop this plugin uses uv. To run the tests:

cd datasette-extract
uv run pytest

To run a development server with an OpenAI API key (pulled from the LLM key store):

DATASETTE_SECRETS_OPENAI_API_KEY="$(llm keys get openai)" \
  uv run datasette data.db --create --root --secret 1 \
  -s plugins.datasette-extract.models '["gpt-4o-mini"]' \
  --internal internal.db --reload
