
datasette-load


API and UI for bulk loading data into Datasette from a URL

Installation

Install this plugin in the same environment as Datasette.

datasette install datasette-load

Configuration

This plugin does not require configuration. By default it downloads files to the system temp directory and, once they have been verified as valid SQLite, swaps them into the current working directory.

The plugin provides two optional settings to control which directories are used:

plugins:
  datasette-load:
    staging_directory: /tmp
    database_directory: /home/location

staging_directory is used for the initial download. Files will be deleted from here if the download fails.

If the download succeeds (and the database integrity check passes) the file will be moved into the database_directory folder. This defaults to the directory in which the Datasette application was started if you do not otherwise configure it.
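The verify-then-move flow described above can be sketched in a few lines of standard-library Python. This is an illustration of the documented behavior, not the plugin's actual code; `install_database` is a hypothetical helper:

```python
import os
import sqlite3


def install_database(staged_path, database_dir, name):
    """Verify a staged download is valid SQLite, then move it into place.

    A sketch of the documented flow: failed files are deleted from
    staging, good files are moved into the database directory.
    """
    conn = sqlite3.connect(staged_path)
    try:
        result = conn.execute("PRAGMA integrity_check").fetchone()[0]
    except sqlite3.DatabaseError:
        # Not a SQLite file at all
        result = "not a SQLite database"
    finally:
        conn.close()
    if result != "ok":
        os.remove(staged_path)  # clean the failed download out of staging
        raise ValueError("integrity check failed: %s" % result)
    dest = os.path.join(database_dir, name + ".db")
    # os.replace is atomic when staging and destination share a filesystem
    os.replace(staged_path, dest)
    return dest
```

Using `os.replace` means readers never see a half-written database file in the destination directory.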

To enable WAL mode on the database once it has been saved to the database_directory include the enable_wal: true option:

plugins:
  datasette-load:
    database_directory: /home/location
    enable_wal: true
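In SQLite, WAL mode is a persistent property of the database file, set with a single pragma. This sketch shows the effect of the setting (an assumption about the mechanism, for illustration only):

```python
import sqlite3


def enable_wal(path):
    """Switch a SQLite database to write-ahead logging.

    The journal mode is stored in the database file itself, so once
    set it persists for every later connection.
    """
    conn = sqlite3.connect(path)
    try:
        mode = conn.execute("PRAGMA journal_mode=wal").fetchone()[0]
    finally:
        conn.close()
    return mode
```

WAL mode lets readers keep querying while a writer commits, which is why it is often enabled for databases served by a live Datasette instance.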

Usage

Users and API tokens with the datasette-load permission can visit /-/load, where they can trigger a download by providing the URL to a SQLite database file and the name it should use within Datasette.

You can assign that permission to the root user by starting Datasette like this:

datasette -s permissions.datasette-load.id root --root

Or with the following configuration in the datasette.yaml file passed to datasette -c:

permissions:
  datasette-load:
    id: root

API tokens with that permission can use this API:

POST /-/load
{"url": "https://s3.amazonaws.com/til.simonwillison.net/tils.db", "name": "tils"}

This tells Datasette to download the SQLite database from the given URL and use it to create (or replace) the /tils database in the Datasette instance.

You can optionally include additional HTTP headers to be used when fetching the URL:

POST /-/load
{
  "url": "https://example.com/db.sqlite",
  "name": "db",
  "headers": {"Authorization": "Bearer XXX"}
}
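To issue that POST from Python, a minimal standard-library sketch looks like this. The base URL and token are placeholder values you would supply for your own instance:

```python
import json
import urllib.request


def build_load_request(base_url, token, url, name, headers=None):
    """Build the POST request to /-/load.

    base_url and token are placeholders: your Datasette instance and
    an API token carrying the datasette-load permission.
    """
    payload = {"url": url, "name": name}
    if headers:
        payload["headers"] = headers
    return urllib.request.Request(
        base_url.rstrip("/") + "/-/load",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": "Bearer %s" % token,
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Sending it (not executed here):
# response = urllib.request.urlopen(build_load_request(
#     "https://blah.datasette", "your-token",
#     "https://s3.amazonaws.com/til.simonwillison.net/tils.db", "tils"))
```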

That API endpoint returns:

{
  "id": "1D2A2328-199E-4D4D-AF3B-967131ADB795",
  "url": "https://s3.amazonaws.com/til.simonwillison.net/tils.db",
  "name": "tils",
  "done": false,
  "error": null,
  "todo_bytes": 20250624,
  "done_bytes": 0,
  "status_url": "https://blah.datasette/-/load/status/1D2A2328-199E-4D4D-AF3B-967131ADB795"
}

The status_url can be polled for completion. It will return the same JSON format.

When the download has finished the API will return "done": true and either "error": null if it worked or "error": "error description" if something went wrong.
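A client can interpret each polled payload with a couple of small helpers. This is a hedged sketch built only from the fields shown above; the function names are hypothetical:

```python
def interpret_status(status):
    """Interpret one status payload from the status_url.

    Returns True when the load finished successfully, False while it
    is still in progress, and raises if the load failed.
    """
    if status["done"]:
        if status["error"]:
            raise RuntimeError("load failed: %s" % status["error"])
        return True
    return False


def progress(status):
    """Fraction of the download completed, 0.0 if the total is unknown."""
    if not status["todo_bytes"]:
        return 0.0
    return status["done_bytes"] / status["todo_bytes"]


# A polling loop would then look like (not executed here):
# while not interpret_status(fetch_status_json(status_url)):
#     time.sleep(2)  # arbitrary poll interval
```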

Zip support

The URL can point to either a SQLite database file or a zip file containing one. If a zip file is provided, the largest file in the archive is extracted and used (after verifying it is a valid SQLite database). For security, the plugin rejects zip files whose largest file would extract to more than 5x the size of the zip file itself.
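The member selection and the 5x expansion limit can be sketched with the standard zipfile module. This illustrates the documented rule and is not the plugin's actual code:

```python
import io
import zipfile

MAX_RATIO = 5  # documented limit: largest member at most 5x the zip's size


def pick_database_member(zip_bytes):
    """Return the name of the largest file in the archive.

    Raises ValueError if that file would expand past the 5x limit,
    the documented defense against zip bombs.
    """
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        members = [i for i in zf.infolist() if not i.is_dir()]
        largest = max(members, key=lambda i: i.file_size)
    if largest.file_size > MAX_RATIO * len(zip_bytes):
        raise ValueError(
            "refusing zip: %s would extract to %d bytes"
            % (largest.filename, largest.file_size)
        )
    return largest.filename
```

The check uses the uncompressed sizes declared in the archive's index, so no member has to be extracted before the decision is made.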

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd datasette-load
python -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

python -m pytest
