
datasette-load


API and UI for bulk loading data into Datasette from a URL

Installation

Install this plugin in the same environment as Datasette.

datasette install datasette-load

Configuration

This plugin does not require configuration - by default it downloads files to the system temp directory and swaps them into the current working directory once they have been verified as valid SQLite.

The plugin provides two optional settings to control which directories are used here:

plugins:
  datasette-load:
    staging_directory: /tmp
    database_directory: /home/location

staging_directory is used for the initial download. Files will be deleted from here if the download fails.

If the download succeeds (and the database integrity check passes) the file will be moved into the database_directory folder. This defaults to the directory in which the Datasette application was started if you do not otherwise configure it.
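The staging-then-swap flow described above can be sketched roughly as follows. This is a simplified illustration, not the plugin's actual code; the verify_and_install function and the name.db filename convention are assumptions for the example:

```python
import shutil
import sqlite3
from pathlib import Path


def verify_and_install(staged_path, database_directory, name):
    """Check that a downloaded file is a healthy SQLite database,
    then move it into the serving directory. Raises on failure so
    the caller can delete the staged file."""
    conn = sqlite3.connect(staged_path)
    try:
        # PRAGMA integrity_check returns a single row of "ok" when healthy
        result = conn.execute("PRAGMA integrity_check").fetchone()[0]
    finally:
        conn.close()
    if result != "ok":
        raise ValueError(f"Integrity check failed: {result}")
    destination = Path(database_directory) / f"{name}.db"
    shutil.move(str(staged_path), destination)
    return destination
```

Moving the file only after the integrity check passes means a failed or corrupted download never replaces a database that Datasette is already serving.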

To enable WAL mode on the database once it has been saved to the database_directory include the enable_wal: true option:

plugins:
  datasette-load:
    database_directory: /home/location
    enable_wal: true
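Enabling WAL mode amounts to a single PRAGMA run against the installed database file. A minimal sketch (the enable_wal function name is invented for this example; the journal mode is persistent, recorded in the database file itself):

```python
import sqlite3


def enable_wal(db_path):
    """Switch a SQLite database to write-ahead logging mode.
    Returns the resulting journal mode ("wal" on success)."""
    conn = sqlite3.connect(db_path)
    try:
        mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
    finally:
        conn.close()
    return mode
```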

Usage

Users and API tokens with the datasette-load permission can visit /-/load, where they can provide the URL of a SQLite database file and the name it should use within Datasette, triggering a download of that database.

You can assign that permission to the root user by starting Datasette like this:

datasette -s permissions.datasette-load.id root --root

Or by adding the following to the configuration file passed to datasette -c datasette.yaml:

permissions:
  datasette-load:
    id: root

API tokens with that permission can use this API:

POST /-/load
{"url": "https://s3.amazonaws.com/til.simonwillison.net/tils.db", "name": "tils"}

You can optionally include additional HTTP headers to be used when fetching the URL:

POST /-/load
{
  "url": "https://example.com/db.sqlite",
  "name": "db",
  "headers": {"Authorization": "Bearer XXX"}
}
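From a client's point of view, the call above is an authenticated JSON POST. A sketch of building such a request with the standard library (make_load_request is a hypothetical helper, not part of the plugin; it constructs the request without sending it):

```python
import json
import urllib.request


def make_load_request(base_url, api_token, url, name, fetch_headers=None):
    """Build (but do not send) a POST request to the /-/load endpoint,
    authenticated with a Datasette API token. fetch_headers, if given,
    are extra HTTP headers Datasette should use when fetching the URL."""
    body = {"url": url, "name": name}
    if fetch_headers:
        body["headers"] = fetch_headers
    return urllib.request.Request(
        base_url.rstrip("/") + "/-/load",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

The returned request can then be dispatched with urllib.request.urlopen (or any HTTP client of your choice).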

Each request tells Datasette to download the SQLite database from the given URL and use it to create (or replace) a database of that name in the instance - the first example above creates (or replaces) /tils.

That API endpoint returns:

{
  "id": "1D2A2328-199E-4D4D-AF3B-967131ADB795",
  "url": "https://s3.amazonaws.com/til.simonwillison.net/tils.db",
  "name": "tils",
  "done": false,
  "error": null,
  "todo_bytes": 20250624,
  "done_bytes": 0,
  "status_url": "https://blah.datasette/-/load/status/1D2A2328-199E-4D4D-AF3B-967131ADB795"
}

The status_url can be polled for completion. It will return the same JSON format.

When the download has finished the API will return "done": true and either "error": null if it worked or "error": "error description" if something went wrong.
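A client polling loop for this API might look like the sketch below. The poll_until_done helper is an illustration, not part of the plugin; the status-fetching callable is injected so any HTTP client can be used:

```python
import time


def poll_until_done(fetch_status, interval=1.0, max_attempts=60):
    """Poll a load job until it reports "done": true.

    fetch_status is any callable returning the status JSON as a dict
    (for example, one that GETs the status_url). Returns the final
    status dict; raises RuntimeError if the load reported an error."""
    for _ in range(max_attempts):
        status = fetch_status()
        if status["done"]:
            if status["error"]:
                raise RuntimeError(status["error"])
            return status
        time.sleep(interval)
    raise TimeoutError("Load did not complete in time")
```

The done_bytes and todo_bytes fields in each response could also be used to display a progress bar while waiting.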

Zip support

The URL can point to either a SQLite database file or a zip file containing a SQLite database - if a zip file is provided, the largest file in the archive will be extracted and used (after verifying it is a valid SQLite database). For security, the plugin will reject zip files where the largest file would extract to more than 5x the size of the zip file itself.
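The zip-bomb guard described above can be expressed as a simple check on the archive's metadata before anything is extracted. A sketch of the idea (assumed logic with invented names; the plugin's actual implementation may differ):

```python
import os
import zipfile

# Every SQLite database file begins with this 16-byte header
SQLITE_MAGIC = b"SQLite format 3\x00"


def pick_database_member(zip_path, max_ratio=5):
    """Return the name of the largest member of a zip archive,
    rejecting archives where that member would expand to more than
    max_ratio times the size of the zip file itself."""
    zip_size = os.path.getsize(zip_path)
    with zipfile.ZipFile(zip_path) as zf:
        largest = max(zf.infolist(), key=lambda info: info.file_size)
        if largest.file_size > max_ratio * zip_size:
            raise ValueError("Refusing to extract: archive expands more than 5x")
        # Peek at the first 16 bytes to confirm it is a SQLite database
        with zf.open(largest) as f:
            if f.read(16) != SQLITE_MAGIC:
                raise ValueError("Largest member is not a SQLite database")
    return largest.filename
```

Because zip metadata records each member's uncompressed size, the ratio check costs nothing: the bomb is rejected before a single byte is inflated.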

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd datasette-load
python -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

python -m pytest
