
datasette-upload-dbs


Upload SQLite database files to Datasette

Installation

Install this plugin in the same environment as Datasette.

datasette install datasette-upload-dbs

Configuration

This plugin requires you to configure a directory in which uploaded files will be stored.

On startup, Datasette will automatically load any SQLite files that it finds in that directory. This means it is safe to restart your server in between file uploads.
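As a sketch of what that startup scan involves (an illustration, not the plugin's actual code), SQLite database files can be recognized by their fixed 16-byte header:

```python
from pathlib import Path

# Every SQLite 3 database file begins with these exact 16 bytes.
SQLITE_HEADER = b"SQLite format 3\x00"

def find_sqlite_files(directory):
    """Return paths in `directory` whose contents start with the SQLite header."""
    found = []
    for path in sorted(Path(directory).iterdir()):
        if not path.is_file():
            continue
        with open(path, "rb") as f:
            if f.read(16) == SQLITE_HEADER:
                found.append(path)
    return found
```

The function name here is my own; the plugin performs an equivalent scan internally when Datasette starts.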

To configure the directory as /home/datasette/uploads, add this to a metadata.yml configuration file:

plugins:
  datasette-upload-dbs:
    directory: /home/datasette/uploads

Or if you are using metadata.json:

{
  "plugins": {
    "datasette-upload-dbs": {
      "directory": "/home/datasette/uploads"
    }
  }
}

You can use "." for the folder the server was started in, or a relative path such as "uploads" for a folder relative to that starting directory. The folder will be created on startup if it does not already exist.
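The create-if-missing behaviour amounts to a recursive, idempotent directory creation. A minimal sketch (the function name is mine, not the plugin's):

```python
from pathlib import Path

def ensure_upload_directory(directory):
    """Create the configured uploads directory (and any parents) if missing."""
    path = Path(directory)
    # exist_ok=True makes this a no-op on restart, when the folder already exists.
    path.mkdir(parents=True, exist_ok=True)
    return path
```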

Then start Datasette like this:

datasette -m metadata.yml

The plugin defaults to loading all databases in the configured directory.

You can disable this scan by adding skip_startup_scan: true to the plugin configuration:

plugins:
  datasette-upload-dbs:
    directory: /home/datasette/uploads
    skip_startup_scan: true

Usage

Only users with the upload-dbs permission will be able to upload files. The root user has this permission by default; other users can be granted access using permission plugins. See the Permissions documentation for details.
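Exactly how to grant the permission depends on your Datasette version and permission plugins. As one hedged sketch, recent Datasette versions support a permissions block in the configuration file; the actor ID simon below is a placeholder:

```yaml
plugins:
  datasette-upload-dbs:
    directory: /home/datasette/uploads
permissions:
  upload-dbs:
    id: simon
```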

To start Datasette as the root user, run this:

datasette -m metadata.yml --root

Then follow the link that is displayed on the console.

If a user has that permission they will see an "Upload database" link in the navigation menu.

This will take them to /-/upload-dbs, where they can upload database files either by selecting them or by dragging them onto the drop area.
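Before uploading, it can be worth verifying locally that a file really is a usable SQLite database, not just something with a .db extension. A minimal sketch using the standard library (the function name is mine, not part of the plugin):

```python
import sqlite3

def is_valid_sqlite(path):
    """Return True if `path` opens as SQLite and passes an integrity check."""
    try:
        # mode=ro guarantees we never modify (or accidentally create) the file.
        conn = sqlite3.connect(f"file:{path}?mode=ro", uri=True)
        try:
            row = conn.execute("PRAGMA integrity_check").fetchone()
            return row is not None and row[0] == "ok"
        finally:
            conn.close()
    except sqlite3.Error:
        # Non-database files raise DatabaseError; missing files raise OperationalError.
        return False
```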

[Animated demo: a file is dropped onto the drop area, uploads, and the browser redirects to the new database page]

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd datasette-upload-dbs
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
