airtable-export
Export Airtable data to files on disk
Installation
Install this tool using pip:

    $ pip install airtable-export
Usage
You will need to know the following information:
- Your Airtable base ID - this is a string starting with app...
- Your Airtable API key - this is a string starting with key...
- The names of each of the tables that you wish to export
You can export all of your data to a folder called export/ by running the following:

    airtable-export export base_id table1 table2 --key=key

This example would create two files: export/table1.yml and export/table2.yml.
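As a rough sketch of the YAML shape - the name and size fields here are hypothetical, but each exported record should carry its Airtable record ID and creation time alongside its fields:

    - airtable_createdTime: '2021-01-01T00:00:00.000Z'
      airtable_id: rec1234567890abcd
      name: An example record
      size: 441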
Rather than passing the API key using the --key option, you can set it as an environment variable called AIRTABLE_KEY.
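For example, using a hypothetical key and base ID:

    $ export AIRTABLE_KEY=keyXXXXXXXXXXXXXX
    $ airtable-export export appXXXXXXXXXXXXXX table1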
Export options
By default the tool exports your data as YAML.
You can also export as JSON or as newline-delimited JSON using the --json or --ndjson options:

    airtable-export export base_id table1 table2 --key=key --ndjson
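With --ndjson, each record becomes one JSON object per line in export/table1.ndjson, shaped roughly like this (field name hypothetical):

    {"airtable_id": "rec1234567890abcd", "airtable_createdTime": "2021-01-01T00:00:00.000Z", "name": "An example record"}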
You can pass multiple format options at once. This command will create a .json, .yml and .ndjson file for each exported table:

    airtable-export export base_id table1 table2 \
        --key=key --ndjson --yaml --json
SQLite database export
You can export tables to a SQLite database file using the --sqlite database.db option:

    airtable-export export base_id table1 table2 \
        --key=key --sqlite database.db
This can be combined with other format options. If you only specify --sqlite, the export directory argument will be ignored.
The SQLite database will have a table created for each table you export. Those tables will have a primary key column called airtable_id.
If you run this command against an existing SQLite database, records with matching primary keys will be over-written by new records from the export.
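Once exported, the data can be inspected with the standard sqlite3 shell - the table1 name here assumes the example command above:

    $ sqlite3 database.db "select * from table1 limit 5"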
Request options
By default the tool uses python-httpx's default configuration.
You can override the user-agent using the --user-agent option:

    airtable-export export base_id table1 table2 --key=key --user-agent "Airtable Export Robot"
You can override the timeout during a network read operation using the --http-read-timeout option. If not set, this defaults to 5s:

    airtable-export export base_id table1 table2 --key=key --http-read-timeout 60
Running this using GitHub Actions
GitHub Actions is GitHub's workflow automation product. You can use it to run airtable-export in order to back up your Airtable data to a GitHub repository. Doing this gives you a visible commit history of changes you make to your Airtable data - like this one.
To run this for your own Airtable database you'll first need to add the following secrets to your GitHub repository:
- AIRTABLE_BASE_ID - The base ID, a string beginning `app...`
- AIRTABLE_KEY - Your Airtable API key
- AIRTABLE_TABLES - A space separated list of the Airtable tables that you want to backup. If any of these contain spaces you will need to enclose them in single quotes, e.g. 'My table with spaces in the name' OtherTableWithNoSpaces
Once you have set those secrets, add the following as a file called .github/workflows/backup-airtable.yml:
    name: Backup Airtable
    on:
      workflow_dispatch:
      schedule:
      - cron: '32 0 * * *'
    jobs:
      build:
        runs-on: ubuntu-latest
        steps:
        - name: Check out repo
          uses: actions/checkout@v2
        - name: Set up Python
          uses: actions/setup-python@v2
          with:
            python-version: 3.8
        - uses: actions/cache@v2
          name: Configure pip caching
          with:
            path: ~/.cache/pip
            key: ${{ runner.os }}-pip-
            restore-keys: |
              ${{ runner.os }}-pip-
        - name: Install airtable-export
          run: |
            pip install airtable-export
        - name: Backup Airtable to backups/
          env:
            AIRTABLE_BASE_ID: ${{ secrets.AIRTABLE_BASE_ID }}
            AIRTABLE_KEY: ${{ secrets.AIRTABLE_KEY }}
            AIRTABLE_TABLES: ${{ secrets.AIRTABLE_TABLES }}
          run: |-
            airtable-export backups $AIRTABLE_BASE_ID $AIRTABLE_TABLES -v
        - name: Commit and push if it changed
          run: |-
            git config user.name "Automated"
            git config user.email "actions@users.noreply.github.com"
            git add -A
            timestamp=$(date -u)
            git commit -m "Latest data: ${timestamp}" || exit 0
            git push
This will run once a day (at 32 minutes past midnight UTC) and will also run if you manually click the "Run workflow" button; see GitHub Actions: Manual triggers with workflow_dispatch.
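If you use the GitHub CLI, you can also fire the workflow_dispatch trigger from a terminal in a checkout of the repository:

    $ gh workflow run backup-airtable.yml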
Development
To contribute to this tool, first check out the code. Then create a new virtual environment:

    cd airtable-export
    python -m venv venv
    source venv/bin/activate

Or if you are using pipenv:

    pipenv shell

Now install the dependencies and test dependencies:

    pip install -e '.[test]'
To run the tests:

    pytest
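pytest's standard selection options work here too - for example, running a single test by keyword (the keyword here is hypothetical):

    $ pytest -k test_airtable_export -v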
File details
Details for the file airtable-export-0.7.1.tar.gz.
File metadata
- Download URL: airtable-export-0.7.1.tar.gz
- Upload date:
- Size: 5.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | 1dd3e6434d97c86eac9bd1c95b33ee8c29b1f58c1f1684a4f9ca541533b9c4c1
MD5 | 3d692601e7abc046231eee107de6b562
BLAKE2b-256 | d44f897f00a5cc50baccd793027554b5e5b094109e8e69167da22a13316fa34a
File details
Details for the file airtable_export-0.7.1-py3-none-any.whl.
File metadata
- Download URL: airtable_export-0.7.1-py3-none-any.whl
- Upload date:
- Size: 9.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | 803f2578c6c689ab07c758d7d4599f77e28630037bc7318471ba688565c347db
MD5 | 5bc5e1cb712e9c1c6aa364cf4d689711
BLAKE2b-256 | 945e014144e1f70bd7e6d72c179b3fadb3f5e2bda10393eb175bc3e54b036e5d