Project description
BigQuery-DatasetManager
BigQuery-DatasetManager is a simple file-based CLI management tool for BigQuery Datasets.
Requirements
Python
CPython 2.7, 3.4, 3.5, 3.6
Installation
$ pip install BigQuery-DatasetManager
Resource representation
The resource representation of the dataset is described in YAML format.
```yaml
name: billing
friendly_name: null
description: null
default_table_expiration_ms: null
location: US
labels:
  foo: bar
access_entries:
- role: OWNER
  entity_type: specialGroup
  entity_id: projectOwners
- role: OWNER
  entity_type: userByEmail
  entity_id: billing-export-bigquery@system.gserviceaccount.com
- role: null
  entity_type: view
  entity_id:
    datasetId: view
    projectId: your-project-id
    tableId: billing_view
```
See the official documentation of BigQuery Datasets for details of key names.
| Key name | Value | Description |
|---|---|---|
| dataset_id | str | Dataset ID. |
| friendly_name | str | Title of the dataset. |
| description | str | Description of the dataset. |
| default_table_expiration_ms | int | Default expiration time for tables in the dataset. |
| location | str | Location in which the dataset is hosted. |
| labels | map | Labels for the dataset. |
| access_entries | seq | Represents the grant of an access role to an entity. |
| access_entries.role | str | Role granted to the entity. May be null if the entity_type is `view`. |
| access_entries.entity_type | str | Type of entity being granted the role. |
| access_entries.entity_id | str/map | ID of the entity being granted the role. |
| access_entries.entity_id.datasetId | str | ID of the dataset containing the table. (Specified when entity_type is `view`.) |
| access_entries.entity_id.projectId | str | ID of the project containing the table. (Specified when entity_type is `view`.) |
| access_entries.entity_id.tableId | str | ID of the table. (Specified when entity_type is `view`.) |
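To make the schema above concrete, here is a purely illustrative Python sketch (not part of bqdm): the YAML representation expressed as the equivalent Python structure, plus a minimal check of the key names listed in the table. The `validate_dataset` helper is hypothetical.

```python
# Top-level key names taken from the table above.
KNOWN_KEYS = {
    "name", "friendly_name", "description",
    "default_table_expiration_ms", "location", "labels", "access_entries",
}

# The same dataset as the YAML example, as a Python dict.
dataset = {
    "name": "billing",
    "friendly_name": None,
    "description": None,
    "default_table_expiration_ms": None,
    "location": "US",
    "labels": {"foo": "bar"},
    "access_entries": [
        {"role": "OWNER", "entity_type": "specialGroup",
         "entity_id": "projectOwners"},
        {"role": None, "entity_type": "view",
         "entity_id": {"datasetId": "view",
                       "projectId": "your-project-id",
                       "tableId": "billing_view"}},
    ],
}

def validate_dataset(conf):
    """Reject unknown top-level keys and null roles on non-view entries."""
    unknown = set(conf) - KNOWN_KEYS
    if unknown:
        raise ValueError("unknown keys: %s" % sorted(unknown))
    for entry in conf.get("access_entries", []):
        if entry["role"] is None and entry["entity_type"] != "view":
            raise ValueError("role may be null only for view entries")
    return True
```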
Usage
Usage: bqdm [OPTIONS] COMMAND [ARGS]...
Options:
-c, --credential_file PATH Location of credential file for service accounts.
--debug Debug output management.
-h, --help Show this message and exit.
Commands:
apply Builds or changes datasets.
destroy Specify subcommand `plan` or `apply`.
export Export existing datasets into file in YAML format.
plan Generate and show an execution plan.
Export
Usage: bqdm export [OPTIONS]
Export existing datasets into file in YAML format.
Options:
-o, --output_dir TEXT Directory path to output YAML files. [required]
-h, --help Show this message and exit.
Plan
Usage: bqdm plan [OPTIONS]
Generate and show an execution plan.
Options:
-d, --conf_dir DIRECTORY Directory path where YAML files are located. [required]
--detailed_exitcode Return a detailed exit code when the command exits. When provided,
this argument changes
the exit codes and their meanings to provide
more granular information about what the
resulting plan contains:
0 = Succeeded with empty diff
1 = Error
2 = Succeeded with non-empty diff
-h, --help Show this message and exit.
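The `--detailed_exitcode` semantics can be sketched as follows. This is an illustrative Python snippet only; `compute_plan_exitcode` is a hypothetical helper, not part of bqdm's API, and the real implementation may differ.

```python
# Exit codes as documented for `bqdm plan --detailed_exitcode`.
SUCCEEDED_EMPTY = 0      # succeeded with empty diff
ERROR = 1                # error while planning
SUCCEEDED_NONEMPTY = 2   # succeeded with non-empty diff

def compute_plan_exitcode(current, desired, detailed=True):
    """Map a dataset diff to the documented exit codes."""
    try:
        # Keys whose desired value differs from the current state.
        diff = {k: v for k, v in desired.items() if current.get(k) != v}
    except AttributeError:
        return ERROR
    if not detailed:
        # Without --detailed_exitcode, a successful plan always exits 0.
        return SUCCEEDED_EMPTY
    return SUCCEEDED_NONEMPTY if diff else SUCCEEDED_EMPTY
```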
Apply
Usage: bqdm apply [OPTIONS]
Builds or changes datasets.
Options:
-d, --conf_dir DIRECTORY Directory path where YAML files are located. [required]
-h, --help Show this message and exit.
Destroy
Usage: bqdm destroy [OPTIONS] COMMAND [ARGS]...
Specify subcommand `plan` or `apply`
Options:
-h, --help Show this message and exit.
Commands:
apply Destroy managed datasets.
plan Generate and show an execution plan for datasets destruction.
Destroy plan
Usage: bqdm destroy plan [OPTIONS]
Generate and show an execution plan for datasets destruction.
Options:
-d, --conf_dir DIRECTORY Directory path where YAML files are located. [required]
--detailed_exitcode Return a detailed exit code when the command exits. When provided,
this argument changes
the exit codes and their meanings to provide
more granular information about what the
resulting plan contains:
0 = Succeeded with empty diff
1 = Error
2 = Succeeded with non-empty diff
-h, --help Show this message and exit.
Destroy apply
Usage: bqdm destroy apply [OPTIONS]
Destroy managed datasets.
Options:
-d, --conf_dir DIRECTORY Directory path where YAML files are located. [required]
-h, --help Show this message and exit.
Authentication
See the authentication section in the official documentation of google-cloud-python.
If you’re running in Compute Engine or App Engine, authentication should “just work”.
If you’re developing locally, the easiest way to authenticate is using the Google Cloud SDK:
$ gcloud auth application-default login
Note that this command generates credentials for client libraries. To authenticate the CLI itself, use:
$ gcloud auth login
Previously, gcloud auth login was used for both use cases. If your gcloud installation does not support the new command, please update it:
$ gcloud components update
If you’re running your application elsewhere, you should download a service account JSON keyfile and point to it using an environment variable:
$ export GOOGLE_APPLICATION_CREDENTIALS="/path/to/keyfile.json"
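As a rough illustration of how that environment variable is resolved: the real discovery logic lives in the google-auth library, so the helper below is hypothetical, but it captures the convention described above.

```python
import os
from pathlib import Path

def resolve_keyfile(env=os.environ):
    """Resolve a service-account keyfile per the
    GOOGLE_APPLICATION_CREDENTIALS convention (illustrative only)."""
    path = env.get("GOOGLE_APPLICATION_CREDENTIALS")
    if not path:
        # No explicit keyfile: fall back to gcloud / metadata-server auth.
        return None
    keyfile = Path(path)
    if not keyfile.is_file():
        raise FileNotFoundError("keyfile not found: %s" % keyfile)
    return keyfile
```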
Testing
Run tests
$ py.test
Run tests against multiple Python versions
$ pip install tox
$ pyenv local 2.7.13 3.4.6 3.5.3 3.6.1
$ tox
TODO
Manage table resources
Download files
Source Distribution
Built Distribution
Hashes for BigQuery-DatasetManager-0.0.2.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4a75e91cbc871033ce44d4b103a3fab372fe339867ab5891d9084781feff3563 |
| MD5 | 36b587032e60ae3bf1e4e6563cd150cc |
| BLAKE2b-256 | 2daa852c0ee9d11f3de0063594dfeba9f74e4457d74d4801a54ad711475e607b |
Hashes for BigQuery_DatasetManager-0.0.2-py2.py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | 750244179e6abff90e939b7bf18648584edecc0ad3ac081027ac4a0bb28db9bd |
| MD5 | 15dfa49c8597eea4a5c63bcbaea6dfe5 |
| BLAKE2b-256 | e09b38f12d0a76e7344679fac518603a3a275c6059c840cad1f51cf9c45bf00e |