Datacatalog Util
A Python package to manage Google Cloud Data Catalog helper commands and scripts.
Disclaimer: This is not an officially supported Google product.
Commands List
Group | Command | Description | Documentation Link | Code Repo |
---|---|---|---|---|
tags | create | Load Tags from CSV file. | GO | GO |
tags | delete | Delete Tags from CSV file. | GO | GO |
tags | export | Export Tags to CSV file. | GO | GO |
tag-templates | create | Load Templates from CSV file. | GO | GO |
tag-templates | delete | Delete Templates from CSV file. | GO | GO |
tag-templates | export | Export Templates to CSV file. | GO | GO |
filesets | create | Create GCS filesets from CSV file. | GO | GO |
filesets | enrich | Enrich GCS filesets with Tags. | GO | GO |
filesets | clean-up-templates-and-tags | Clean up the Fileset Templates and their Tags. | GO | GO |
filesets | delete | Delete GCS filesets from CSV file. | GO | GO |
filesets | export | Export Filesets to CSV file. | GO | GO |
object-storage | create-entries | Create Entries for each Object Storage File. | GO | GO |
object-storage | delete-entries | Delete Entries that belong to the Object Storage Files. | GO | GO |
Execute Tutorial in Cloud Shell
Table of Contents
- 0. Executing in Cloud Shell from PyPI
- 1. Environment setup for local build
- 2. Load Tags from CSV file
- 3. Export Tags to CSV file
- 4. Load Templates from CSV file
- 5. Export Templates to CSV file
- 6. Filesets Commands
- 7. Export Filesets to CSV file
- 8. DataCatalog Object Storage commands
- 9. Data Catalog Templates Examples
0. Executing in Cloud Shell from PyPI
If you want to execute this script directly in Cloud Shell, install it from PyPI:
# Set your SERVICE ACCOUNT, for instructions go to 1.3. Auth credentials
# This name is just a suggestion, feel free to name it following your naming conventions
export GOOGLE_APPLICATION_CREDENTIALS=~/credentials/datacatalog-util-sa.json
# Install datacatalog-util
pip3 install --upgrade datacatalog-util --user
# Add to your PATH
export PATH=~/.local/bin:$PATH
# Look for available commands
datacatalog-util --help
1. Environment setup for local build
1.1. Python + virtualenv
Using virtualenv is optional, but strongly recommended unless you use Docker.
1.1.1. Install Python 3.6+
1.1.2. Get the source code
git clone https://github.com/mesmacosta/datacatalog-util
cd ./datacatalog-util
All paths starting with ./ in the next steps are relative to the datacatalog-util folder.
1.1.3. Create and activate an isolated Python environment
pip install --upgrade virtualenv
python3 -m virtualenv --python python3 env
source ./env/bin/activate
1.1.4. Install the package
pip install --upgrade .
1.2. Docker
Docker may be used as an alternative to run the script. In this case, please disregard the Virtualenv setup instructions.
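For reference only, the build-and-run pattern used later in this document (see section 2.2) looks like the following sketch; CREDENTIALS_FILE_FOLDER and CSV_FILE_FOLDER are placeholders for your local folders:

```bash
# Build the image from the repository root (uses the repository's Dockerfile)
docker build --rm --tag datacatalog-util .

# Run any command by mounting your credentials and data folders into the container
docker run --rm --tty \
  --volume CREDENTIALS_FILE_FOLDER:/credentials --volume CSV_FILE_FOLDER:/data \
  datacatalog-util --help
```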
1.3. Auth credentials
1.3.1. Create a service account and grant it the roles below (see the sketch after the list)
- Data Catalog Admin
- Storage Admin
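A minimal, hedged gcloud sketch for this step; the project ID and service account name are placeholders, so adjust them to your conventions:

```bash
# Placeholders: replace with your project ID and preferred service account name
PROJECT=my-project
SA_NAME=datacatalog-util-sa

# Create the service account
gcloud iam service-accounts create "$SA_NAME" --project "$PROJECT"

# Grant the two roles listed above
gcloud projects add-iam-policy-binding "$PROJECT" \
  --member "serviceAccount:${SA_NAME}@${PROJECT}.iam.gserviceaccount.com" \
  --role roles/datacatalog.admin
gcloud projects add-iam-policy-binding "$PROJECT" \
  --member "serviceAccount:${SA_NAME}@${PROJECT}.iam.gserviceaccount.com" \
  --role roles/storage.admin

# Download a JSON key to the path suggested in 1.3.2
mkdir -p ./credentials
gcloud iam service-accounts keys create ./credentials/datacatalog-util-sa.json \
  --iam-account "${SA_NAME}@${PROJECT}.iam.gserviceaccount.com"
```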
1.3.2. Download a JSON key and save it as ./credentials/datacatalog-util-sa.json
This name is just a suggestion, feel free to name it following your naming conventions.
1.3.3. Set the environment variables
This step may be skipped if you're using Docker.
export GOOGLE_APPLICATION_CREDENTIALS=~/credentials/datacatalog-util-sa.json
2. Load Tags from CSV file
2.1. Create a CSV file representing the Tags to be created
Tags are composed of as many lines as required to represent all of their fields. The columns are described as follows:
Column | Description | Mandatory |
---|---|---|
linked_resource | Full name of the asset the Entry refers to. | Y |
template_name | Resource name of the Tag Template for the Tag. | Y |
column | Attach Tags to a column belonging to the Entry schema. | N |
field_id | Id of the Tag field. | Y |
field_value | Value of the Tag field. | Y |
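A hypothetical minimal input, assuming a header row with the columns above; the resource, template, and field names below are placeholders, and leaving column empty presumably attaches the Tag at the Entry level:

```bash
# Hypothetical sketch only -- check sample-input/create-tags for the authoritative format
cat > tags.csv << 'EOF'
linked_resource,template_name,column,field_id,field_value
//bigquery.googleapis.com/projects/my-project/datasets/my_dataset/tables/my_table,projects/my-project/locations/us-central1/tagTemplates/my_template,,owner,data-team@example.com
//bigquery.googleapis.com/projects/my-project/datasets/my_dataset/tables/my_table,projects/my-project/locations/us-central1/tagTemplates/my_template,email,is_pii,True
EOF
```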
TIPS
- sample-input/create-tags for reference;
- Data Catalog Sample Tags (Google Sheets) may help to create/export the CSV.
2.1.1 Execute Tutorial in Cloud Shell
2.2. Run the datacatalog-util script - Create the Tags
- Python + virtualenv
datacatalog-util tags create --csv-file CSV_FILE_PATH
- Docker
docker build --rm --tag datacatalog-util .
docker run --rm --tty \
--volume CREDENTIALS_FILE_FOLDER:/credentials --volume CSV_FILE_FOLDER:/data \
datacatalog-util tags create --csv-file /data/CSV_FILE_NAME
2.3. Run the datacatalog-util script - Delete the Tags
- Python + virtualenv
datacatalog-util tags delete --csv-file CSV_FILE_PATH
3. Export Tags to CSV file
3.1. A list of CSV files, each representing one Template, will be created.
A summary file with stats about each template will also be created in the same directory.
The columns for the summary file are described as follows:
Column | Description |
---|---|
template_name | Resource name of the Tag Template for the Tag. |
tags_count | Number of Tags found for the template. |
tagged_entries_count | Number of Entries tagged with the template. |
tagged_columns_count | Number of columns tagged with the template. |
tag_string_fields_count | Number of String fields used on Tags of the template. |
tag_bool_fields_count | Number of Bool fields used on Tags of the template. |
tag_double_fields_count | Number of Double fields used on Tags of the template. |
tag_timestamp_fields_count | Number of Timestamp fields used on Tags of the template. |
tag_enum_fields_count | Number of Enum fields used on Tags of the template. |
The columns for each template file are described as follows:
Column | Description |
---|---|
relative_resource_name | Full resource name of the Entry. |
linked_resource | Full name of the asset the Entry refers to. |
template_name | Resource name of the Tag Template for the Tag. |
tag_name | Resource name of the Tag. |
column | Attach Tags to a column belonging to the Entry schema. |
field_id | Id of the Tag field. |
field_type | Type of the Tag field. |
field_value | Value of the Tag field. |
3.1.1 Execute Tutorial in Cloud Shell
3.2. Run tags export
- Python + virtualenv
datacatalog-util tags export --project-ids my-project --dir-path DIR_PATH
3.3 Run tags export filtering Tag Templates
- Python + virtualenv
datacatalog-util tags export --project-ids my-project \
--dir-path DIR_PATH \
--tag-templates-names projects/my-project/locations/us-central1/tagTemplates/my-template,\
projects/my-project/locations/us-central1/tagTemplates/my-template-2
4. Load Templates from CSV file
4.1. Create a CSV file representing the Templates to be created
Templates are composed of as many lines as required to represent all of their fields. The columns are described as follows:
Column | Description | Mandatory |
---|---|---|
template_name | Resource name of the Tag Template for the Tag. | Y |
display_name | Display name of the Tag Template. | Y |
field_id | Id of the Tag Template field. | Y |
field_display_name | Display name of the Tag Template field. | Y |
field_type | Type of the Tag Template field. | Y |
enum_values | Values for the Enum field. | N |
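A hypothetical minimal input, assuming a header row with the columns above; the template and field names are placeholders, and the enum value separator shown is only illustrative:

```bash
# Hypothetical sketch only -- check sample-input/create-tag-templates for the authoritative format
cat > templates.csv << 'EOF'
template_name,display_name,field_id,field_display_name,field_type,enum_values
projects/my-project/locations/us-central1/tagTemplates/my_template,My Template,owner,Owner,STRING,
projects/my-project/locations/us-central1/tagTemplates/my_template,My Template,classification,Classification,ENUM,PUBLIC|INTERNAL|CONFIDENTIAL
EOF
```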
4.1.1 Execute Tutorial in Cloud Shell
4.2. Run the datacatalog-util script - Create the Tag Templates
- Python + virtualenv
datacatalog-util tag-templates create --csv-file CSV_FILE_PATH
4.3. Run the datacatalog-util script - Delete the Tag Templates
- Python + virtualenv
datacatalog-util tag-templates delete --csv-file CSV_FILE_PATH
TIPS
- sample-input/create-tag-templates for reference;
5. Export Templates to CSV file
5.1. A CSV file representing the Templates will be created
Templates are composed of as many lines as required to represent all of their fields. The columns are described as follows:
Column | Description |
---|---|
template_name | Resource name of the Tag Template for the Tag. |
display_name | Display name of the Tag Template. |
field_id | Id of the Tag Template field. |
field_display_name | Display name of the Tag Template field. |
field_type | Type of the Tag Template field. |
enum_values | Values for the Enum field. |
5.1.1 Execute Tutorial in Cloud Shell
5.2. Run the datacatalog-util script
- Python + virtualenv
datacatalog-util tag-templates export --project-ids my-project --file-path CSV_FILE_PATH
6. Filesets Commands
6.1. Create a CSV file representing the Entry Groups and Entries to be created
Filesets are composed of as many lines as required to represent all of their fields. The columns are described as follows:
Column | Description | Mandatory |
---|---|---|
entry_group_name | Entry Group Name. | Y |
entry_group_display_name | Entry Group Display Name. | N |
entry_group_description | Entry Group Description. | N |
entry_id | Entry ID. | Y |
entry_display_name | Entry Display Name. | Y |
entry_description | Entry Description. | N |
entry_file_patterns | Entry File Patterns. | Y |
schema_column_name | Schema column name. | N |
schema_column_type | Schema column type. | N |
schema_column_description | Schema column description. | N |
schema_column_mode | Schema column mode. | N |
Please note that schema_column_type is an open string field and accepts anything. If you want to use your fileset with Dataflow SQL, follow the data types in the official docs.
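A hypothetical minimal input, assuming a header row with the columns above and one line per schema column; all names, bucket patterns, and types below are placeholders:

```bash
# Hypothetical sketch only -- check sample-input/create-filesets for the authoritative format
cat > filesets.csv << 'EOF'
entry_group_name,entry_group_display_name,entry_group_description,entry_id,entry_display_name,entry_description,entry_file_patterns,schema_column_name,schema_column_type,schema_column_description,schema_column_mode
projects/my-project/locations/us-central1/entryGroups/my_entry_group,My Entry Group,Landing zone filesets,my_fileset,My Fileset,Daily CSV drops,gs://my_bucket/*.csv,name,STRING,Customer name,REQUIRED
projects/my-project/locations/us-central1/entryGroups/my_entry_group,My Entry Group,Landing zone filesets,my_fileset,My Fileset,Daily CSV drops,gs://my_bucket/*.csv,age,INT64,Customer age,NULLABLE
EOF
```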
6.1.1 Execute Tutorial in Cloud Shell
6.2. Create the Filesets Entry Groups and Entries
- Python + virtualenv
datacatalog-util filesets create --csv-file CSV_FILE_PATH
TIPS
- sample-input/create-filesets for reference;
- If you want to create filesets without schema: sample-input/create-filesets/fileset-entry-opt-1-all-metadata-no-schema.csv for reference;
6.2.1 Create the Filesets Entry Groups and Entries - with DataFlow SQL types validation
- Python + virtualenv
datacatalog-util filesets create --csv-file CSV_FILE_PATH --validate-dataflow-sql-types
6.3. Enrich GCS Filesets with Tags
Users are able to choose the Tag fields from the list provided at Tags
datacatalog-util filesets enrich --project-id my-project
6.3.1 Enrich all fileset Entries using a Tag Template from a different Project (a good way to reuse the same Template)
If you are using a different Project, make sure the Service Account has the following permissions on that Project or that Template:
- Data Catalog TagTemplate Creator
- Data Catalog TagTemplate User
datacatalog-util filesets \
--project-id my_project \
enrich --tag-template-name projects/my_different_project/locations/us-central1/tagTemplates/fileset_enricher_findings
6.3.2 Execute Fileset Enricher Tutorial in Cloud Shell
6.4. Clean up Templates and Tags
Cleans up the Template and Tags from the Fileset Entries; running the main command will recreate them.
datacatalog-util filesets clean-up-templates-and-tags --project-id my-project
6.5. Delete the Filesets Entry Groups and Entries
- Python + virtualenv
datacatalog-util filesets delete --csv-file CSV_FILE_PATH
7. Export Filesets to CSV file
7.1. A CSV file representing the Filesets will be created
Filesets are composed of as many lines as required to represent all of their fields. The columns are described as follows:
Column | Description | Mandatory |
---|---|---|
entry_group_name | Entry Group Name. | Y |
entry_group_display_name | Entry Group Display Name. | Y |
entry_group_description | Entry Group Description. | Y |
entry_id | Entry ID. | Y |
entry_display_name | Entry Display Name. | Y |
entry_description | Entry Description. | Y |
entry_file_patterns | Entry File Patterns. | Y |
schema_column_name | Schema column name. | N |
schema_column_type | Schema column type. | N |
schema_column_description | Schema column description. | N |
schema_column_mode | Schema column mode. | N |
7.1.1 Execute Tutorial in Cloud Shell
7.2. Run the datacatalog-util script
- Python + virtualenv
datacatalog-util filesets export --project-ids my-project --file-path CSV_FILE_PATH
8. DataCatalog Object Storage commands
8.1 Execute Tutorial in Cloud Shell
8.2. Create DataCatalog entries based on object storage files
datacatalog-util \
object-storage sync-entries --type cloud_storage \
--project-id my_project \
--entry-group-name projects/my_project/locations/us-central1/entryGroups/my_entry_group \
--bucket-prefix my_bucket
8.3. Delete object storage entries on entry group
datacatalog-util \
object-storage delete-entries --type cloud_storage \
--project-id my_project \
--entry-group-name projects/my_project/locations/us-central1/entryGroups/my_entry_group
9. Data Catalog Templates Examples
History
0.1.0 (2020-04-08)
- First release on PyPI.
0.1.1 (2020-04-09)
- Update description.
0.2.0 (2020-04-09)
- Improved groups command line.
0.3.0 (2020-04-15)
- Added datacatalog-tag-template-processor library.
0.4.0 (2020-04-15)
- Added datacatalog-tag-template-exporter library.
0.5.0 (2020-04-15)
- Added datacatalog-tag-exporter library.
0.6.0 (2020-04-28)
- Added datacatalog-fileset-processor library.
0.7.0 (2020-04-29)
- Added new tag delete command.
0.8.0 (2020-04-29)
- Added new fileset export command.
0.9.0 (2020-04-29)
- Added new object storage commands.
File details
Details for the file datacatalog-util-0.11.6.tar.gz.
File metadata
- Download URL: datacatalog-util-0.11.6.tar.gz
- Upload date:
- Size: 281.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/39.0.1 requests-toolbelt/0.9.1 tqdm/4.50.0 CPython/3.7.0
File hashes
Algorithm | Hash digest |
---|---|
SHA256 | c1653f97fcf579e551212fe6989800973cf2c2054322d0e4bd163b1f457fbaa3 |
MD5 | 1b4e833c61792a442f0bcf26c69ec2fa |
BLAKE2b-256 | 304486c4447d57d54718c13f87350018a0ff8d90e58e6828c29e2ec2eba272eb |
File details
Details for the file datacatalog_util-0.11.6-py2.py3-none-any.whl.
File metadata
- Download URL: datacatalog_util-0.11.6-py2.py3-none-any.whl
- Upload date:
- Size: 10.2 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/39.0.1 requests-toolbelt/0.9.1 tqdm/4.50.0 CPython/3.7.0
File hashes
Algorithm | Hash digest |
---|---|
SHA256 | 2a75dae1dbd5f16b6c83e41022fcc07e10417078f9f7ec07d009f98db49b7664 |
MD5 | 8065a79578078d6bc861b20d869b04ed |
BLAKE2b-256 | 0e6481a314f5f5575e74aa84a55428085990accc30d1c81f615d4d2b9a6852d1 |