Library for ingesting RDBMS CSV metadata into Google Cloud Data Catalog

Project description

google-datacatalog-rdbmscsv-connector

Library for ingesting a CSV metadata extraction from an RDBMS into Google Cloud Data Catalog.

Disclaimer: This is not an officially supported Google product.

Table of Contents

  • 1. Installation
  • 2. Environment setup
  • 3. Run entry point
  • 4. Scripts inside tools
  • 5. Developer environment
  • 6. Troubleshooting

1. Installation

Install this library in a virtualenv using pip. virtualenv is a tool to create isolated Python environments. The basic problem it addresses is one of dependencies and versions, and indirectly permissions.

With virtualenv, it's possible to install this library without needing system install permissions, and without clashing with the installed system dependencies. Make sure you use Python 3.6+.

1.1. Mac/Linux

pip3 install virtualenv
virtualenv --python python3.6 <your-env>
source <your-env>/bin/activate
<your-env>/bin/pip install google-datacatalog-rdbmscsv-connector
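
After installation, you can optionally confirm that the console script was registered; this assumes the CLI exposes the standard --help flag:

# Optional: print the connector's usage
google-datacatalog-rdbmscsv-connector --help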

1.2. Windows

pip3 install virtualenv
virtualenv --python python3.6 <your-env>
<your-env>\Scripts\activate
<your-env>\Scripts\pip.exe install google-datacatalog-rdbmscsv-connector

1.3. Install from source

1.3.1. Get the code

git clone https://github.com/GoogleCloudPlatform/datacatalog-connectors-rdbms/
cd datacatalog-connectors-rdbms/google-datacatalog-rdbmscsv-connector

1.3.2. Create and activate a virtualenv

pip3 install virtualenv
virtualenv --python python3.6 <your-env>
source <your-env>/bin/activate

1.3.3. Install the library

pip install .
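
If you plan to change the connector's code, an editable install is a common alternative so that local edits take effect without reinstalling (a standard pip feature, not specific to this project):

# Editable (development) install
pip install --editable .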

2. Environment setup

2.1. Auth credentials

2.1.1. Create a service account and grant it the roles below

  • Data Catalog Admin

2.1.2. Download a JSON key and save it as

  • <YOUR-CREDENTIALS_FILES_FOLDER>/rdbmscsv2dc-credentials.json

Please note that this folder and file will be required in the next steps.
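
As a sketch, the same setup can be done with the gcloud CLI, assuming it is installed and authenticated. The service account name rdbmscsv2dc and the placeholders are illustrative:

# Create the service account (name is illustrative)
gcloud iam service-accounts create rdbmscsv2dc --project=<your-project-id>

# Grant it the Data Catalog Admin role
gcloud projects add-iam-policy-binding <your-project-id> \
  --member="serviceAccount:rdbmscsv2dc@<your-project-id>.iam.gserviceaccount.com" \
  --role="roles/datacatalog.admin"

# Download a JSON key to the expected location
gcloud iam service-accounts keys create <YOUR-CREDENTIALS_FILES_FOLDER>/rdbmscsv2dc-credentials.json \
  --iam-account=rdbmscsv2dc@<your-project-id>.iam.gserviceaccount.com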

2.2. Set environment variables

Replace the values below according to your environment:

export GOOGLE_APPLICATION_CREDENTIALS=data_catalog_credentials_file

export RDBMS2DC_DATACATALOG_PROJECT_ID=google_cloud_project_id
export RDBMS2DC_DATACATALOG_LOCATION_ID=google_cloud_location_id
export RDBMS2DC_SERVER=rdbms_server
export RDBMS2DC_TYPE=oracle
export RDBMS2DC_RAW_METADATA_CSV=rdbms_raw_csv
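
For example, with illustrative values filled in (the project, location, host, and file path below are placeholders, not defaults):

export GOOGLE_APPLICATION_CREDENTIALS=<YOUR-CREDENTIALS_FILES_FOLDER>/rdbmscsv2dc-credentials.json

export RDBMS2DC_DATACATALOG_PROJECT_ID=my-project
export RDBMS2DC_DATACATALOG_LOCATION_ID=us-central1
export RDBMS2DC_SERVER=my-oracle-host.example.com
export RDBMS2DC_TYPE=oracle
export RDBMS2DC_RAW_METADATA_CSV=/path/to/raw_metadata.csv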

3. Run entry point

3.1. Run Python entry point

  • Virtualenv
google-datacatalog-rdbmscsv-connector \
--datacatalog-project-id=$RDBMS2DC_DATACATALOG_PROJECT_ID \
--datacatalog-location-id=$RDBMS2DC_DATACATALOG_LOCATION_ID \
--rdbms-host=$RDBMS2DC_SERVER \
--rdbms-type=$RDBMS2DC_TYPE \
--raw-metadata-csv=$RDBMS2DC_RAW_METADATA_CSV

3.2. Run Docker entry point

docker build -t rdbmscsv2datacatalog .
docker run --rm --tty -v YOUR-CREDENTIALS_FILES_FOLDER:/data rdbmscsv2datacatalog \
--datacatalog-project-id=$RDBMS2DC_DATACATALOG_PROJECT_ID  \
--datacatalog-location-id=$RDBMS2DC_DATACATALOG_LOCATION_ID \
--rdbms-host=$RDBMS2DC_SERVER \
--rdbms-type=$RDBMS2DC_TYPE \
--raw-metadata-csv=$RDBMS2DC_RAW_METADATA_CSV
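
Depending on how the image is built, the connector inside the container may also need to know where the mounted key file lives. One option (illustrative, assuming the key was saved as rdbmscsv2dc-credentials.json in the mounted folder) is to pass the standard GOOGLE_APPLICATION_CREDENTIALS variable explicitly:

docker run --rm --tty -v YOUR-CREDENTIALS_FILES_FOLDER:/data \
-e GOOGLE_APPLICATION_CREDENTIALS=/data/rdbmscsv2dc-credentials.json \
rdbmscsv2datacatalog \
--datacatalog-project-id=$RDBMS2DC_DATACATALOG_PROJECT_ID \
--datacatalog-location-id=$RDBMS2DC_DATACATALOG_LOCATION_ID \
--rdbms-host=$RDBMS2DC_SERVER \
--rdbms-type=$RDBMS2DC_TYPE \
--raw-metadata-csv=$RDBMS2DC_RAW_METADATA_CSV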

4. Scripts inside tools

4.1. Run clean up

# Comma-separated list of projects. Can also be a single value with no comma
export RDBMS_DATACATALOG_PROJECT_IDS=my-project-1,my-project-2
export RDBMS2DC_TYPE=oracle
# Run the clean up
python tools/cleanup_datacatalog.py \
--datacatalog-project-ids=$RDBMS_DATACATALOG_PROJECT_IDS \
--rdbms-type=$RDBMS2DC_TYPE

5. Developer environment

5.1. Install and run Yapf formatter

pip install --upgrade yapf

# Auto update files
yapf --in-place --recursive src tests

# Show diff
yapf --diff --recursive src tests

# Set up pre-commit hook
# From the root of your git project.
curl -o pre-commit.sh https://raw.githubusercontent.com/google/yapf/master/plugins/pre-commit.sh
chmod a+x pre-commit.sh
mv pre-commit.sh .git/hooks/pre-commit

5.2. Install and run Flake8 linter

pip install --upgrade flake8
flake8 src tests

5.3. Run Tests

python setup.py test
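
Note that python setup.py test is deprecated in recent setuptools releases. Assuming the test suite is discoverable by pytest (an assumption, not something this README states), it can also be run directly:

# Alternative test run, assuming pytest can discover the tests
pip install pytest
python -m pytest tests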

6. Troubleshooting

If a connector execution hits a Data Catalog quota limit, an error is raised and logged with the following details, depending on the operation performed (READ/WRITE/SEARCH):

status = StatusCode.RESOURCE_EXHAUSTED
details = "Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute' of service 'datacatalog.googleapis.com' for consumer 'project_number:1111111111111'."
debug_error_string = 
"{"created":"@1587396969.506556000", "description":"Error received from peer ipv4:172.217.29.42:443","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute' of service 'datacatalog.googleapis.com' for consumer 'project_number:1111111111111'.","grpc_status":8}"

For more information about Data Catalog quotas, see the Data Catalog quota docs.
