
Project description

google-datacatalog-oracle-connector

Library for ingesting Oracle metadata into Google Cloud Data Catalog.


Disclaimer: This is not an officially supported Google product.

Table of Contents

1. Installation
2. Environment setup
3. Run entry point
4. Scripts inside tools
5. Developer environment
6. Metrics
7. Troubleshooting

1. Installation

Install this library in a virtualenv using pip. virtualenv is a tool to create isolated Python environments. The basic problem it addresses is one of dependencies and versions, and indirectly permissions.

With virtualenv, it's possible to install this library without needing system install permissions, and without clashing with the installed system dependencies. Make sure you use Python 3.6+.

1.1. Mac/Linux

pip3 install virtualenv
virtualenv --python python3.6 <your-env>
source <your-env>/bin/activate
<your-env>/bin/pip install google-datacatalog-oracle-connector

1.2. Windows

pip3 install virtualenv
virtualenv --python python3.6 <your-env>
<your-env>\Scripts\activate
<your-env>\Scripts\pip.exe install google-datacatalog-oracle-connector
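
Either way, you can sanity-check the installation by printing the entry point's usage text (assuming the CLI exposes the standard --help flag):

google-datacatalog-oracle-connector --help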

1.3. Install from source

1.3.1. Get the code

git clone https://github.com/GoogleCloudPlatform/datacatalog-connectors-rdbms/
cd datacatalog-connectors-rdbms/google-datacatalog-oracle-connector

1.3.2. Create and activate a virtualenv

pip3 install virtualenv
virtualenv --python python3.6 <your-env>
source <your-env>/bin/activate

1.3.3. Install the library

pip install .
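
If you plan to modify the connector code, an editable install (plain pip behavior, not specific to this project) picks up your changes without reinstalling:

pip install --editable .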

2. Environment setup

2.1. Auth credentials

2.1.1. Create a service account and grant it the following roles

  • Data Catalog Admin

2.1.2. Download a JSON key and save it as

  • <YOUR-CREDENTIALS_FILES_FOLDER>/oracle2dc-datacatalog-credentials.json

Please note that this folder and file will be required in the next steps.
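
If you prefer the command line, the two steps above can also be scripted with gcloud. This is only a sketch: the service account name oracle2dc-sa is an example, and google_cloud_project_id stands for your own project ID:

# Example only: adjust the service account name, project and key path to your environment
gcloud iam service-accounts create oracle2dc-sa --project=google_cloud_project_id
gcloud projects add-iam-policy-binding google_cloud_project_id \
  --member="serviceAccount:oracle2dc-sa@google_cloud_project_id.iam.gserviceaccount.com" \
  --role="roles/datacatalog.admin"
gcloud iam service-accounts keys create <YOUR-CREDENTIALS_FILES_FOLDER>/oracle2dc-datacatalog-credentials.json \
  --iam-account=oracle2dc-sa@google_cloud_project_id.iam.gserviceaccount.com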

2.2. Set up the Oracle driver (optional)

This step is needed when you are running the connector on a machine that does not have an Oracle client installed.

2.2.1. Set up the Oracle client for Linux (Cloud Shell)

Download the zip file: https://oracle.github.io/odpi/doc/installation.html#linux

# Unzip it
unzip instantclient-basic-linux.x64-19.5.0.0.0dbru.zip
# Point the Oracle library env var at the unzipped dir
export LD_LIBRARY_PATH=/oracle2datacatalog/bin/instantclient_19_5

2.2.2. Set up the Oracle client for Mac

Download the zip file: https://oracle.github.io/odpi/doc/installation.html#macos

# Unzip it
unzip instantclient-basic-macos.x64-19.3.0.0.0dbru.zip
# Point the Oracle library env var at the unzipped dir
export LD_LIBRARY_PATH=/oracle2datacatalog/bin/instantclient_19_3
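
To confirm the Instant Client libraries are being picked up, you can print the client version through cx_Oracle, the Oracle driver this connector relies on (assuming it was installed into your virtualenv as a dependency of this package):

# Should print a version tuple such as (19, 3, 0, 0, 0) once LD_LIBRARY_PATH is set correctly
python -c "import cx_Oracle; print(cx_Oracle.clientversion())"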

2.3. Set environment variables

Replace the values below according to your environment:

export GOOGLE_APPLICATION_CREDENTIALS=data_catalog_credentials_file

export ORACLE2DC_DATACATALOG_PROJECT_ID=google_cloud_project_id
export ORACLE2DC_DATACATALOG_LOCATION_ID=google_cloud_location_id
export ORACLE2DC_ORACLE_SERVER=oracle_server
export ORACLE2DC_ORACLE_SERVER_PORT=oracle_server_port
export ORACLE2DC_ORACLE_USERNAME=oracle_username
export ORACLE2DC_ORACLE_PASSWORD=oracle_password
export ORACLE2DC_ORACLE_DATABASE_SERVICE=oracle_db_service
export ORACLE2DC_RAW_METADATA_CSV=oracle_raw_csv # If supplied, the Oracle server credentials are ignored
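
For reference, a filled-in version might look like the following. All values are illustrative (1521 is Oracle's default listener port), so replace them with your own:

export GOOGLE_APPLICATION_CREDENTIALS=<YOUR-CREDENTIALS_FILES_FOLDER>/oracle2dc-datacatalog-credentials.json
export ORACLE2DC_DATACATALOG_PROJECT_ID=my-gcp-project
export ORACLE2DC_DATACATALOG_LOCATION_ID=us-central1
export ORACLE2DC_ORACLE_SERVER=10.0.0.15
export ORACLE2DC_ORACLE_SERVER_PORT=1521
export ORACLE2DC_ORACLE_USERNAME=system
export ORACLE2DC_ORACLE_PASSWORD=my-secret-password
export ORACLE2DC_ORACLE_DATABASE_SERVICE=ORCLPDB1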

3. Run entry point

3.1. Run Python entry point

  • Virtualenv
google-datacatalog-oracle-connector \
--datacatalog-project-id=$ORACLE2DC_DATACATALOG_PROJECT_ID \
--datacatalog-location-id=$ORACLE2DC_DATACATALOG_LOCATION_ID \
--oracle-host=$ORACLE2DC_ORACLE_SERVER \
--oracle-port=$ORACLE2DC_ORACLE_SERVER_PORT \
--oracle-user=$ORACLE2DC_ORACLE_USERNAME \
--oracle-pass=$ORACLE2DC_ORACLE_PASSWORD \
--oracle-db-service=$ORACLE2DC_ORACLE_DATABASE_SERVICE \
--raw-metadata-csv=$ORACLE2DC_RAW_METADATA_CSV

3.2. Run Docker entry point

docker build -t oracle2datacatalog .
docker run --rm --tty -v YOUR-CREDENTIALS_FILES_FOLDER:/data oracle2datacatalog \
--datacatalog-project-id=$ORACLE2DC_DATACATALOG_PROJECT_ID  \
--datacatalog-location-id=$ORACLE2DC_DATACATALOG_LOCATION_ID \
--oracle-host=$ORACLE2DC_ORACLE_SERVER \
--oracle-port=$ORACLE2DC_ORACLE_SERVER_PORT  \
--oracle-user=$ORACLE2DC_ORACLE_USERNAME \
--oracle-pass=$ORACLE2DC_ORACLE_PASSWORD \
--oracle-db-service=$ORACLE2DC_ORACLE_DATABASE_SERVICE \
--raw-metadata-csv=$ORACLE2DC_RAW_METADATA_CSV
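
The process inside the container still needs GOOGLE_APPLICATION_CREDENTIALS to point at the mounted key file. If the image does not already set it, you can pass it explicitly; the path below assumes the key file name from section 2.1.2:

docker run --rm --tty -v YOUR-CREDENTIALS_FILES_FOLDER:/data \
  -e GOOGLE_APPLICATION_CREDENTIALS=/data/oracle2dc-datacatalog-credentials.json \
  oracle2datacatalog \
  --datacatalog-project-id=$ORACLE2DC_DATACATALOG_PROJECT_ID \
  --datacatalog-location-id=$ORACLE2DC_DATACATALOG_LOCATION_ID \
  --oracle-host=$ORACLE2DC_ORACLE_SERVER \
  --oracle-port=$ORACLE2DC_ORACLE_SERVER_PORT \
  --oracle-user=$ORACLE2DC_ORACLE_USERNAME \
  --oracle-pass=$ORACLE2DC_ORACLE_PASSWORD \
  --oracle-db-service=$ORACLE2DC_ORACLE_DATABASE_SERVICE \
  --raw-metadata-csv=$ORACLE2DC_RAW_METADATA_CSV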

4. Scripts inside tools

4.1. Run clean up

# List of projects split by comma. Can be a single value without comma
export ORACLE2DC_DATACATALOG_PROJECT_IDS=my-project-1,my-project-2
# Run the clean up
python tools/cleanup_datacatalog.py --datacatalog-project-ids=$ORACLE2DC_DATACATALOG_PROJECT_IDS 

5. Developer environment

5.1. Install and run Yapf formatter

pip install --upgrade yapf

# Auto update files
yapf --in-place --recursive src tests

# Show diff
yapf --diff --recursive src tests

# Set up pre-commit hook
# From the root of your git project.
curl -o pre-commit.sh https://raw.githubusercontent.com/google/yapf/master/plugins/pre-commit.sh
chmod a+x pre-commit.sh
mv pre-commit.sh .git/hooks/pre-commit

5.2. Install and run Flake8 linter

pip install --upgrade flake8
flake8 src tests

5.3. Run Tests

python setup.py test
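
python setup.py test is deprecated in recent setuptools releases; if it fails in your environment, running the suite directly with pytest is a reasonable alternative (assuming the tests follow standard unittest/pytest discovery):

pip install pytest
pytest tests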

6. Metrics

Metrics README.md

7. Troubleshooting

If a connector execution hits a Data Catalog quota limit, an error will be raised and logged with the following detail, depending on the operation performed (READ/WRITE/SEARCH):

status = StatusCode.RESOURCE_EXHAUSTED
details = "Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute' of service 'datacatalog.googleapis.com' for consumer 'project_number:1111111111111'."
debug_error_string = 
"{"created":"@1587396969.506556000", "description":"Error received from peer ipv4:172.217.29.42:443","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute' of service 'datacatalog.googleapis.com' for consumer 'project_number:1111111111111'.","grpc_status":8}"

For more information about Data Catalog quotas, see the Data Catalog quota docs.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

google-datacatalog-oracle-connector-0.9.0.tar.gz (11.3 kB)

Built Distribution

google_datacatalog_oracle_connector-0.9.0-py2.py3-none-any.whl (10.6 kB)

File details

Details for the file google-datacatalog-oracle-connector-0.9.0.tar.gz.

File metadata

  • Download URL: google-datacatalog-oracle-connector-0.9.0.tar.gz
  • Upload date:
  • Size: 11.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.7.3 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.8.8

File hashes

Hashes for google-datacatalog-oracle-connector-0.9.0.tar.gz

Algorithm    Hash digest
SHA256       6ed193133d7c0645d04d693d1bf35cf32b43cb0319ab039963dc95811a56a5b1
MD5          c3ace9a5b9760d0f16d9cf30ad083e1d
BLAKE2b-256  9000dad59b81bebe3d57077c18b8e303409d67e944e424b4894d547393d9ea44

See more details on using hashes here.

File details

Details for the file google_datacatalog_oracle_connector-0.9.0-py2.py3-none-any.whl.

File metadata

File hashes

Hashes for google_datacatalog_oracle_connector-0.9.0-py2.py3-none-any.whl

Algorithm    Hash digest
SHA256       e78225ffec87016fd9ba4825f413d2afb08d8dbb2f9251f97cf878e37119ffe6
MD5          a36a033db03f2da3564aab75281b5ea1
BLAKE2b-256  3047abfe164ba0d5789bf9f7f60d98b85f4842d24c18bd0bbb4b50c4d5a0ce64

See more details on using hashes here.
