

google-datacatalog-looker-connector

Package for ingesting Looker metadata into Google Cloud Data Catalog, currently supporting the following asset types:

  • Folder
  • Look
  • Dashboard
  • Dashboard Element (aka Tile)
  • Query


Disclaimer: This is not an officially supported Google product.

Table of Contents

  • 1. Installation
  • 2. Environment setup
  • 3. Run entry point
  • 4. Developer environment
  • 5. Troubleshooting

1. Installation

Install this library in a virtualenv using pip. virtualenv is a tool to create isolated Python environments. The basic problem it addresses is one of dependencies and versions, and indirectly permissions.

With virtualenv, it's possible to install this library without needing system install permissions, and without clashing with the installed system dependencies. Make sure you use Python 3.7+.

1.1. Mac/Linux

pip3 install virtualenv
virtualenv --python python3.7 <your-env>
source <your-env>/bin/activate
<your-env>/bin/pip install google-datacatalog-looker-connector

1.2. Windows

pip3 install virtualenv
virtualenv --python python3.7 <your-env>
<your-env>\Scripts\activate
<your-env>\Scripts\pip.exe install google-datacatalog-looker-connector

1.3. Install from source

1.3.1. Get the code

git clone https://github.com/GoogleCloudPlatform/datacatalog-connectors-bi/
cd datacatalog-connectors-bi/google-datacatalog-looker-connector

1.3.2. Create and activate a virtualenv

pip3 install virtualenv
virtualenv --python python3.7 <your-env>
source <your-env>/bin/activate

1.3.3. Install the library

pip install .

2. Environment setup

2.1. Auth credentials

2.1.1. Create a GCP Service Account and grant it the following role

  • Data Catalog Admin
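
If you prefer the command line to the Cloud Console, a minimal sketch using gcloud is shown below. The service account name looker2dc-sa is an illustrative placeholder, not something the connector requires:

# Hypothetical service account name; any name works.
gcloud iam service-accounts create looker2dc-sa \
  --project <YOUR-DATACATALOG-PROJECT-ID>

# Grant the Data Catalog Admin role.
gcloud projects add-iam-policy-binding <YOUR-DATACATALOG-PROJECT-ID> \
  --member "serviceAccount:looker2dc-sa@<YOUR-DATACATALOG-PROJECT-ID>.iam.gserviceaccount.com" \
  --role "roles/datacatalog.admin"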

2.1.2. Download a JSON key and save it as

  • <YOUR-CREDENTIALS_FILES_FOLDER>/looker2dc-datacatalog-credentials.json
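
As a sketch, the key can be created with gcloud and written straight to that path (the service account address assumes the placeholder name from the previous step):

gcloud iam service-accounts keys create \
  <YOUR-CREDENTIALS_FILES_FOLDER>/looker2dc-datacatalog-credentials.json \
  --iam-account looker2dc-sa@<YOUR-DATACATALOG-PROJECT-ID>.iam.gserviceaccount.com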

2.1.3. Create Looker API3 credentials

The credentials required for API access must be obtained by creating an API3 key on a user account in the Looker Admin console. The API3 key consists of a public client_id and a private client_secret.

The shortcut for the Looker Admin console is https://<YOUR-LOOKER-SERVER>/admin/users/api3_key/

2.1.4. Create a Looker configuration file

File content is described in Looker SDK documentation. Save the file as <YOUR-CREDENTIALS_FILES_FOLDER>/looker2dc-looker-credentials.ini

Please note that this folder and these files will be required in the next steps.
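
A minimal sketch of what that file might look like, written from the shell; the keys follow the Looker SDK ini format, and every value below (server, port, credentials) is a placeholder to replace with your own:

cat > <YOUR-CREDENTIALS_FILES_FOLDER>/looker2dc-looker-credentials.ini <<EOF
[Looker]
# Base URL of your Looker API endpoint; the port depends on your deployment.
base_url=https://<YOUR-LOOKER-SERVER>:19999
# API3 credentials created in step 2.1.3.
client_id=<YOUR-CLIENT-ID>
client_secret=<YOUR-CLIENT-SECRET>
verify_ssl=True
EOF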

2.2. Set environment variables

export GOOGLE_APPLICATION_CREDENTIALS=datacatalog_credentials_file

Replace the value above according to your environment. The Data Catalog credentials file was saved in step 2.1.2.
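
For example, using the folder and file name suggested in step 2.1.2:

export GOOGLE_APPLICATION_CREDENTIALS=<YOUR-CREDENTIALS_FILES_FOLDER>/looker2dc-datacatalog-credentials.json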

3. Run entry point

3.1. Run Python entry point

  • Virtualenv
google-datacatalog-looker-connector \
  --datacatalog-project-id <YOUR-DATACATALOG-PROJECT-ID> \
  --looker-credentials-file looker_credentials_ini_file
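
As an illustration, with a hypothetical project ID and the Looker credentials file from step 2.1.4 (both values are placeholders):

google-datacatalog-looker-connector \
  --datacatalog-project-id my-gcp-project \
  --looker-credentials-file <YOUR-CREDENTIALS_FILES_FOLDER>/looker2dc-looker-credentials.ini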

3.2. Run Docker entry point

docker build --rm --tag looker2datacatalog .
docker run --rm --tty -v <YOUR-CREDENTIALS_FILES_FOLDER>:/data \
  looker2datacatalog \
  --datacatalog-project-id <YOUR-DATACATALOG-PROJECT-ID> \
  --looker-credentials-file /data/looker2dc-looker-credentials.ini
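
If the image does not already point GOOGLE_APPLICATION_CREDENTIALS at the mounted folder, one option is to pass it explicitly; this is a sketch assuming the key file name from step 2.1.2:

docker run --rm --tty -v <YOUR-CREDENTIALS_FILES_FOLDER>:/data \
  -e GOOGLE_APPLICATION_CREDENTIALS=/data/looker2dc-datacatalog-credentials.json \
  looker2datacatalog \
  --datacatalog-project-id <YOUR-DATACATALOG-PROJECT-ID> \
  --looker-credentials-file /data/looker2dc-looker-credentials.ini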

4. Developer environment

4.1. Install and run Yapf formatter

pip install --upgrade yapf

# Auto update files
yapf --in-place --recursive src tests

# Show diff
yapf --diff --recursive src tests

# Set up pre-commit hook
# From the root of your git project.
curl -o pre-commit.sh https://raw.githubusercontent.com/google/yapf/master/plugins/pre-commit.sh
chmod a+x pre-commit.sh
mv pre-commit.sh .git/hooks/pre-commit

4.2. Install and run Flake8 linter

pip install --upgrade flake8
flake8 src tests

4.3. Run Tests

python setup.py test

5. Troubleshooting

If a connector execution hits the Data Catalog quota limit, an error will be raised and logged with the following details, depending on the operation performed (READ/WRITE/SEARCH):

status = StatusCode.RESOURCE_EXHAUSTED
details = "Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute' of service 'datacatalog.googleapis.com' for consumer 'project_number:1111111111111'."
debug_error_string = "{"created":"@1587396969.506556000", "description":"Error received from peer ipv4:172.217.29.42:443","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute' of service 'datacatalog.googleapis.com' for consumer 'project_number:1111111111111'.","grpc_status":8}"

For more information on Data Catalog quota, please refer to: Data Catalog quota docs.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

google-datacatalog-looker-connector-0.5.1.tar.gz (19.2 kB)

Built Distribution

google_datacatalog_looker_connector-0.5.1-py2.py3-none-any.whl (26.7 kB)

File details

Details for the file google-datacatalog-looker-connector-0.5.1.tar.gz.

File metadata

  • Download URL: google-datacatalog-looker-connector-0.5.1.tar.gz
  • Upload date:
  • Size: 19.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.24.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.46.1 CPython/3.7.6

File hashes

Hashes for google-datacatalog-looker-connector-0.5.1.tar.gz
  • SHA256: 9a94d49a62936118e704c795510ad6819ee73bcf37142c05ed0a5c35479e1c2f
  • MD5: 14e64d0a03fb10e7b825e04c65b8126f
  • BLAKE2b-256: 9cc4ced3fb19e7252530f4b907d8c3e53563675725435fc2ade7787362abb87e

See more details on using hashes here.

File details

Details for the file google_datacatalog_looker_connector-0.5.1-py2.py3-none-any.whl.

File metadata

File hashes

Hashes for google_datacatalog_looker_connector-0.5.1-py2.py3-none-any.whl
  • SHA256: edf099409ff1d401c329b096336f73ecf41b0fae1ee94646ffe99eb3909163bd
  • MD5: 6cf58274eea122cc24aa567c38ef6c8f
  • BLAKE2b-256: 8fab3bf202fdc0b6c9c2810b457b92ffa1fddd4c946980818b5c6d24061a29db

See more details on using hashes here.
