google-datacatalog-apache-atlas-connector
Package for ingesting Apache Atlas metadata into Google Cloud Data Catalog, currently supporting below asset types:
- Entity Types -> each Entity Type is converted to a Data Catalog Template with its attribute metadata
- ClassificationDefs -> Each ClassificationDef is converted to a Data Catalog Template
- Entities -> each Entity is converted to a Data Catalog Entry
Entity attributes are converted to Data Catalog Tags. Where Table and Column relationships exist, Columns are converted into the Data Catalog Table schema.
Since even Columns are represented as Apache Atlas Entities, this connector lets users specify the list of Entity Types to be considered in the ingestion process. If you don't want a given type to be created as Data Catalog Entries, use the Entity Types list arg to provide only the types the connector should sync.
Data Catalog does not currently support lineage, so this connector does not use Apache Atlas lineage information. We may revisit this if that changes.
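To make the mapping concrete, the sketch below shows how an Entity Type's attributes could become a Data Catalog Tag Template using the google-cloud-datacatalog client library. It is illustrative only, not the connector's actual code; the entity type, attribute list, project, and location are made-up placeholders.

```python
# Minimal sketch (not the connector's actual code): mapping a hypothetical
# Apache Atlas Entity Type and its attributes to a Data Catalog Tag Template.
from google.cloud import datacatalog_v1

datacatalog = datacatalog_v1.DataCatalogClient()

# Hypothetical Entity Type, as scraped from Atlas's type definitions.
atlas_entity_type = {
    "name": "hive_table",
    "attributeDefs": [
        {"name": "owner", "typeName": "string"},
        {"name": "temporary", "typeName": "boolean"},
    ],
}

template = datacatalog_v1.TagTemplate()
template.display_name = f"Apache Atlas {atlas_entity_type['name']} metadata"

# Map each Atlas attribute to a Tag Template field (type mapping simplified).
primitive = datacatalog_v1.FieldType.PrimitiveType
type_map = {"string": primitive.STRING, "boolean": primitive.BOOL}
for attr in atlas_entity_type["attributeDefs"]:
    template.fields[attr["name"]] = datacatalog_v1.TagTemplateField(
        display_name=attr["name"],
        type_=datacatalog_v1.FieldType(
            primitive_type=type_map.get(attr["typeName"], primitive.STRING)),
    )

datacatalog.create_tag_template(
    parent="projects/my-project/locations/us-central1",  # placeholder
    tag_template_id=f"apache_atlas_{atlas_entity_type['name']}",
    tag_template=template,
)
```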
Disclaimer: This is not an officially supported Google product.
Table of Contents
- 1. Installation
- 2. Environment setup
- 3. Sample Sync application entry point
- 4. Sample Sync Hook application entry point
- 5. Developer environment
- 6. Metrics
- 7. Assumptions
- 8. Troubleshooting
1. Installation
Install this library in a virtualenv using pip. virtualenv is a tool to create isolated Python environments. The basic problem it addresses is one of dependencies and versions, and indirectly permissions.
With virtualenv, it's possible to install this library without needing system install permissions, and without clashing with the installed system dependencies. Make sure you use Python 3.7+.
1.1. Mac/Linux
pip3 install virtualenv
virtualenv --python python3.7 <your-env>
source <your-env>/bin/activate
<your-env>/bin/pip install google-datacatalog-apache-atlas-connector
1.2. Windows
pip3 install virtualenv
virtualenv --python python3.7 <your-env>
<your-env>\Scripts\activate
<your-env>\Scripts\pip.exe install google-datacatalog-apache-atlas-connector
1.3. Install from source
1.3.1. Get the code
git clone https://github.com/GoogleCloudPlatform/datacatalog-connectors-hive.git
cd datacatalog-connectors-hive/google-datacatalog-apache-atlas-connector
1.3.2. Create and activate a virtualenv
pip3 install virtualenv
virtualenv --python python3.7 <your-env>
source <your-env>/bin/activate
2. Environment setup
2.1. Auth credentials
2.1.1. Create a service account and grant it the following roles
- Data Catalog Admin
2.1.2. Download a JSON key and save it as
<YOUR-CREDENTIALS_FILES_FOLDER>/apache-atlas2dc-credentials.json
2.2. Set environment variables
export GOOGLE_APPLICATION_CREDENTIALS=datacatalog_credentials_file  # path to the JSON key from step 2.1.2
export APACHE_ATLAS2DC_DATACATALOG_PROJECT_ID=google_cloud_project_id
export APACHE_ATLAS2DC_HOST=localhost
export APACHE_ATLAS2DC_PORT=21000
export APACHE_ATLAS2DC_USER=my-user
export APACHE_ATLAS2DC_PASS=my-pass
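Optionally, you can smoke-test these variables before running a sync. The sketch below (not part of the connector) assumes Apache Atlas's standard v2 REST endpoint and plain HTTP; adjust the scheme and path for your deployment.

```python
# Quick smoke test for the environment variables above: fetch the type
# definitions from Apache Atlas's REST API using basic authentication.
import os

import requests

base_url = (f"http://{os.environ['APACHE_ATLAS2DC_HOST']}:"
            f"{os.environ['APACHE_ATLAS2DC_PORT']}")
response = requests.get(
    f"{base_url}/api/atlas/v2/types/typedefs",
    auth=(os.environ["APACHE_ATLAS2DC_USER"],
          os.environ["APACHE_ATLAS2DC_PASS"]),
    timeout=30,
)
response.raise_for_status()

# Print the entity type names the connector could sync.
print([t["name"] for t in response.json().get("entityDefs", [])])
```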
3. Sample Sync application entry point
3.1. Run the google-datacatalog-apache-atlas-connector sync script
Executes the full scrape process on Apache Atlas and syncs the metadata to Data Catalog, creating/updating/deleting Entries and Tags.
- Virtualenv
google-datacatalog-apache-atlas-connector sync \
--datacatalog-project-id $APACHE_ATLAS2DC_DATACATALOG_PROJECT_ID \
--atlas-host $APACHE_ATLAS2DC_HOST \
--atlas-port $APACHE_ATLAS2DC_PORT \
--atlas-user $APACHE_ATLAS2DC_USER \
--atlas-pass $APACHE_ATLAS2DC_PASS \
--atlas-entity-types DB,View,Table,hbase_table,hive_db  # optional
3.2. Run Docker entry point
Runs the same full sync process from 3.1, packaged as a Docker container.
docker build --rm --tag apache-atlas2datacatalog .
docker run --rm --tty -v <YOUR-CREDENTIALS_FILES_FOLDER>:/data \
apache-atlas2datacatalog sync \
--datacatalog-project-id $APACHE_ATLAS2DC_DATACATALOG_PROJECT_ID \
--atlas-host $APACHE_ATLAS2DC_HOST \
--atlas-port $APACHE_ATLAS2DC_PORT \
--atlas-user $APACHE_ATLAS2DC_USER \
--atlas-pass $APACHE_ATLAS2DC_PASS \
--atlas-entity-types DB,View,Table,hbase_table,hive_db  # optional
4. Sample Sync Hook application entry point
4.1. Run the google-datacatalog-apache-atlas-connector event-hook script
Executes an incremental scrape process: the connector listens for metadata change events on the Apache Atlas event bus (Kafka) and syncs Data Catalog metadata, creating/updating/deleting Entries and Tags.
- Virtualenv
google-datacatalog-apache-atlas-connector sync-event-hook \
--datacatalog-project-id $APACHE_ATLAS2DC_DATACATALOG_PROJECT_ID \
--atlas-host $APACHE_ATLAS2DC_HOST \
--atlas-port $APACHE_ATLAS2DC_PORT \
--atlas-user $APACHE_ATLAS2DC_USER \
--atlas-pass $APACHE_ATLAS2DC_PASS \
--event-servers my-event-server \
--event-consumer-group-id atlas-event-sync \
--atlas-entity-types DB,View,Table,hbase_table,hive_db  # optional
4.2. Run Docker entry point
docker build --rm --tag apache-atlas2datacatalog .
docker run --rm --tty -v <YOUR-CREDENTIALS_FILES_FOLDER>:/data \
apache-atlas2datacatalog sync-event-hook \
--datacatalog-project-id $APACHE_ATLAS2DC_DATACATALOG_PROJECT_ID \
--atlas-host $APACHE_ATLAS2DC_HOST \
--atlas-port $APACHE_ATLAS2DC_PORT \
--atlas-user $APACHE_ATLAS2DC_USER \
--atlas-pass $APACHE_ATLAS2DC_PASS \
--event-servers my-event-server \
--event-consumer-group-id atlas-event-sync \
--atlas-entity-types DB,View,Table,hbase_table,hive_db  # optional
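For reference, the event hook consumes entity change notifications from the Kafka topic Apache Atlas publishes to (ATLAS_ENTITIES by default). The hypothetical kafka-python consumer below sketches what that looks like; the connector's real message handling and Data Catalog sync logic are more involved.

```python
# Illustrative only: listening to Apache Atlas's notification topic with
# kafka-python. The connector does this (plus the Data Catalog sync) for you.
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "ATLAS_ENTITIES",  # default topic for Atlas entity change notifications
    bootstrap_servers="my-event-server",  # --event-servers
    group_id="atlas-event-sync",          # --event-consumer-group-id
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

for message in consumer:
    # Each notification carries an operation type (e.g. ENTITY_CREATE,
    # ENTITY_UPDATE, ENTITY_DELETE) and the affected entity.
    notification = message.value.get("message", {})
    print(notification.get("operationType"), notification.get("entity"))
```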
5. Developer environment
5.1. Install and run Yapf formatter
pip install --upgrade yapf
# Auto update files
yapf --in-place --recursive src tests
# Show diff
yapf --diff --recursive src tests
# Set up pre-commit hook
# From the root of your git project.
curl -o pre-commit.sh https://raw.githubusercontent.com/google/yapf/master/plugins/pre-commit.sh
chmod a+x pre-commit.sh
mv pre-commit.sh .git/hooks/pre-commit
5.2. Install and run Flake8 linter
pip install --upgrade flake8
flake8 src tests
5.3. Run Tests
python setup.py test
6. Metrics
7. Assumptions
The connector uses basic authentication with the provided user/password credentials. To provide live sync, it can optionally connect to the Kafka instance backing Apache Atlas and listen for metadata change events. It connects directly to the Kafka topic, so make sure it is executed in a secure network.
For stronger security, consider using Kerberos for authentication and Apache Ranger for authorization (see apache-atlas-security). If you need that kind of setup, please open a feature request.
8. Troubleshooting
If a connector execution hits a Data Catalog quota limit, an error is raised and logged with details like the following, depending on the operation performed (READ/WRITE/SEARCH):
status = StatusCode.RESOURCE_EXHAUSTED
details = "Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute' of service 'datacatalog.googleapis.com' for consumer 'project_number:1111111111111'."
debug_error_string =
"{"created":"@1587396969.506556000", "description":"Error received from peer ipv4:172.217.29.42:443","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Quota exceeded for quota metric 'Read requests' and limit 'Read requests per minute' of service 'datacatalog.googleapis.com' for consumer 'project_number:1111111111111'.","grpc_status":8}"
For more information on Data Catalog quotas, please refer to the Data Catalog quota docs.
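If you hit these limits regularly, request a quota increase or throttle your own calls. The sketch below (not connector code) shows one way to back off on RESOURCE_EXHAUSTED errors, assuming the google-api-core exception class:

```python
# Minimal retry-with-backoff sketch for quota errors; shown only to
# illustrate handling RESOURCE_EXHAUSTED, not part of the connector.
import time

from google.api_core.exceptions import ResourceExhausted


def call_with_quota_retry(fn, max_attempts=5, base_delay_secs=60):
    """Calls fn(), backing off when the per-minute quota is exhausted."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except ResourceExhausted:
            if attempt == max_attempts:
                raise
            # Quotas are per minute, so wait for the window to reset.
            time.sleep(base_delay_secs * attempt)
```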