Cognite Python Replicator
Cognite Replicator is a Python package for replicating data across Cognite Data Fusion (CDF) projects. This package is built on top of the Cognite Python SDK.
Copyright 2019 Cognite AS
Prerequisites
In order to start using the Replicator, you need:
- Python 3 (version 3.6 or later)
- Two API keys: one for your source tenant and one for your destination tenant. Never include an API key directly in your code or upload it to GitHub; instead, set the API key as an environment variable.
This is how you set the API keys as environment variables on macOS and Linux:
$ export COGNITE_SOURCE_API_KEY=<your source API key>
$ export COGNITE_DESTINATION_API_KEY=<your destination API key>
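In Python, those keys can then be read back defensively before any client is constructed. The `require_env` helper below is a small illustrative sketch, not part of the replicator package:

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, failing fast if it is unset."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Environment variable {name} is not set")
    return value

# Illustration only: set a placeholder so the lookup succeeds in this sketch
os.environ.setdefault("COGNITE_SOURCE_API_KEY", "<your source API key>")
src_key = require_env("COGNITE_SOURCE_API_KEY")
```

Failing fast with a clear message is preferable to letting an empty key surface later as an opaque authentication error.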
Installation
The replicator is available on PyPI and can also be executed as a standalone script.
To install it as a Python library that can be run from the command line:
pip install cognite-replicator
python -m cognite.replicator
Build and run it as a docker container:
docker build -t cognite-replicator .
docker run -it cognite-replicator
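The container needs the same API keys at runtime. One way to provide them, assuming the variables are already exported in your shell as shown above, is Docker's `-e` pass-through:

```shell
docker run -it \
  -e COGNITE_SOURCE_API_KEY \
  -e COGNITE_DESTINATION_API_KEY \
  cognite-replicator
```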
For Databricks, you can install it on a cluster. First, click Libraries and then Install New. Choose PyPI as the library type and enter cognite-replicator as the package name. Once the library has finished installing, you are ready to replicate!
Usage
Setup as Python library
import os
from cognite.client import CogniteClient
from cognite.replicator import assets, events, time_series, datapoints
SRC_API_KEY = os.environ.get("COGNITE_SOURCE_API_KEY")
DST_API_KEY = os.environ.get("COGNITE_DESTINATION_API_KEY")
PROJECT_SRC = "Name of source tenant"
PROJECT_DST = "Name of destination tenant"
CLIENT_NAME = "cognite-replicator"
BATCH_SIZE = 10000 # this is the max size of a batch to be posted
NUM_THREADS = 10 # this is the max number of threads to be used
CLIENT_SRC = CogniteClient(api_key=SRC_API_KEY, project=PROJECT_SRC, client_name=CLIENT_NAME)
CLIENT_DST = CogniteClient(api_key=DST_API_KEY, project=PROJECT_DST, client_name=CLIENT_NAME, timeout=90)
assets.replicate(CLIENT_SRC, CLIENT_DST)
events.replicate(CLIENT_SRC, CLIENT_DST, BATCH_SIZE, NUM_THREADS)
time_series.replicate(CLIENT_SRC, CLIENT_DST, BATCH_SIZE, NUM_THREADS)
datapoints.replicate(CLIENT_SRC, CLIENT_DST)
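Replication calls against live tenants can fail transiently (network hiccups, rate limits). A simple retry wrapper like the one below, which is a generic sketch and not part of the replicator API, can make a scheduled replication job more robust:

```python
import time

def replicate_with_retry(replicate_fn, *args, retries=3, backoff_seconds=5.0):
    """Call a replicate function, retrying on failure with a fixed backoff."""
    for attempt in range(1, retries + 1):
        try:
            return replicate_fn(*args)
        except Exception as exc:
            if attempt == retries:
                raise
            print(f"Attempt {attempt} failed ({exc}); retrying in {backoff_seconds}s")
            time.sleep(backoff_seconds)

# Usage with the clients above, e.g.:
# replicate_with_retry(assets.replicate, CLIENT_SRC, CLIENT_DST)
```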
Run it from a Databricks notebook
import logging
from cognite.client import CogniteClient
from cognite.replicator import assets, configure_databricks_logger
SRC_API_KEY = dbutils.secrets.get("cdf-api-keys", "source-tenant")
DST_API_KEY = dbutils.secrets.get("cdf-api-keys", "destination-tenant")
CLIENT_SRC = CogniteClient(api_key=SRC_API_KEY, client_name="cognite-replicator")
CLIENT_DST = CogniteClient(api_key=DST_API_KEY, client_name="cognite-replicator")
configure_databricks_logger(log_level=logging.INFO)
assets.replicate(CLIENT_SRC, CLIENT_DST)
Changelog
Wondering about upcoming or previous changes? Take a look at the CHANGELOG.
Contributing
Want to contribute? Check out CONTRIBUTING.