Cognite Python Replicator


Cognite Replicator is a Python package for replicating data across Cognite Data Fusion (CDF) projects. This package is built on top of the Cognite Python SDK.

Copyright 2019 Cognite AS

Prerequisites

To start using the Replicator, you need:

  • Python 3 (>= 3.6)
  • Two API keys: one for your source tenant and one for your destination tenant. Never include an API key directly in the code or upload it to GitHub; instead, set it as an environment variable.

This is how you set the API keys as environment variables on macOS and Linux:

$ export COGNITE_SOURCE_API_KEY=<your source API key>
$ export COGNITE_DESTINATION_API_KEY=<your destination API key>
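On Windows, use `set` instead of `export`. Before constructing any clients, it can be useful to verify that the variables are actually set, so a missing key fails fast with a clear message rather than surfacing later as an authentication error. A minimal sketch using only the standard library (the helper name `get_required_env` is illustrative, not part of the package):

```python
import os

def get_required_env(name: str) -> str:
    """Return the value of an environment variable, failing fast if it is unset or empty."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Environment variable {name} is not set")
    return value

# Example usage (uncomment once the variables are exported):
# src_key = get_required_env("COGNITE_SOURCE_API_KEY")
# dst_key = get_required_env("COGNITE_DESTINATION_API_KEY")
```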

Installation

The replicator is available on PyPI, and can also be executed as a standalone script.

To install and run it from the command line:

pip install cognite-replicator
python -m cognite.replicator config/filepath.yml

If no configuration file is specified, the replicator will use config/default.yml.

Alternatively, build and run it as a Docker container. The image is also available on Docker Hub:

docker build -t cognite-replicator .
docker run -it cognite-replicator

For Databricks, you can install it on a cluster. First, click Libraries and then Install New. Choose PyPI as the library type and enter cognite-replicator as the package. Once the library has finished installing, you are ready to replicate!

Usage

Setup as Python library

import os

from cognite.client import CogniteClient
from cognite.replicator import assets, events, time_series, datapoints

SRC_API_KEY = os.environ.get("COGNITE_SOURCE_API_KEY")
DST_API_KEY = os.environ.get("COGNITE_DESTINATION_API_KEY")
PROJECT_SRC = "Name of source tenant"
PROJECT_DST = "Name of destination tenant"
CLIENT_NAME = "cognite-replicator"
BATCH_SIZE = 10000  # maximum size of a batch to be posted
NUM_THREADS = 10  # maximum number of threads to be used

CLIENT_SRC = CogniteClient(api_key=SRC_API_KEY, project=PROJECT_SRC, client_name=CLIENT_NAME)
CLIENT_DST = CogniteClient(api_key=DST_API_KEY, project=PROJECT_DST, client_name=CLIENT_NAME, timeout=90)

assets.replicate(CLIENT_SRC, CLIENT_DST)
events.replicate(CLIENT_SRC, CLIENT_DST, BATCH_SIZE, NUM_THREADS)
time_series.replicate(CLIENT_SRC, CLIENT_DST, BATCH_SIZE, NUM_THREADS)
datapoints.replicate(CLIENT_SRC, CLIENT_DST)

Run it from a Databricks notebook

import logging

from cognite.client import CogniteClient
from cognite.replicator import assets, configure_databricks_logger

SRC_API_KEY = dbutils.secrets.get("cdf-api-keys", "source-tenant")
DST_API_KEY = dbutils.secrets.get("cdf-api-keys", "destination-tenant")

CLIENT_SRC = CogniteClient(api_key=SRC_API_KEY, client_name="cognite-replicator")
CLIENT_DST = CogniteClient(api_key=DST_API_KEY, client_name="cognite-replicator")

configure_databricks_logger(log_level=logging.INFO)
assets.replicate(CLIENT_SRC, CLIENT_DST)

Changelog

Wondering about upcoming or previous changes? Take a look at the CHANGELOG.

Contributing

Want to contribute? Check out CONTRIBUTING.

