
GoodData Cloud lifecycle automation pipelines

Project description

GoodData Pipelines

A high-level library for automating the lifecycle of GoodData Cloud (GDC).

You can use the package to manage the following resources in GDC:

  1. Provisioning (create, update, delete)
    • User profiles
    • User Groups
    • User/Group permissions
    • User Data Filters
    • Child workspaces (incl. Workspace Data Filter settings)
  2. [PLANNED]: Backup and restore of workspaces
  3. [PLANNED]: Custom fields management
    • Extend the Logical Data Model of a child workspace

If you would rather use a ready-made script than incorporate a library into your own program, have a look at GoodData Productivity Tools.

Provisioning

Each entity can be managed in one of two modes: full load or incremental.

Full load means that the input data represents the full and complete desired state of GDC after the script finishes. For example, the input data for workspace provisioning should list every child workspace you want to exist in GDC. Any workspace present in GDC but not defined in the source data (i.e., your input) will be deleted.
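
For illustration, a full-load input for user provisioning could be a CSV along these lines. The column names are purely illustrative assumptions, not the library's documented schema; the actual contract is defined by the input models used in the code below:

user_id,firstname,lastname,email
jane.doe,Jane,Doe,jane.doe@example.com
john.smith,John,Smith,john.smith@example.com

Per the full-load semantics, these two users would be the complete desired state; any previously provisioned user missing from the file would be deleted.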

Incremental load, on the other hand, treats the source data as instructions for specific changes, e.g., the creation or deletion of a particular workspace. You specify which workspaces to create or delete; any other workspaces already present in GDC remain as they are, ignored by the provisioning script.
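
An incremental input, by contrast, describes only the changes to apply. One common pattern is a per-row deletion flag; the is_deleted column below is an illustrative assumption, not a documented field:

user_id,firstname,lastname,email,is_deleted
new.hire,New,Hire,new.hire@example.com,false
jane.doe,Jane,Doe,jane.doe@example.com,true

Rows with the flag set would be deleted, the rest created or updated, and every user not mentioned in the file would be left untouched.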

The provisioning module exposes Provisioner classes reflecting the different entities. Typical usage involves importing the Provisioner class and the input data model matching the entity and the planned provisioning method:

import logging
import os
from csv import DictReader
from pathlib import Path

# Import the entity's Provisioner class and the corresponding input model
# from the gooddata_pipelines library
from gooddata_pipelines import UserFullLoad, UserProvisioner
from gooddata_pipelines.logger.logger import LogObserver

# Optionally, subscribe a standard Python logger to the LogObserver
logger = logging.getLogger(__name__)
LogObserver().subscribe(logger)

# Create the Provisioner instance - it can also be created from a GDC yaml profile
provisioner = UserProvisioner(
    host=os.environ["GDC_HOSTNAME"], token=os.environ["GDC_AUTH_TOKEN"]
)

# Load the input data from your data source
source_data_path: Path = Path("path/to/some.csv")
source_data_reader = DictReader(source_data_path.read_text().splitlines())
source_data = list(source_data_reader)

# Validate the input data against the full-load model
full_load_data: list[UserFullLoad] = UserFullLoad.from_list_of_dicts(
    source_data
)

# Run the full-load provisioning
provisioner.full_load(full_load_data)
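
An incremental run would follow the same shape, reusing the provisioner and source data above. The names UserIncrementalLoad and incremental_load below are inferred by symmetry with the full-load API and are assumptions, not names confirmed on this page:

# Hypothetical incremental run - the model and method names below are
# inferred from the full-load API by symmetry and may differ in the library
from gooddata_pipelines import UserIncrementalLoad

incremental_data = UserIncrementalLoad.from_list_of_dicts(source_data)
provisioner.incremental_load(incremental_data)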

Ready-made scripts covering the basic use cases can be found in the GoodData Productivity Tools repository.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
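
To install this exact build from PyPI, the usual pip invocation applies; the version pin below matches the files listed on this page:

pip install gooddata-pipelines==1.49.1.dev2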

Source Distribution

gooddata_pipelines-1.49.1.dev2.tar.gz (125.2 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

gooddata_pipelines-1.49.1.dev2-py3-none-any.whl (130.4 kB)

Uploaded Python 3

File details

Details for the file gooddata_pipelines-1.49.1.dev2.tar.gz.

File metadata

  • Download URL: gooddata_pipelines-1.49.1.dev2.tar.gz
  • Upload date:
  • Size: 125.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for gooddata_pipelines-1.49.1.dev2.tar.gz:

  • SHA256: 0d86a75cb7cf10f7a6310c2ca00496204d2b33efb0ea7ce270d4091c3aef124a
  • MD5: b7f24728568d5f630943bac0729e5e6c
  • BLAKE2b-256: cf61358db239415e4165f70e35bf5066c4e12799ca0cba0ac7e5dd26b0dd7e34

See more details on using hashes here.
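
To verify a downloaded archive against the SHA256 digest above, a few lines of standard-library Python suffice; the expected value is copied from the table:

import hashlib
from pathlib import Path

# Expected SHA256 for the source distribution, copied from this page
EXPECTED = "0d86a75cb7cf10f7a6310c2ca00496204d2b33efb0ea7ce270d4091c3aef124a"

digest = hashlib.sha256(
    Path("gooddata_pipelines-1.49.1.dev2.tar.gz").read_bytes()
).hexdigest()
assert digest == EXPECTED, "Checksum mismatch - do not install this file"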

File details

Details for the file gooddata_pipelines-1.49.1.dev2-py3-none-any.whl.

File metadata

  • Download URL: gooddata_pipelines-1.49.1.dev2-py3-none-any.whl
  • Size: 130.4 kB
  • Tags: Python 3

File hashes

Hashes for gooddata_pipelines-1.49.1.dev2-py3-none-any.whl:

  • SHA256: 1ddcdeb112e6f6326446657fe1db758dbcae3d94060c279f158dc11f75c613ee
  • MD5: b1d5039d1d1aca181b87675ea5ff4b57
  • BLAKE2b-256: 7e7d9f505128a7d6acd8897e4223eca54b217e432ea16559d818813b5ee1d8d4

See more details on using hashes here.
