
GoodData Pipelines

A high-level library for automating the lifecycle of GoodData Cloud (GDC).
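The package is published on PyPI; assuming a standard Python environment, it can be installed with pip (the distribution name follows the files published for this package):

pip install gooddata-pipelines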

You can use the package to manage the following resources in GDC:

  1. Provisioning (create, update, delete)
    • User profiles
    • User Groups
    • User/Group permissions
    • User Data Filters
    • Child workspaces (incl. Workspace Data Filter settings)
  2. [PLANNED]: Backup and restore of workspaces
  3. [PLANNED]: Custom fields management
    • extend the Logical Data Model of a child workspace

If you are not interested in incorporating a library into your own program but would like to use a ready-made script, have a look at GoodData Productivity Tools.

Provisioning

Entities can be managed either in a full-load or an incremental way.

Full load means that the input data should represent the complete desired state of GDC after the script has finished. For example, the input data for workspace provisioning would include a specification of every child workspace you want to exist in GDC. Any workspaces present in GDC but not defined in the source data (i.e., your input) will be deleted.

On the other hand, the incremental load treats the source data as instructions for specific changes, e.g., the creation or deletion of a specific workspace. You can specify which workspaces to create or delete, while the rest of the workspaces already present in GDC remain as they are, ignored by the provisioning script.
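To make the distinction concrete, below is a minimal sketch of the reconciliation a full load implies. This illustrates the semantics only, it is not the library's implementation, and the workspace IDs are hypothetical:

# Conceptual illustration of full-load semantics (not library code).
# A full load reconciles GDC against the input: entities missing from GDC
# are created, matching ones are updated, and extras are deleted.
desired = {"ws_alpha", "ws_beta"}   # workspaces defined in your input data
existing = {"ws_beta", "ws_gamma"}  # workspaces currently present in GDC

to_create = desired - existing      # {"ws_alpha"} -> will be created
to_update = desired & existing      # {"ws_beta"}  -> will be updated
to_delete = existing - desired      # {"ws_gamma"} -> will be deleted

# An incremental load instead applies only the changes you list explicitly;
# "ws_gamma" would be left untouched unless you ask for its deletion.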

The provisioning module exposes a Provisioner class for each entity. Typical usage involves importing the Provisioner class together with the input data model matching the entity and the planned provisioning method:

import logging
import os
from csv import DictReader
from pathlib import Path

# Import the Entity Provisioner class and corresponding model from the gooddata_pipelines library
from gooddata_pipelines import UserFullLoad, UserProvisioner

# Optional: you can set up logging and subscribe it to the Provisioner
from utils.logger import setup_logging

setup_logging()
logger = logging.getLogger(__name__)

# Create the Provisioner instance - you can also create the instance from a GDC yaml profile
provisioner = UserProvisioner(
    host=os.environ["GDC_HOSTNAME"], token=os.environ["GDC_AUTH_TOKEN"]
)

# Optional: subscribe to logs
provisioner.logger.subscribe(logger)

# Load your data from your data source
source_data_path: Path = Path("path/to/some.csv")
source_data_reader = DictReader(source_data_path.read_text().splitlines())
source_data = [row for row in source_data_reader]

# Validate your input data with the full-load model
full_load_data: list[UserFullLoad] = UserFullLoad.from_list_of_dicts(
    source_data
)
provisioner.full_load(full_load_data)
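For unattended runs you may want to guard the validation and provisioning steps; below is a minimal sketch using only the classes shown above (the logging and exit behavior are illustrative, not part of the library):

import logging
import sys

from gooddata_pipelines import UserFullLoad, UserProvisioner

logger = logging.getLogger(__name__)


def provision_users(host: str, token: str, rows: list[dict]) -> None:
    """Validate raw input rows and run a single full-load provisioning pass."""
    provisioner = UserProvisioner(host=host, token=token)
    try:
        # Validation happens up front, so a malformed row aborts the run
        # before any change is made in GDC.
        full_load_data = UserFullLoad.from_list_of_dicts(rows)
    except Exception:
        logger.exception("Input data failed validation; nothing was provisioned.")
        sys.exit(1)
    provisioner.full_load(full_load_data)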

Ready-made scripts covering the basic use cases can be found in the GoodData Productivity Tools repository.
