
GoodData Pipelines

A high-level library for automating the lifecycle of GoodData Cloud (GDC).

You can use the package to manage the following resources in GDC:

  1. Provisioning (create, update, delete)
    • User profiles
    • User Groups
    • User/Group permissions
    • User Data Filters
    • Child workspaces (incl. Workspace Data Filter settings)
  2. [PLANNED]: Backup and restore of workspaces
  3. [PLANNED]: Custom fields management
    • Extend the Logical Data Model of a child workspace

If you would rather use a ready-made script than incorporate a library into your own program, have a look at GoodData Productivity Tools.

Provisioning

Each entity can be managed in either full load or incremental mode.

Full load means that the input data represents the full and complete desired state of GDC after the script has finished. For example, the input data for workspace provisioning would include specifications of all child workspaces you want to exist in GDC. Any workspaces present in GDC but not defined in the source data (i.e., your input) will be deleted.
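For illustration, a full load input for workspace provisioning could look like the rows below; the column names here are hypothetical, so consult the package's input data models for the exact schema:

# Hypothetical full load input: every child workspace that should exist
# in GDC is listed; any workspace not listed will be deleted.
full_load_rows = [
    {"workspace_id": "region_eu", "workspace_name": "Region EU", "parent_id": "main_ws"},
    {"workspace_id": "region_us", "workspace_name": "Region US", "parent_id": "main_ws"},
]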

The incremental load, on the other hand, treats the source data as instructions for specific changes, e.g., the creation or deletion of a particular workspace. You specify only the workspaces you want to create or delete; the rest of the workspaces already present in GDC remain as they are, ignored by the provisioning script.
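The corresponding incremental input would list only the workspaces to change, typically with a flag marking rows for deletion; again, the column and flag names below are hypothetical:

# Hypothetical incremental input: only the listed workspaces are touched.
# The is_deleted flag (name assumed) marks rows to delete rather than create.
incremental_rows = [
    {"workspace_id": "region_apac", "workspace_name": "Region APAC", "parent_id": "main_ws", "is_deleted": False},
    {"workspace_id": "region_old", "workspace_name": "Region Old", "parent_id": "main_ws", "is_deleted": True},
]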

The provisioning module exposes a Provisioner class for each entity. Typical usage involves importing the Provisioner class together with the input data model corresponding to the entity and the planned provisioning method:

import logging
import os
from csv import DictReader
from pathlib import Path

# Import the Entity Provisioner class and corresponding model from gooddata_pipelines library
from gooddata_pipelines import UserFullLoad, UserProvisioner
from gooddata_pipelines.logger.logger import LogObserver

# Optionally, subscribe a standard Python logger to the LogObserver
logger = logging.getLogger(__name__)
LogObserver().subscribe(logger)

# Create the Provisioner instance - you can also create the instance from a GDC yaml profile
provisioner = UserProvisioner(
    host=os.environ["GDC_HOSTNAME"], token=os.environ["GDC_AUTH_TOKEN"]
)

# Load your data from your data source
source_data_path: Path = Path("path/to/some.csv")
source_data_reader = DictReader(source_data_path.read_text().splitlines())
source_data = list(source_data_reader)

# Validate your input data with the corresponding data model
full_load_data: list[UserFullLoad] = UserFullLoad.from_list_of_dicts(
    source_data
)

# Run the provisioning
provisioner.full_load(full_load_data)
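
Continuing the example above: the comment on the Provisioner creation mentions a GDC yaml profile. A minimal sketch of that alternative, assuming a constructor analogous to the GoodData Python SDK's create_from_profile (verify the exact name and signature against the package documentation):

# Hypothetical alternative: create the Provisioner from a profiles.yaml file.
# The method name is assumed by analogy with the GoodData Python SDK.
provisioner = UserProvisioner.create_from_profile(
    profile="default",
    profiles_path=Path.home() / ".gooddata" / "profiles.yaml",
)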

Bugs & Requests

Please use the GitHub issue tracker to submit bugs or request features.

Changelog

See GitHub releases for released versions and a list of changes.


Download files

Download the file for your platform.
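
Alternatively, you can install the package directly from PyPI:

pip install gooddata-pipelines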

Source Distribution

gooddata_pipelines-1.50.1.dev2.tar.gz (139.7 kB)

Built Distribution

gooddata_pipelines-1.50.1.dev2-py3-none-any.whl (141.7 kB)

File details

Details for the file gooddata_pipelines-1.50.1.dev2.tar.gz.

File metadata

  • Download URL: gooddata_pipelines-1.50.1.dev2.tar.gz
  • Upload date:
  • Size: 139.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for gooddata_pipelines-1.50.1.dev2.tar.gz:

  • SHA256: f6f9890a15be0d4e896796af4c04ed5c6a4856ae3a92231d695184586b78a409
  • MD5: c2622983ff8021069bc3bfd430ba1f6f
  • BLAKE2b-256: d81eb6bcbeee74662c4e27bc7285c18d7d93f73a6fe21e85533ac6ab996f3350

See the PyPI documentation for more details on using hashes.
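
To verify a downloaded file, you can compute its digest locally and compare it with the published value; a minimal sketch in Python:

import hashlib
from pathlib import Path

# Compare the local SHA256 digest of the downloaded archive with the
# published value listed above.
expected = "f6f9890a15be0d4e896796af4c04ed5c6a4856ae3a92231d695184586b78a409"
archive = Path("gooddata_pipelines-1.50.1.dev2.tar.gz")
digest = hashlib.sha256(archive.read_bytes()).hexdigest()
assert digest == expected, "Downloaded file does not match the published hash"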

File details

Details for the file gooddata_pipelines-1.50.1.dev2-py3-none-any.whl.

File hashes

Hashes for gooddata_pipelines-1.50.1.dev2-py3-none-any.whl:

  • SHA256: 326c59a38e36284a95d0ad9a4cb07b0dcdb8e20478e5485cbe86b0f8f2e04df8
  • MD5: b9a80a2eec9b59ba863d3001fdedf2de
  • BLAKE2b-256: 7bb78a83d153042ad4c797013bba4785a07066e4951d3a2bccf0c26d435e479b

