Data Cloud Custom Code SDK

This package provides a development kit for creating custom data transformations in Data Cloud. It allows you to write your own data processing logic in Python while leveraging Data Cloud's infrastructure for data access and transformation execution, mapping your code onto Data Cloud data structures such as Data Model Objects (DMOs) and Data Lake Objects (DLOs).

More specifically, the SDK gives you the ability to test code locally before pushing it to Data Cloud's remote execution engine, greatly shortening the development cycle.

Use of this project with Salesforce is subject to the TERMS OF USE.

Installation

The SDK can be installed directly from PyPI with pip:

pip install salesforce-data-customcode

You can verify it was installed properly via the CLI:

datacustomcode version

Development Setup

We offer two built-in development interfaces, devcontainers and Jupyter, but you can manually set up any tooling you like.

To get started, use the CLI to initialize a new development environment:

datacustomcode init [DIRECTORY]

This will yield all necessary files to get started:

.
├── Dockerfile
├── README.md
├── requirements.txt
├── requirements-dev.txt
├── payload
│   ├── config.json
│   └── entrypoint.py
└── jupyterlab.sh
  • Dockerfile (Do not update) – Development container emulating the remote execution environment.
  • requirements-dev.txt (Do not update) – Dependencies for the development environment.
  • jupyterlab.sh (Do not update) – Helper script for setting up Jupyter.
  • requirements.txt – Defines the dependencies your code will need in the remote execution environment.
  • payload – This folder is compressed and deployed to the remote execution environment.
    • config.json – Defines permissions on the backend; it can be generated programmatically with the scan CLI command.
    • entrypoint.py – The script that defines your data transformation logic.

API

Your entrypoint script defines its logic using the Client object, which wraps Data Cloud's data access layers.

You should only need the following methods:

  • read_dlo(name) – Read from a Data Lake Object by name
  • read_dmo(name) – Read from a Data Model Object by name
  • write_to_dlo(name, spark_dataframe, write_mode) – Write a Spark dataframe to a Data Lake Object by name
  • write_to_dmo(name, spark_dataframe, write_mode) – Write a Spark dataframe to a Data Model Object by name

For example:

from datacustomcode import Client

client = Client()

sdf = client.read_dlo('my_DLO')
# some transformations
# ...
client.write_to_dlo('output_DLO', sdf, 'append')  # pass the DataFrame and a write mode ('append' here is illustrative)

[!WARNING] Currently we only support reading from DMOs and writing to DMOs, or reading from DLOs and writing to DLOs; a single job cannot mix the two object types.
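
Putting the pieces together, a minimal entrypoint sketch might look like the following (the DLO names, column names, and the 'append' write mode are illustrative assumptions, not part of the SDK):

from pyspark.sql import functions as F

from datacustomcode import Client

client = Client()

# Read the source Data Lake Object into a Spark DataFrame
sdf = client.read_dlo('input_DLO')

# Hypothetical transformation: keep active records and flag them as processed
sdf = sdf.filter(F.col('status__c') == 'active')
sdf = sdf.withColumn('processed__c', F.lit(True))

# Write the transformed DataFrame back to a Data Lake Object
client.write_to_dlo('output_DLO', sdf, 'append')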

CLI

The Data Cloud Custom Code SDK provides a command-line interface (CLI) with the following commands:

Global Options

  • --debug: Enable debug-level logging

Commands

datacustomcode version

Display the current version of the package.

datacustomcode configure

Configure credentials for connecting to Data Cloud.

Options:

  • --profile TEXT: Credential profile name (default: "default")
  • --username TEXT: Salesforce username
  • --password TEXT: Salesforce password
  • --client-id TEXT: Connected App Client ID
  • --client-secret TEXT: Connected App Client Secret
  • --login-url TEXT: Salesforce login URL
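
For example (the values shown are placeholders for your own Connected App credentials):

datacustomcode configure --profile default \
  --username user@example.com \
  --password <PASSWORD> \
  --client-id <CLIENT_ID> \
  --client-secret <CLIENT_SECRET> \
  --login-url https://login.salesforce.com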

datacustomcode deploy

Deploy a transformation job to Data Cloud.

Options:

  • --profile TEXT: Credential profile name (default: "default")
  • --path TEXT: Path to the code directory (default: ".")
  • --name TEXT: Name of the transformation job [required]
  • --version TEXT: Version of the transformation job (default: "0.0.1")
  • --description TEXT: Description of the transformation job (default: "")
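
For example, to deploy the project in the current directory (the job name and description are illustrative):

datacustomcode deploy --name my_transformation --version 0.0.1 --description "My first transformation"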

datacustomcode init

Initialize a new development environment with a template.

Argument:

  • DIRECTORY: Directory to create project in (default: ".")

datacustomcode scan

Scan a Python file to generate a Data Cloud configuration.

Argument:

  • FILENAME: Python file to scan

Options:

  • --config TEXT: Path to save the configuration file (default: same directory as FILENAME)
  • --dry-run: Preview the configuration without saving to a file
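
For example, to preview the configuration generated from the template entrypoint without writing it to disk:

datacustomcode scan payload/entrypoint.py --dry-run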

datacustomcode run

Run an entrypoint file locally for testing.

Argument:

  • ENTRYPOINT: Path to entrypoint Python file

Options:

  • --config-file TEXT: Path to configuration file
  • --dependencies TEXT: Additional dependencies (can be specified multiple times)
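
For example, to run the template entrypoint locally against its generated configuration:

datacustomcode run payload/entrypoint.py --config-file payload/config.json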

Download files

Download the file for your platform.

Source Distribution

salesforce_data_customcode-0.1.2.tar.gz (24.9 kB)

Built Distribution

salesforce_data_customcode-0.1.2-py3-none-any.whl (38.7 kB)

File details

Details for the file salesforce_data_customcode-0.1.2.tar.gz.

File metadata

  • Download URL: salesforce_data_customcode-0.1.2.tar.gz
  • Upload date:
  • Size: 24.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.2 CPython/3.13.3 Darwin/23.6.0

File hashes

Hashes for salesforce_data_customcode-0.1.2.tar.gz:

  • SHA256: 15d0d92c323038cb0141f31dfa6d10617b77127812d8b265b618d77ff5bc5e43
  • MD5: f69e5f47adf8d1a3237b5aad004bd859
  • BLAKE2b-256: ae1be50ac7ebf30016cdd9e64eaa6698d6b0649edb2298930d562f2cbb86d77b

File details

Details for the file salesforce_data_customcode-0.1.2-py3-none-any.whl.

File hashes

Hashes for salesforce_data_customcode-0.1.2-py3-none-any.whl:

  • SHA256: a0385a1c13e2a816ce6e96a806e86abad2dc8c94ee5b7d6f63fbbaf0ba910d4b
  • MD5: c0e50835fead97e5012f10adc52c95e0
  • BLAKE2b-256: ae9f505815b0736abf143ad4a0dc68ab11accd262912106103f74ff9fb43e9e3
