
prepextend

A toolset to expand the functionality of Tableau Prep.

What are the Key Features?

  1. Run a Tableau Prep flow file through a Python API.
  2. Read a Tableau Prep flow file as a dict that gives an overview of all inputs and outputs.
  3. Produce a sequential execution list from a group of Tableau Prep files with dependencies.
  4. Add-on features (require setting up the config file):
    • 4.a. On top of feature 1, retry when an error occurs while running a Tableau Prep file.
    • 4.b. On top of feature 1, notify via Slack when a run fails.
    • 4.c. On top of feature 1, assign a specific version of Tableau Prep to run a specific file.
    • 4.d. On top of feature 1, save the running log.
    • 4.e. On top of feature 1, no need to specify the credential file location anymore.
    • 4.f. On top of feature 3, assign a folder as a pool where the group of Tableau Prep files is located.
    • 4.g. Mark CSV outputs as checkpoints and verify them after running the file to secure data quality.
    • 4.h. On top of feature 4.g, notify via Slack when a checkpoint issue is found.

Please Note!

  • Features 2 & 3 have been tested only within a limited scope:
    • Local file types: Excel, CSV, Hyper
    • Non-local: Postgres DB, Snowflake, Tableau Server
    • If your Tableau Prep files contain connections outside the above scope, you might get an error or an incorrect result.

What's the Restriction?

Currently, only Windows OS is supported.

How to Install?

Use the package manager pip to install.

pip install prepextend

How to Use?

1. Run Tableau Prep file

# returns the running log
from prepextend import flow_run

# Note: on Windows, use raw strings (r"...") or forward slashes so the
# backslashes in paths are not treated as escape sequences.
running_log = flow_run(prep_script = "[Tableau Prep Builder install location]\Tableau Prep Builder <version>\scripts",
                       flow_file = "path\to\[your flow file name].tfl", 
                       credential_file = "path\to\[your credential file name].json"
                       )
# e.g. prep_script = "...\Tableau\Tableau Prep Builder 2020.2\scripts"

2. Read the Tableau Prep file

# returns flow_info 
from prepextend import flow_read

flow_info = flow_read("path\to\[your flow file name].tfl")
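
flow_read returns a plain dict, so it can be inspected with the standard library. A minimal sketch (the exact keys depend on your flow file and are not enumerated here; the path is a placeholder):

from pprint import pprint
from prepextend import flow_read

flow_info = flow_read(r"path\to\[your flow file name].tfl")

# Print the overview of inputs and outputs; the key names depend on the flow file.
pprint(flow_info)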

3. Produce a sequential flow execution list with dependencies resolved.

from prepextend import flows_roadmap

depend_flows = ['...\flow_3.tfl', '...\flow_1.tfl', '...\flow_2.tfl']
# dependency = flow 1 -> flow 2 -> flow 3

target_flows = [
    '...\flow_3.tfl',
    ]

flow_map = flows_roadmap(target_flows, depend_flows)
# flow_map will be ['...\flow_1.tfl', '...\flow_2.tfl', '...\flow_3.tfl']

target_flows = [
    '...\flow_2.tfl',
    ]

flow_map = flows_roadmap(target_flows, depend_flows)
#  flow_map will be ['...\flow_1.tfl', '...\flow_2.tfl']
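
Because flows_roadmap returns the flows in execution order, the resulting list can be fed straight into flow_run from section 1. A minimal sketch, reusing the placeholder paths from above:

from prepextend import flow_run, flows_roadmap

flow_map = flows_roadmap(target_flows, depend_flows)

# Execute each flow in dependency order and collect the running logs.
running_logs = []
for flow_file in flow_map:
    log = flow_run(prep_script = r"[Tableau Prep Builder install location]\Tableau Prep Builder <version>\scripts",
                   flow_file = flow_file,
                   credential_file = r"path\to\[your credential file name].json"
                   )
    running_logs.append(log)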

4. Set config for add-on features

from prepextend import flow_manage

flow_management = flow_manage(general_config, version_assigned)
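
general_config and version_assigned are configuration objects; their exact schema is defined by prepextend's config files and is not documented here. Purely as an illustration, assuming the configs are kept in YAML files (the file names and format below are assumptions, not part of the package API), they might be loaded like this:

import yaml  # requires PyYAML; assumption only -- use whatever format your configs are in

from prepextend import flow_manage

# Hypothetical file names; replace with your own config files.
with open(r"path\to\general_config.yml") as f:
    general_config = yaml.safe_load(f)

with open(r"path\to\version_assigned.yml") as f:
    version_assigned = yaml.safe_load(f)

flow_management = flow_manage(general_config, version_assigned)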

4.a ~ 4.e, 4.h Run a flow with the add-on features

running_log = flow_management.run_flow("path\to\[your flow file name].tfl")

4.f When producing a flow_map, the folder assigned in the config is used as the pool of Tableau Prep files, so there is no need to pass depend_flows.

target_flows = [
    '...\flow_3.tfl',
    ]

flow_map = flow_management.flows_roadmap(target_flows)
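
The two methods chain naturally: build the roadmap from the configured folder pool, then run each flow with the add-on features applied. A minimal sketch using only the calls shown above:

# Assumes flow_management was created as in step 4.
flow_map = flow_management.flows_roadmap(target_flows)

# Run each flow in dependency order; retry, Slack notification, logging, etc.
# are applied according to the config.
for flow_file in flow_map:
    running_log = flow_management.run_flow(flow_file)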

