
spark_plotting_tools



spark_plotting_tools is a Python library for generating dummy (sample) Spark tables from a schema definition.

Installation

The code is packaged for PyPI, so installation consists of running:

pip install spark-plotting-tools --user 
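A quick sanity check after installing is to import the package in a Python shell; the import fails immediately if the installation did not succeed:

import spark_plotting_tools
print(spark_plotting_tools.__name__)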

Usage

Import the dummy-table generator wrappers:

from spark_plotting_tools import generated_dummy_table_artifactory
from spark_plotting_tools import generated_dummy_table_datum
import spark_dataframe_tools
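
The examples below assume an active SparkSession bound to the name spark (as in a notebook). If you do not already have one, a minimal way to create it with plain PySpark (not part of this library) is:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark_plotting_tools_demo").getOrCreate()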


Generated Dummy Table Datum
============================================================
path = "fields_pe_datum2.csv"
table_name = "t_kctk_collateralization_atrb"
storage_zone = "master"
sample_parquet = 10
columns_integer_default = {}
columns_date_default = {"gf_cutoff_date": "2026-01-01"}        # default value assigned to this date column
columns_string_default = {}
columns_decimal_default = {"other_concepts_amount": "500.00"}  # default value assigned to this decimal column

generated_dummy_table_datum(spark=spark,
                            path=path,
                            table_name=table_name,
                            storage_zone=storage_zone,
                            sample_parquet=sample_parquet,
                            partition_colum=["gf_cutoff_date"],
                            columns_integer_default=columns_integer_default,
                            columns_date_default=columns_date_default,
                            columns_string_default=columns_string_default,
                            columns_decimal_default=columns_decimal_default
                           )
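
After generation, one way to confirm the defaults were applied is to read the output back and inspect the affected columns. This is only a sketch: it assumes the dummy parquet lands under DIRECTORY_DUMMY/<table_name>, as in the read-back snippet further below.

dummy_path = "DIRECTORY_DUMMY/" + table_name   # assumed output location
df_check = spark.read.parquet(dummy_path)
df_check.select("gf_cutoff_date", "other_concepts_amount").distinct().show()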
                       



Generated Dummy Table Artifactory
============================================================
path = "lclsupplierspurchases.output.schema"
sample_parquet = 10
columns_integer_default = {}
columns_date_default = {"gf_cutoff_date": "2026-01-01"}
columns_string_default = {}
columns_decimal_default = {"other_concepts_amount": "500.00"}


generated_dummy_table_artifactory(spark=spark,
                                  path=path,
                                  sample_parquet=sample_parquet,
                                  columns_integer_default=columns_integer_default,
                                  columns_date_default=columns_date_default,
                                  columns_string_default=columns_string_default,
                                  columns_decimal_default=columns_decimal_default
                                 )

Read Generated Dummy Table
============================================================
Read the generated dummy data back from the DIRECTORY_DUMMY/<table_name> output folder:

import os, sys

is_windows = sys.platform.startswith('win')
path_directory = os.path.join("DIRECTORY_DUMMY", table_name)
if is_windows:
    # os.path.join uses backslashes on Windows; normalize to forward slashes for Spark
    path_directory = path_directory.replace("\\", "/")

df = spark.read.parquet(path_directory)
df.show2(10)
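
df.show2 appears to come from the spark_dataframe_tools helper imported at the top. If that package is not available, the standard DataFrame API gives an equivalent quick look:

df.printSchema()             # column names and types of the generated table
df.show(10, truncate=False)  # standard alternative to show2(10)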
  

License

Apache License 2.0.

New features in v1.0

Bug fixes

  • choco install visualcpp-build-tools

Reference
