GDMO native classes for standardized interaction with data objects within Azure Databricks. Contains TimeSeriesForecast, APIRequest, Landing, and Delta functions.
gdmo
GDMO native classes for standardized interaction with data objects within Azure Databricks
This custom library gives our engineering team standardized packages that strip away administrative and repetitive tasks from their daily object interactions. The classes currently supported (V0.1.0) are documented below.
Installation
Install this library using pip:
pip install gdmo
Usage
Forecast - TimeSeriesForecast
A standardized way of forecasting a dataset. Provide a dataframe with a Series, a Time, and a Value column, and the class automatically selects a suitable forecasting model and generates the output.
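For illustration, a minimal input dataframe could look like the sketch below; the column names are only assumptions chosen to match the example that follows (one time column, one series column, one value column):
#Hypothetical input data: one row per series per month
input_data = [
    ('2023-11-01', 'Hardware', 120000.0),
    ('2023-12-01', 'Hardware', 135000.0),
    ('2023-11-01', 'Services',  80000.0),
    ('2023-12-01', 'Services',  82500.0),
]
df = spark.createDataFrame(input_data, ['InvoiceDate', 'ProductCategory', 'RevenueUSD'])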
Example usage:
from gdmo import TimeSeriesForecast

forecast_length = 12           #Number of periods to forecast (placeholder value)
lastdatamonth   = '2024-01-01' #Last period containing actual data (placeholder value)

forecaster = TimeSeriesForecast(spark, 'Invoiced Revenue')\
                .set_columns('InvoiceDate', 'ProductCategory', 'RevenueUSD')\
                .set_forecast_length(forecast_length)\
                .set_last_data_point(lastdatamonth)\
                .set_input(df)\
                .set_growth_cap(0.02)\
                .set_use_cap_growth(True)\
                .set_modelselection_breakpoints(12, 24)\
                .set_track_outcome(False)\
                .build_forecast()
forecaster.inspect_forecast()
API - APIRequest
Class to perform a standard API request using the requests library. A user only needs to supply the endpoint, authentication, and method data, and gets the data returned without having to write error handling or understand how to properly build a request.
Example usage:
uri = 'xxxxx' #API endpoint

request = APIRequest(uri)\
            .set_content_type('application/json') \
            .set_header('bearer xxxxx') \
            .set_method('GET') \
            .set_parameters({"Month": "2024-01-01"})\
            .make_request()
response = request.get_json_response()
display(response)
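For further processing in Databricks, the JSON response can be loaded into a Spark dataframe. This is a minimal sketch, assuming the endpoint returns a flat list of JSON records:
#Assumes the response is a flat list of records (list of dicts)
df_response = spark.createDataFrame(response)
display(df_response)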
Tables - Landing
A class for landing API ingests and other data into Azure Data Lake Storage (ADLS). It can currently ingest SharePoint files and JSON (API-sourced) data.
Example usage to ingest files from a SharePoint folder:
environment    = 'xxxxx' #Databricks catalog
Sharepointsite = 'xxxxx' #SharePoint site to read from
UserName       = 'xxxxx' #SharePoint user name
Password       = 'xxxxx' #SharePoint password
Client_ID      = 'xxxxx' #Client ID used for SharePoint authentication
adls_temp      = 'xxxxx' #Temporary ADLS landing location

sharepoint = Landing(spark, dbutils, database="xxx", bronze_table="xxx", catalog=environment, container='xxx')\
                .set_tmp_file_location(adls_temp)\
                .set_sharepoint_location(Sharepointsite)\
                .set_sharepoint_auth(UserName, Password, Client_ID)\
                .set_auto_archive(False)\
                .get_all_sharepoint_files()
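The files landed in the temporary ADLS location can then be checked with standard Databricks utilities; a minimal sketch, assuming adls_temp is a path readable by dbutils:
#List the files landed in the temporary ADLS location
for f in dbutils.fs.ls(adls_temp):
    print(f.path, f.size)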
Example usage to ingest JSON content from an API:
#Sample API request using the APIRequest class
uri = 'xxxxx'
request = APIRequest(uri).make_request()
response = request.get_json_response()
#Initiate the class and tell it where the bronze table is located, load the
#configuration data for that table (required for the delta merge), add the JSON
#to the landing area in ADLS, then put the landed data into a bronze delta
#table in the Databricks catalog.
location = 'xxxxx' #ADLS target folder for the landed JSON
filename = 'xxxxx' #File name to use for the landed JSON

landing = Landing(spark, dbutils, database="xxx", bronze_table="xxx", target_folder=location, filename=filename, catalog=environment, container='xxx')\
            .set_bronze(bronze)\
            .set_config(config)\
            .put_json_content(response)\
            .put_bronze()
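Once put_bronze() completes, the bronze delta table can be inspected with a regular catalog read; a minimal sketch using the placeholder names from above:
#Read back the freshly loaded bronze table (catalog.database.table placeholders)
df_bronze = spark.table(f"{environment}.xxx.xxx")
display(df_bronze)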