Snowpark Session Util
A simple utility for creating Spark and MLflow session objects
Setup
Quick Install
python -m pip install snowpark_session
Build from source
Clone the repository
git clone https://github.com/Broomva/snowpark_session.git
Install the package
cd snowpark_session && make install
Build manually
After cloning, create a virtual environment
conda create -n snowpark_session python=3.8
conda activate snowpark_session
conda install snowflake-snowpark-python pandas
Install the requirements
pip install -r requirements.txt
Run the python installation
python setup.py install
Usage
The deployment requires a .env file created in the local folder:
touch .env
It should have a schema like this:
databricks_experiment_name=''
databricks_experiment_id=''
databricks_host=''
databricks_token=''
databricks_username=''
databricks_password=''
databricks_cluster_id=''
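The package presumably reads these values when a session is created. As a rough illustration of how such a KEY='value' schema can be parsed, here is a minimal, standard-library-only sketch (a library such as python-dotenv would normally do this; `load_env` is a hypothetical helper, not part of the package):

```python
from pathlib import Path

def load_env(path=".env"):
    """Parse simple KEY='value' lines from a .env file into a dict."""
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip("'\"")  # drop surrounding quotes
    return env
```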
from snowpark_session import DatabricksSparkSession, DatabricksMLFlowSession

# Create a Spark session
spark = DatabricksSparkSession().get_session()

# Connect to the MLflow artifact server
mlflow_session = DatabricksMLFlowSession().get_session()
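The wrapper classes above presumably collect the Databricks credentials from the environment before building the session. A hypothetical sketch of that factory pattern, not the package's actual implementation, might look like this (the class name, required keys, and return value are assumptions for illustration):

```python
import os

class EnvSessionFactory:
    """Hypothetical sketch: gathers Databricks credentials from
    environment variables matching the .env schema above."""

    REQUIRED = ("databricks_host", "databricks_token", "databricks_cluster_id")

    def __init__(self):
        # Report all missing variables at once rather than one at a time.
        missing = [k for k in self.REQUIRED if not os.environ.get(k)]
        if missing:
            raise ValueError(f"Missing environment variables: {missing}")
        self.config = {k: os.environ[k] for k in self.REQUIRED}

    def get_session(self):
        # The real package would return a configured Spark or MLflow
        # session here; this sketch just returns the collected config.
        return self.config
```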