Project description
Minimalistic utility library to manage conda environments for PySpark jobs on YARN clusters.
Features
Manage conda environments on PySpark executors so that specific packages can be used on the remote workers without asking admins to install the needed software on the Hadoop cluster.
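The general approach (as reflected in the history below) is to pack the local conda environment into a tar archive, ship it to the executors, and unpack it there before running code that needs those packages. The following is a minimal sketch of that idea using plain PySpark facilities (SparkContext.addFile and SparkFiles); the environment name, paths, and worker counts are illustrative assumptions, and this is not sparkonda's actual API.

```python
import os
import tarfile

from pyspark import SparkContext, SparkFiles

# Hypothetical names/paths for illustration only.
CONDA_ENV_NAME = "my-env"
CONDA_ENV_DIR = os.path.expanduser(f"~/miniconda3/envs/{CONDA_ENV_NAME}")
PACK_FILE = f"{CONDA_ENV_NAME}.tar"

# Pack the environment locally (tar preserves ACLs better than zip).
with tarfile.open(PACK_FILE, "w") as tar:
    tar.add(CONDA_ENV_DIR, arcname=CONDA_ENV_NAME)

sc = SparkContext(appName="conda-env-demo")
sc.addFile(PACK_FILE)  # distribute the pack file to every executor


def unpack_env(_):
    """Runs on each executor: locate the shipped pack file and extract it."""
    local_pack = SparkFiles.get(PACK_FILE)  # resolves the executor-local copy
    if not os.path.isdir(CONDA_ENV_NAME):
        with tarfile.open(local_pack) as tar:
            tar.extractall(os.getcwd())
    return os.uname()[1], sorted(os.listdir(os.getcwd()))


# One task per executor core so every worker process unpacks the environment.
num_executors, cores_per_executor = 2, 2  # assumed cluster size
num_tasks = num_executors * cores_per_executor
print(sc.parallelize(range(num_tasks), num_tasks).map(unpack_env).collect())
```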
Docs
History
0.2.2 (2016-02)
Fix bug when number of cores per executor is larger than 1
Fix list_cwd to include hostname in output
Fix tests to use pack file naming instead of zip file naming
Fix usage docs to include new config for number of cores
Add tests for correct prun-to-number-of-partitions mapping
0.2.1 (2016-01)
Fix documentation, setup and usage methods
Fix travis and setup.py configs
Rename the zip action to pack
Replace zip with tar to better preserve ACLs on conda env files
Add configuration of error level for untar
0.2.0 (2016-01)
Add tests for the distributed version of remote package delivery
Change OS file management to Python-based implementations (zip, rm)
Use SparkFiles to detect files distributed to the workers
0.1.0 (2015-11)
Initial version to manage conda environments on PySpark cluster workers without involving your cluster admins too heavily.
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
File details
Details for the file sparkonda-0.2.2.tar.gz.
File metadata
- Download URL: sparkonda-0.2.2.tar.gz
- Upload date:
- Size: 16.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
File hashes
Algorithm | Hash digest
---|---
SHA256 | b0458fa4a9abec6a42244eb5b157bb3d5312ca90b7162101fd9081daa11d6288
MD5 | b0edc9e7148cdf37738648b6c7c3db10
BLAKE2b-256 | 42a9e060787a4afb5a7efe6c1e59765362711f9819a4d4392b26432deb287fd6
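To confirm that a downloaded copy of the sdist matches the digests above, you can recompute the SHA256 locally. A small sketch, assuming sparkonda-0.2.2.tar.gz sits in the current directory:

```python
import hashlib

EXPECTED_SHA256 = "b0458fa4a9abec6a42244eb5b157bb3d5312ca90b7162101fd9081daa11d6288"

# Hash the downloaded archive and compare against the published digest.
with open("sparkonda-0.2.2.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("match" if digest == EXPECTED_SHA256 else "MISMATCH: " + digest)
```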