Airflow Livy Operators
Lets Airflow DAGs run Spark jobs via Livy:
- Sessions,
- Batches. This mode supports additional verification via the Spark/YARN REST APIs (a sketch of that kind of check follows below).
See this blog post for more information and a detailed comparison of ways to run Spark jobs from Airflow.
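To give a feel for that extra verification, here is a hedged sketch of the kind of status check that can be done against the YARN ResourceManager REST API. It is illustrative only, not the operator's actual code; the URL and application id are placeholders.

# Illustrative sketch only: a status check in the spirit of the batch operator's
# "verify via YARN" mode. The endpoint and the "finalStatus" field come from the
# YARN ResourceManager REST API; the URL and application id below are placeholders.
import requests

def yarn_final_status(resource_manager_url, app_id):
    """Return the final status (SUCCEEDED / FAILED / KILLED) of a YARN application."""
    response = requests.get(f"{resource_manager_url}/ws/v1/cluster/apps/{app_id}", timeout=30)
    response.raise_for_status()
    return response.json()["app"]["finalStatus"]

# Example (placeholder values):
# yarn_final_status("http://localhost:8088", "application_1596226311327_0001")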
Directories and files of interest
- airflow_home/plugins: Airflow Livy operators' code.
- airflow_home/dags: example DAGs for Airflow.
- batches: Spark job code, to be used in Livy batches.
- sessions: Spark code for Livy sessions. You can add templates to the files' contents in order to pass parameters into them (see the illustrative sketch after this list).
- helper.sh: helper shell script. It can be used to run sample DAGs, prepare the development environment and more. Run it to find out what other commands are available.
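To make the sessions item concrete: a session file is plain PySpark code with placeholders that get filled in before submission. The snippet below is purely illustrative - the {file_path} marker is a hypothetical placeholder, and the actual templating convention is whatever the sample files under sessions/ and the session operator use.

# Hypothetical sessions/ file: {file_path} is an illustrative placeholder only.
# In a Livy session a SparkSession is already available; getOrCreate() reuses it.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.read.csv("{file_path}", header=True)
df.createOrReplaceTempView("data")
spark.sql("SELECT COUNT(*) AS row_count FROM data").show()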
How do I...
...run the examples?
Prerequisites:
- Python 3. Make sure it's installed and on $PATH.
- A Spark cluster with Livy. I highly recommend "mocking" one on your machine with my Spark cluster on Docker Compose.
Now,
- Optional (you can skip this step if you're mocking a cluster on your machine): open helper.sh. Inside the init_airflow() function you'll see Airflow Connections for Livy, Spark and YARN. Redefine them as appropriate.
- Define how the sample batch files from this repo are delivered to a cluster:
  - if you're using a docker-compose cluster: redefine the BATCH_DIR variable as appropriate.
  - if you're using your own cluster: modify the copy_batches() function so that it delivers the files to a place accessible by your cluster (could be aws s3 cp etc.).
- Run ./helper.sh up to bring up the whole infrastructure. The Airflow UI will be available at localhost:8888. The credentials are admin/admin.
- Ctrl+C to stop Airflow, then run ./helper.sh down to dispose of any remaining Airflow processes (this shouldn't be required if everything goes well; run it if you can't start Airflow again due to uninformative errors).
... use it in my project?
pip install airflow-livy-operators
This is how you import them:
from airflow_livy.session import LivySessionOperator
from airflow_livy.batch import LivyBatchOperator
See the sample DAGs under airflow_home/dags to learn how to use the operators.
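As a quick orientation, a DAG using the batch operator might look roughly like the sketch below. Treat it as a hedged sketch, not the operators' documented signature: the file and verify_in arguments are assumptions here, and the sample DAGs under airflow_home/dags are the authoritative reference for the real parameter names.

# A minimal sketch, assuming LivyBatchOperator accepts a `file` argument pointing
# at the batch script and a `verify_in` argument for the extra YARN verification.
# Check airflow_home/dags for the operators' actual parameters.
from datetime import datetime

from airflow import DAG
from airflow_livy.batch import LivyBatchOperator

with DAG(
    "livy_batch_example",
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,
) as dag:
    join_files = LivyBatchOperator(
        task_id="join_2_files",
        file="file:///path/to/batches/join_2_files.py",  # assumption: where the cluster sees the script
        verify_in="yarn",  # assumption: also verify the run via the YARN REST API
    )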
... set up the development environment?
Alright, you want to contribute and need to be able to run everything on your machine, along with the usual niceties that come with IDEs (debugging, syntax highlighting).
- ./helper.sh updev runs Airflow with the local operators' code (as opposed to pulling it from PyPI). Useful for development.
- ./helper.sh full runs tests (pytest) with a coverage report (saved to htmlcov/), highlights code style errors (flake8) and reformats all code (black + isort).
- ./helper.sh ci does the same as above, but only checks the code formatting. The same command is run by CI.
- (PyCharm-specific) Point PyCharm to your newly-created virtual environment: go to "Preferences" -> "Project: airflow-livy-operators" -> "Project interpreter", select "Existing environment" and pick the python3 executable from the venv folder (venv/bin/python3).
... debug?
- (PyCharm-specific) Step-by-step debugging with airflow test, as well as running PySpark batch jobs locally (with debugging too), is supported via run configurations under .idea/runConfigurations. You shouldn't have to do anything to use them - just open the folder in PyCharm as a project.
- An example of how a batch can be run on local Spark:
python ./batches/join_2_files.py \
"file:////Users/vpanov/data/vpanov/bigdata-docker-compose/data/grades.csv" \
"file:///Users/vpanov/data/vpanov/bigdata-docker-compose/data/ssn-address.tsv" \
-file1_sep=, -file1_header=true \
-file1_schema="\`Last name\` STRING, \`First name\` STRING, SSN STRING, Test1 INT, Test2 INT, Test3 INT, Test4 INT, Final INT, Grade STRING" \
-file1_join_column=SSN -file2_header=false \
-file2_schema="\`Last name\` STRING, \`First name\` STRING, SSN STRING, Address1 STRING, Address2 STRING" \
-file2_join_column=SSN -output_header=true \
-output_columns="file1.\`Last name\` AS LastName, file1.\`First name\` AS FirstName, file1.SSN, file2.Address1, file2.Address2"
# Optionally append to save result to file
#-output_path="file:///Users/vpanov/livy_batch_example"
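For context on what that invocation actually drives, here is a hedged, simplified sketch of a join_2_files.py-style batch: parse the arguments shown above, read both files with the supplied schemas, join them on the given columns and select the requested output columns. The real script under batches/ is the source of truth; this reimplements the idea, not its exact code.

# Simplified sketch of a "join two files" batch job; argument names mirror the
# invocation above, but this is not the real batches/join_2_files.py.
import argparse

from pyspark.sql import SparkSession

parser = argparse.ArgumentParser()
parser.add_argument("file1_path")
parser.add_argument("file2_path")
parser.add_argument("-file1_sep", default=",")
parser.add_argument("-file1_header", default="true")
parser.add_argument("-file1_schema", required=True)
parser.add_argument("-file1_join_column", required=True)
parser.add_argument("-file2_header", default="true")
parser.add_argument("-file2_schema", required=True)
parser.add_argument("-file2_join_column", required=True)
parser.add_argument("-output_columns", required=True)
parser.add_argument("-output_header", default="true")
parser.add_argument("-output_path", default=None)
args = parser.parse_args()

spark = SparkSession.builder.appName("join_2_files").getOrCreate()

# Read both inputs with the DDL-style schemas passed on the command line.
file1 = spark.read.csv(args.file1_path, sep=args.file1_sep,
                       header=args.file1_header == "true", schema=args.file1_schema)
file2 = spark.read.csv(args.file2_path,
                       header=args.file2_header == "true", schema=args.file2_schema)

file1.createOrReplaceTempView("file1")
file2.createOrReplaceTempView("file2")

# Join on the requested columns and keep only the requested output columns.
result = spark.sql(
    f"SELECT {args.output_columns} FROM file1 "
    f"JOIN file2 ON file1.`{args.file1_join_column}` = file2.`{args.file2_join_column}`"
)

if args.output_path:
    result.write.csv(args.output_path, header=args.output_header == "true")
else:
    result.show()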
TODO
- helper.sh - replace with modern tools (e.g. pipenv + Docker image)
- Disable some of the flake8 flags for cleaner code
File details
Details for the file airflow-livy-operators-0.6.tar.gz.
File metadata
- Download URL: airflow-livy-operators-0.6.tar.gz
- Upload date:
- Size: 12.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.2 importlib_metadata/4.7.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.2 CPython/3.7.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | b6b2c4725fda3e4095992594cbab02e97b2df3612a2529b695958e3957831fe2
MD5 | 2becd7f6ad7095c1adcde1d13d4b551e
BLAKE2b-256 | fe7210747589f96098fb8073b497405f8d0d2a420c56d116c013f1e791a7fc62
File details
Details for the file airflow_livy_operators-0.6-py3-none-any.whl.
File metadata
- Download URL: airflow_livy_operators-0.6-py3-none-any.whl
- Upload date:
- Size: 12.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.2 importlib_metadata/4.7.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.2 CPython/3.7.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | ce1f7e92722acd810f0d3b6f5740242b4234a81230239ea60fbc44538a564efb
MD5 | f7aeb65532e9ae3901c1ea6d69564cad
BLAKE2b-256 | 1c3b70aef90a55e92e6f68f9a03798602f37532cbc2f6a6f65007bf70e1bfd38