Enables execution of Ansible tasks from Airflow

Project description

airflow-ansible-provider

An Airflow Ansible provider

Getting started

Please note that this file is no substitute for reading and understanding the Airflow documentation; it is only intended to provide a quick start for the Ansible provider. Unless an issue relates specifically to the Ansible provider, consult the Airflow documentation.

Install Airflow

Follow the instructions at https://airflow.apache.org/docs/apache-airflow/stable/installation/index.html to install Airflow. If you just want to evaluate the Ansible provider, the simplest path is to install from PyPI and run Airflow on the local machine in a virtual environment.

Install the Ansible provider

If you want to build the package from these sources, install the build module with pip install build, then run python -m build from the root of the repository; this creates a wheel file in the dist subdirectory.

Installing in a local virtual environment

The Ansible provider is available as a package published in PyPI. To install it, switch to the Python environment where Airflow is installed, and run the following command:

pip install airflow-ansible-provider

If you would like to install the provider from a package you built locally, run:

pip install dist/airflow_ansible_provider_xxxxx.whl

Installing in a container

There are a few ways to provide the package:

  • Environment variable: set _PIP_ADDITIONAL_REQUIREMENTS to the requirement specifiers that will be passed to pip install (for example, airflow-ansible-provider)
  • Custom image: create a Dockerfile that adds the pip install command to the base image, and edit the docker-compose file to use "build" (there is a comment in the docker-compose file marking where to change it)
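For the custom-image option, a minimal Dockerfile sketch might look like the following. Note that the base image tag here is an assumption; pin it to the Airflow version you actually run:

```dockerfile
# Hypothetical sketch: extend the official Airflow image with the provider.
# The base image tag is an assumption; match it to your deployment.
FROM apache/airflow:2.9.0
RUN pip install --no-cache-dir airflow-ansible-provider
```

Then switch docker-compose from the stock image to building this Dockerfile, as described by the comment in the compose file.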

Running a DAG with an Ansible provider

See the example files in the src/airflow_ansible_provider/example_dags directory. These DAGs can be modified and placed in your Airflow dags directory.

Mac note: if you are running Airflow standalone on a Mac, there is a known issue with how process forking works. This causes problems with the urllib library used by the operator. To work around it, set NO_PROXY=* in your environment before running Airflow in standalone mode, e.g. export NO_PROXY="*"

Contributing

We welcome your contributions! Please read CONTRIBUTING.md for details on how to submit contributions to this project.

License

This project is licensed under the Apache 2.0 License.


Download files

Download the file for your platform.

Source Distribution

airflow_ansible_provider-0.1.0.tar.gz (19.9 kB)


Built Distribution


airflow_ansible_provider-0.1.0-py3-none-any.whl (24.5 kB)


File details

Details for the file airflow_ansible_provider-0.1.0.tar.gz.

File metadata

File hashes

Hashes for airflow_ansible_provider-0.1.0.tar.gz:
SHA256       36bd9a9cc3c4d58bc470251a963d16422168ee7a12f2f7237182f1721394f2c4
MD5          739b10eee68cd6f0530f53a8ab9b911a
BLAKE2b-256  9d4cd6fbb9d8a8748cdb6434fc6cb48758676166733030c76007808e8cea2ce4

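You can verify a downloaded file against the published hashes before installing. A minimal sketch using Python's standard hashlib module (the file path in the comment is hypothetical):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA256 hex digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the SHA256 published above, e.g.:
# sha256_of("dist/airflow_ansible_provider-0.1.0.tar.gz")
```

Alternatively, pip can enforce hashes itself via a requirements file with --hash entries and pip install --require-hashes.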

File details

Details for the file airflow_ansible_provider-0.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for airflow_ansible_provider-0.1.0-py3-none-any.whl:
SHA256       2715caf6bdc04528e6fbed741cb6ab74d5d15da9d14b32883a9b8d727b29cc0c
MD5          fd42b7344773e1d60339f851ed16a0e5
BLAKE2b-256  20e83a34e6217d24ed8113ba870ba9f7d852530a92ff977482cb76723affee20

