
Enables execution of Ansible tasks from Airflow


airflow-ansible-provider

An Airflow Ansible provider

Getting started

Please note that this document is no substitute for reading and understanding the Airflow documentation; it is only intended as a quick start for the Ansible provider. Unless an issue relates specifically to the Ansible provider, consult the Airflow documentation.

Install Airflow

Follow the instructions at https://airflow.apache.org/docs/apache-airflow/stable/installation/index.html to install Airflow. If you just want to evaluate the Ansible provider, the simplest path is to install Airflow from PyPI and run it on the local machine in a virtual environment.

User's Guide

Detailed usage is covered in the Docs.

Install the Ansible provider

If you want to build the package from source, install the build module with pip install build, then run python -m build from the root of the repository; this creates a wheel file in the dist subdirectory.

Installing in a local virtual environment

The Ansible provider is available as a package published in PyPI. To install it, switch to the Python environment where Airflow is installed, and run the following command:

pip install airflow-ansible-provider

If you would like to install the provider from a package you built locally, run:

pip install dist/airflow_ansible_provider_xxxxx.whl

Installing in a container

There are a few ways to provide the package:

  • Environment variable: set _PIP_ADDITIONAL_REQUIREMENTS to the requirement specifiers that should be passed to pip install (for example, the provider package name)
  • Custom image: create a Dockerfile that runs the pip install command on top of the base image, and edit the docker-compose file to use "build" instead of "image" (there is a comment in the docker-compose file showing where to change it)
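With the stock Airflow docker-compose setup, the environment-variable option might look like the following sketch (file layout is illustrative; place the variable wherever your compose file defines the shared Airflow environment):

```yaml
# docker-compose.yaml fragment (illustrative): have the Airflow containers
# pip-install the provider at startup. Convenient for evaluation, but not
# recommended for production, since the package is reinstalled every time
# a container starts.
x-airflow-common:
  environment:
    _PIP_ADDITIONAL_REQUIREMENTS: airflow-ansible-provider
```

For the custom-image route, a Dockerfile that layers `RUN pip install airflow-ansible-provider` on top of the base Airflow image (with a tag matching your Airflow version) achieves the same result without the per-start reinstall.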

Running a DAG with the Ansible provider

See the example files in the src/airflow_ansible_provider/example_dags directory. These DAGs can be modified and placed in your Airflow dags directory.

Mac note: if you are running Airflow standalone on a Mac, a known issue with process forking causes problems in urllib, which the operator uses. To work around it, set NO_PROXY=* in your environment before running Airflow in standalone mode, e.g. export NO_PROXY="*"

Contributing

We welcome your contributions! Please read CONTRIBUTING.md for details on how to submit contributions to this project.

License

This project is licensed under the Apache 2.0 License.


Download files


Source Distribution

airflow_ansible_provider-0.5.0.tar.gz (23.2 kB)

Built Distribution


airflow_ansible_provider-0.5.0-py3-none-any.whl (26.1 kB)

File details

Hashes for airflow_ansible_provider-0.5.0.tar.gz:

SHA256: 1ecce857681f91aa375505ff15d5c8fe815a216a49a971c3aa8c7f11f503fa0d
MD5: f141c3060b0e0aea00e26f9a3c5d7b6a
BLAKE2b-256: 969ac9ae622bb53366822c8d4677b8c84d64bb2a04590a76c267c056f790cf4f

Hashes for airflow_ansible_provider-0.5.0-py3-none-any.whl:

SHA256: 7d44f183710f7fdd353bff3c461b61c59764d9e38bace300c3e838f27e97e64d
MD5: e9d5824ba85ade1b7c5dc25a421db59f
BLAKE2b-256: 431f23da2fc91a8b3b8b115982b20f6a1c13a8b466df08431a7ff6bd3cdfcff6
