This script takes component URLs as input and pushes data to SnappyFlow under the etl plugin.
Project description
Python sfapmetl feature
Installation
$ pip install sfapmetl
Usage
$ sf-apm-etl <config file path>
- Provide a config file as shown below:
key: <profilekey>
tags:
  Name: <name>
  appName: <appName>
  projectName: <projectName>
metrics:
  plugins:
    - name: etl                  # plugin name is etl
      enabled: true
      document_type: <documentType>
      url:
        job: <job_url>           # component URL of the job table
        stage: <stage_url>       # component URL of the stage table
        task: <task_url>         # component URL of the task table
      authkey: <authentication_key_for_the_urls>
      buffer_path: <path for bufferfile.json>  # path to the buffer file, which keeps a record of unfinished jobs, stages, and tasks
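Before scheduling the cron job, it can help to sanity-check the parsed config (for example, the output of `yaml.safe_load` on the file above). The helper below is a hypothetical sketch, not part of sfapmetl; the field names follow the sample config:

```python
# Hypothetical validator for a parsed sf-apm-etl config dict.
# Field names are taken from the sample config above.

def validate_config(cfg: dict) -> list:
    """Return a list of problems found; an empty list means the config looks usable."""
    problems = []
    if not cfg.get("key"):
        problems.append("missing profile key")
    tags = cfg.get("tags", {})
    for tag in ("Name", "appName", "projectName"):
        if tag not in tags:
            problems.append("missing tag: " + tag)
    plugins = cfg.get("metrics", {}).get("plugins", [])
    etl = next((p for p in plugins if p.get("name") == "etl"), None)
    if etl is None:
        problems.append("no etl plugin configured")
    else:
        if not etl.get("enabled"):
            problems.append("etl plugin is disabled")
        for field in ("document_type", "url", "authkey", "buffer_path"):
            if field not in etl:
                problems.append("etl plugin missing field: " + field)
        for component in ("job", "stage", "task"):
            if component not in etl.get("url", {}):
                problems.append("missing component url: " + component)
    return problems
```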
After this setup, add a cron job to /etc/crontab (applicable to a Linux AWS instance; otherwise run this script via an equivalent scheduler). For example, to run the script every 5 minutes: */5 * * * * root sf-apm-etl
For a tutorial on cron jobs, see https://www.digitalocean.com/community/tutorials/how-to-use-cron-to-automate-tasks-ubuntu-1804
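The buffer file referenced by buffer_path persists unfinished jobs, stages, and tasks between cron runs. A rough sketch of what such read/write helpers could look like is below; the JSON layout is an assumption for illustration, not sfapmetl's documented format:

```python
# Hypothetical buffer-file helpers; the JSON layout is assumed,
# not the package's documented format.
import json
import os

def load_buffer(path):
    """Read the buffer file, or start fresh if it does not exist yet."""
    if not os.path.exists(path):
        return {"job": [], "stage": [], "task": []}
    with open(path) as fh:
        return json.load(fh)

def save_buffer(path, buffer):
    """Persist unfinished job/stage/task records for the next cron run."""
    with open(path, "w") as fh:
        json.dump(buffer, fh)
```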
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution

sfapmetl-0.1.1.tar.gz (8.5 kB)

Built Distribution

sfapmetl-0.1.1-py2.py3-none-any.whl

Hashes for sfapmetl-0.1.1-py2.py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | ded872de502f17d935368f8364bf0c2a7c1a42addca388044c0a1aa0a78747ba
MD5 | af9107eac8cad869de414ea8da74d2f2
BLAKE2b-256 | 28be9e4cea482baa85d17936c954118caf40aaf6c1ef6baa16a27cf6e91214e5