This script takes component URLs as input and pushes data to SnappyFlow under the `etl` plugin.
Python sfapmetl feature
Installation
$ pip install sfapmetl
Usage
$ sf-apm-etl <config file path>
Provide a config file as shown below:
key: <profilekey>
tags:
  Name: <name>
  appName: <appName>
  projectName: <projectName>
metrics:
  plugins:
    - name: etl                # plugin name must be etl
      enabled: true
      document_type: <documentType>
      url:
        job: <job_url>         # component URL of the job table
        stage: <stage_url>     # component URL of the stage table
        task: <task_url>       # component URL of the task table
      authkey: <authentication_key_for_the_urls>
      buffer_path: <path for bufferfile.json>   # path to the buffer file, which keeps a record of unfinished jobs, stages and tasks
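For reference, a filled-in config might look like the following. Every value here is a made-up placeholder (the profile key, names, URLs and path are not real); substitute your own SnappyFlow profile key and component URLs:

key: my-snappyflow-profile-key
tags:
  Name: spark-etl
  appName: my-app
  projectName: my-project
metrics:
  plugins:
    - name: etl
      enabled: true
      document_type: spark_jobs
      url:
        job: https://example.com/components/job
        stage: https://example.com/components/stage
        task: https://example.com/components/task
      authkey: my-auth-key
      buffer_path: /var/lib/sfapmetl/bufferfile.json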
After this setup, add a cron job to /etc/crontab (applicable to Linux AWS instances; otherwise run this script through your platform's scheduler). For example, to run the script every 5 minutes: */5 * * * * root sf-apm-etl
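Since the usage line above passes a config file path to sf-apm-etl, the crontab entry should include it as well. The path /etc/sfapmetl/config.yaml below is a made-up example; use wherever you stored your config:

*/5 * * * * root sf-apm-etl /etc/sfapmetl/config.yaml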
For more on setting up cron jobs, see https://www.digitalocean.com/community/tutorials/how-to-use-cron-to-automate-tasks-ubuntu-1804
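Because the script runs periodically under cron, the file at buffer_path carries state between runs: jobs, stages and tasks that were still unfinished at the last poll get re-checked on the next run. The exact schema sf-apm-etl writes is not documented here, so the sketch below only illustrates the general load/save pattern, with an assumed (illustrative) layout of one list per record type:

```python
import json
import os

# NOTE: the {"job": [], "stage": [], "task": []} layout is an assumption
# for illustration, not the schema sf-apm-etl actually uses.

def load_buffer(path):
    """Return the saved record of unfinished work, or an empty one."""
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return {"job": [], "stage": [], "task": []}

def save_buffer(path, buffer):
    """Persist unfinished job/stage/task records for the next cron run."""
    with open(path, "w") as f:
        json.dump(buffer, f)
```

On each run the script would load the buffer, retry anything listed in it, append any newly seen unfinished items, and save it back before exiting.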