tap-pagerduty

Project description



A Singer tap for extracting data from the Pagerduty REST API v2.


Since package dependencies tend to conflict between various taps and targets, Singer recommends installing taps and targets into their own isolated virtual environments:

Install Pagerduty Tap

$ cd tap-pagerduty
$ python3 -m venv ~/.venvs/tap-pagerduty
$ source ~/.venvs/tap-pagerduty/bin/activate
$ pip3 install tap-pagerduty
$ deactivate

Install Singer Target

$ python3 -m venv ~/.venvs/target-stitch
$ source ~/.venvs/target-stitch/bin/activate
$ pip3 install target-stitch
$ deactivate


The tap accepts a JSON-formatted configuration file as an argument. This configuration file has three required fields:

  1. token: A valid Pagerduty REST API key.
  2. email: A valid email address to be inserted into the From header of the HTTP request headers.
  3. since: A date to be used as the default since parameter for all API endpoints that support that parameter.

A bare-bones Pagerduty configuration file may look like the following:

  {
    "token": "foobarfoobar",
    "email": "",
    "since": "2019-01-01"
  }

Additionally, you may specify more granular configurations for individual streams. Each key under a stream should represent a valid API request parameter for that endpoint. A more fleshed-out configuration file may look similar to the following:

  "token": "foobarfoobar",
  "email": "",
  "since": "2019-08-01",
  "streams": {
    "incidents": {
      "since": "last_status_change_at>=2019-08-01",
      "sort_by": "created_at:asc"


The current version of the tap syncs three distinct Streams:

  1. Incidents: (Endpoint, Schema)
  2. Notifications: (Endpoint, Schema)
  3. Services: (Endpoint, Schema)


Singer taps describe the streams and schemas they support via a Discovery process. You can run the Pagerduty tap in Discovery mode by passing the --discover flag at runtime:

$ ~/.venvs/tap-pagerduty/bin/tap-pagerduty --config=config/pagerduty.config.json --discover

The tap will write a Catalog to stdout. To save the Catalog, simply redirect it to a file:

$ ~/.venvs/tap-pagerduty/bin/tap-pagerduty --config=config/pagerduty.config.json --discover > catalog.json
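The generated Catalog lists each stream's schema and metadata. Per the Singer convention, a stream is replicated during sync only if it is marked as selected in its metadata. A trimmed, illustrative catalog entry for the incidents stream (the schema body is elided and the exact metadata produced by this tap may differ) might look like:

```json
{
  "streams": [
    {
      "tap_stream_id": "incidents",
      "schema": {},
      "metadata": [
        {
          "breadcrumb": [],
          "metadata": {
            "selected": true
          }
        }
      ]
    }
  ]
}
```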

Sync Locally

Running a tap in Sync mode will extract data from the various Streams. In order to run a tap in Sync mode, pass a configuration file and catalog file:

$ ~/.venvs/tap-pagerduty/bin/tap-pagerduty --config=config/pagerduty.config.json --catalog=catalog.json

The tap will emit occasional State messages. You can persist State between runs by redirecting State to a file:

$ ~/.venvs/tap-pagerduty/bin/tap-pagerduty --config=config/pagerduty.config.json --catalog=catalog.json >> state.json
$ tail -1 state.json > state.json.tmp
$ mv state.json.tmp state.json
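For reference, each State message is a single JSON line on stdout. Per the Singer specification it has a type of STATE and a value payload; the bookmark keys below are illustrative, not necessarily this tap's exact schema:

```json
{"type": "STATE", "value": {"bookmarks": {"incidents": {"since": "2019-08-02T00:00:00Z"}}}}
```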

To pick up from where the tap left off on subsequent runs, simply supply the State file at runtime:

$ ~/.venvs/tap-pagerduty/bin/tap-pagerduty --config=config/pagerduty.config.json --catalog=catalog.json --state=state.json >> state.json
$ tail -1 state.json > state.json.tmp
$ mv state.json.tmp state.json

Sync to Stitch

You can also send the output of the tap to Stitch Data for loading into a data warehouse. To do this, first create a JSON-formatted configuration file for Stitch. This configuration file has two required fields:

  1. client_id: The ID associated with the Stitch Data account you'll be sending data to.
  2. token: The token associated with the specific Import API integration within the Stitch Data account.

An example configuration file will look as follows:

  "client_id": 1234,
  "token": "foobarfoobar"

Once the configuration file is created, simply pipe the output of the tap to the Stitch Data target and supply the target with the newly created configuration file:

$ ~/.venvs/tap-pagerduty/bin/tap-pagerduty --config=config/pagerduty.config.json --catalog=catalog.json --state=state.json | ~/.venvs/target-stitch/bin/target-stitch --config=config/stitch.config.json >> state.json
$ tail -1 state.json > state.json.tmp
$ mv state.json.tmp state.json
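The tail/mv rotation used after each run above can be tried in isolation. A self-contained illustration with a throwaway state file (the State lines below are fabricated for the example):

```shell
# Simulate a state file to which the tap appended three State lines.
printf '%s\n' \
  '{"bookmarks": {}}' \
  '{"bookmarks": {"incidents": "2019-08-01"}}' \
  '{"bookmarks": {"incidents": "2019-08-02"}}' > state.json

# Keep only the most recent State line for the next run.
tail -1 state.json > state.json.tmp
mv state.json.tmp state.json
```

After rotation, state.json contains a single line: the last State message emitted, which is exactly what the tap needs on its next invocation.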


The first step to contributing is getting a copy of the source code. First, fork tap-pagerduty on GitHub. Then, cd into the directory where you want your copy of the source code to live and clone the source code:

$ git clone

Now that you have a copy of the source code on your local machine, you can leverage Pipenv and the corresponding Pipfile to install all of the development dependencies within a virtual environment:

$ pipenv install --three --dev

This command will create an isolated virtual environment for your tap-pagerduty project and install all the development dependencies defined within the Pipfile inside of the environment. You can then enter a shell within the environment:

$ pipenv shell

Alternatively, you can run individual commands within the environment without entering the shell:

$ pipenv run <command>

For example, to sort imports with isort and lint with flake8 before committing changes, run the following commands:

$ pipenv run make isort
$ pipenv run make flake8

You can also run the entire testing suite before committing using tox:

$ pipenv run tox

Finally, you can run your local version of the tap within the virtual environment using a command like the following:

$ pipenv run tap-pagerduty --config=config/pagerduty.config.json --catalog=catalog.json

Once you've confirmed that your changes work and the testing suite passes, feel free to put out a PR!


Files for tap-pagerduty, version 0.1.0rc2:

  * tap_pagerduty-0.1.0rc2-py2.py3-none-any.whl (22.3 kB, Wheel, py2.py3)
  * tap_pagerduty-0.1.0rc2.tar.gz (11.4 kB, Source)
