
Kurtosis-based P and S wave picker

Project description

ps_picker

Seismological P- and S- wave picker using the modified Kurtosis method

Python port of the picker described in Baillard et al., 2014

Debugging information is saved to the local file run_{datetime}.log

Methodology

The picker is based on the Kurtosis, but also uses energy levels, polarity, clustering and phase association in a three-step process:

Step 1: define a global pick window

The Kurtosis is calculated for all stations. The global window surrounds the most densely clustered region of triggers.
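As a rough illustration of the underlying quantity (a minimal sketch using numpy and scipy, not the package's actual implementation), a sliding-window kurtosis characteristic function jumps when an impulsive arrival enters the window:

```python
import numpy as np
from scipy.stats import kurtosis

def kurtosis_cf(trace, win_samples):
    """Sliding-window kurtosis characteristic function.

    An impulsive arrival entering the trailing window makes the sample
    distribution heavy-tailed, so the kurtosis jumps near the onset.
    """
    cf = np.zeros(len(trace))
    for i in range(win_samples, len(trace)):
        cf[i] = kurtosis(trace[i - win_samples:i])
    return cf

# Synthetic example: Gaussian noise with an impulsive "arrival" at sample 500
rng = np.random.default_rng(0)
sig = rng.normal(size=1000)
sig[500:520] += 10.0
cf = kurtosis_cf(sig, win_samples=100)
onset = int(np.argmax(cf))  # peaks near the synthetic onset
```

Triggers on such characteristic functions, gathered across all stations, are what the global window is fitted around.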

Step 2: pick P and S arrivals on each station individually

For each station:

  • Calculate the Kurtosis over coarse to fine scales.
  • Identify candidates on the coarse scale and refine their times using the finer scales.
  • Choose P- and S-candidates based on the signal-to-noise level of each pick.
  • Verify the candidates using the waveform polarity, if possible. Polarity is only used if one of the picks has a dip of > 30 degrees.
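The coarse-to-fine refinement can be sketched with a toy helper (hypothetical names and synthetic data; the real picker also weighs SNR and handles multiple candidates):

```python
import numpy as np

def refine_pick(cf_coarse, cf_fine, half_width):
    """Locate the strongest candidate on the coarse characteristic function,
    then refine its time within +/- half_width samples on the finer scale."""
    coarse_idx = int(np.argmax(cf_coarse))
    lo = max(0, coarse_idx - half_width)
    hi = min(len(cf_fine), coarse_idx + half_width + 1)
    return lo + int(np.argmax(cf_fine[lo:hi]))

# Coarse CF peaks at sample 50; the fine CF sharpens the onset to 48
cf_coarse = np.zeros(100)
cf_coarse[50] = 1.0
cf_fine = np.zeros(100)
cf_fine[48] = 1.0
cf_fine[90] = 2.0  # larger peak, but outside the search window, so ignored
print(refine_pick(cf_coarse, cf_fine, half_width=5))  # → 48
```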

Step 3: associate picks

  • Calculate origin times for each trace, based on the P-S delay and a simple velocity model (could I use a single Vp/Vs value?)
  • If at least 3 origin times are clustered, use their average origin time to validate all candidates, possibly dipping into the pool of unused candidates for replacement P and S picks
  • If fewer than 3 origin times are clustered, reject bad P- and S-picks based on clustering of P-pick times, S-pick times and P-S delays
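The origin-time calculation in the first bullet can be sketched with a single Vp/Vs ratio (the simplification the author wonders about); the station times below are synthetic:

```python
import numpy as np

def origin_times(t_p, t_s, vp_vs=1.73):
    """Origin time from the P-S delay, assuming a single Vp/Vs ratio:
    t0 = tP - (tS - tP) / (Vp/Vs - 1)"""
    t_p = np.asarray(t_p, dtype=float)
    t_s = np.asarray(t_s, dtype=float)
    return t_p - (t_s - t_p) / (vp_vs - 1.0)

# Three consistent stations (t0 = 5.0 s) plus one outlier
t0 = origin_times([10.0, 12.0, 14.0, 9.0], [13.65, 17.11, 20.57, 20.0])
clustered = np.abs(t0 - np.median(t0)) < 1.0  # [True, True, True, False]
```

With three clustered origin times, their average would validate the candidates; the outlying station's picks would be re-examined.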

Database and waveform files

These are assumed to follow the SEISAN structure:

  • Database files: NORDIC format, in database_path_in/YEAR/MONTH/ (except run_one, for which the file may be local)
  • Waveform files: one miniseed file per event. Filename is read from the database file and assumed to start with YEAR-MONTH. File is read from waveform_path_in/YEAR/MONTH/
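As an illustration of that layout (pspicker builds these paths internally; the helper name here is hypothetical):

```python
from pathlib import Path

def seisan_month_dir(base, year, month):
    """Build the YEAR/MONTH subdirectory used by the SEISAN layout."""
    return Path(base) / f"{year:04d}" / f"{month:02d}"

print(seisan_month_dir("/SEISAN/MAYOBS/REA/MAYOB", 2019, 5))
# → /SEISAN/MAYOBS/REA/MAYOB/2019/05
```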

Example workflow

Start by autopicking a few events, with all bells and whistles on:

To pick one event from a database in /SEISAN/MAYOBS:

from pspicker import PSPicker
picker = PSPicker('parameters_C.yaml', '/SEISAN/MAYOBS/WAV/MAYOB',  '/SEISAN/MAYOBS/REA/MAYOB')
picker.run_one('19-0607-59L.S201905', plot_global=True, plot_stations=True, log_level='verbose')

Look at all of the plots and verify that the picks and association are as you expect. If not, change the parameters and run again.

Next, pick several events with only the global plots on

The detailed ("bells and whistles") output will still be saved to a log file named run_{DATETIME}.log

To pick events from May 5th to 25th in the same database:

from pspicker import PSPicker
picker = PSPicker('parameters_C.yaml', '/SEISAN/MAYOBS/WAV/MAYOB',  '/SEISAN/MAYOBS/REA/MAYOB')
picker.run_many('20190505', '20190525', plot_global=True)

Finally, run the whole database without plots

(run_{DATETIME}.log is always created)

To pick events from May 26th 2019 to May 1st 2020:

from pspicker import PSPicker
picker = PSPicker('parameters_C.yaml', '/SEISAN/MAYOBS/WAV/MAYOB', '/SEISAN/MAYOBS/REA/MAYOB')
picker.run_many('20190526', '20200501')

The three main methods:

def __init__(self, parm_file, wav_base_path, database_path_in,
             database_path_out='Sfile_directory', database_format='NORDIC'):
    """
    :param parm_file: path/name of the parameter file
    :param wav_base_path: absolute basepath to the waveform files (just before
                          the YEAR/MONTH subdirectories)
    :param database_path_in: absolute basepath to the database/catalog file(s)
                             (just before the YEAR/MONTH subdirectories)
    :param database_path_out: path to output database files
    :param database_format: 'NORDIC' is the only choice for now
        'NORDIC': Use SEISAN conventions for waveform  and database files
                  (naming, and location in YEAR/MONTH subdirectories)
    """
def run_one(self, database_filename, plot_global=True, plot_stations=False,
            assoc=None, log_level="verbose", plot_debug=None):
    """
    Picks P and S arrivals on one waveform, using the Kurtosis

    Information in the database file will be appended with the picks.
    :param database_filename: database file to read
    :param plot_global: show global and overall pick plots
    :param plot_stations: show individual station plots
    :param assoc: Associator object (used by run_many())
    :param log_level: console log level (choices = 'debug', 'verbose',
        'info', 'warning', 'error', 'critical'), default='verbose'
    :param plot_debug: show some debugging plots
    """
def run_many(self, start_date, end_date, plot_global=False,
             plot_stations=False, ignore_fails=False, log_level='info'):
    """
    Loops over events in a date range

    :param start_date: "YYYYMMDD" or "YYYYMMDDHHMM" of first data to process
    :param end_date: "YYYYMMDD" of last data to process
    :param plot_global: show global and overall pick plots
    :param plot_stations: show individual station plots
    :param ignore_fails: keep going if one run fails
    :param log_level: console log level (choices = 'debug', 'verbose',
                      'info', 'warning', 'error', 'critical'), default='info'        
    """

Parameter and response files

Are documented here

To get the same results as with the old Matlab program, set the following values:

  • set association:method to "arrival_time"
  • set station_parameters:{type}:max_candidates to 2
  • set SNR:threshold_parameter to 0.2
  • set SNR:max_threshold_crossings to 5
  • set global_window:max_candidates to 2
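Assuming the nesting implied by the colon-separated names above, the corresponding parameter-file fragment might look like this sketch (the key layout and the station type name "SP" are guesses, not the documented schema — see the parameter file documentation for the authoritative format):

```yaml
association:
  method: "arrival_time"
station_parameters:
  SP:                 # hypothetical station type; one block per {type}
    max_candidates: 2
SNR:
  threshold_parameter: 0.2
  max_threshold_crossings: 5
global_window:
  max_candidates: 2
```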

Event amplitudes

Event amplitude calculations need accurate instrument responses. The instrument response filename(s) are given in the parameter file. If you have a StationXML file, you can make a pspicker-compatible json_pz file like this:

paz = PAZ.read_stationxml(filename, channel=xxx[, station=xxxx])
paz.write_json_pz(pz_filename)

If you have a response in another format that you can read in using obspy, you can output it to a pspicker-compatible json_pz file like this:

paz = PAZ.from_obspy_response(resp)
paz.write_json_pz(pz_filename)

In both cases, you can look at the response using paz.plot(min_freq=xxx), or compare it to the obspy response using:

fig = resp.plot(min_freq=xxx, label='obspy', show=False)
paz = PAZ.from_obspy_response(resp)
paz.plot(min_freq=xxx, axes=fig.axes, label='PAZ', sym='g.')

To Do

  • Add event location-based acceptance of solitary P- and S- candidates
  • In P-, S- and P-S clustering stage, allow unused candidates to be substituted for rejected picks
  • Dedicated To Do file

Also see the profiling file

Download files


Source Distribution

pspicker-0.5.1.tar.gz (63.5 kB)


Built Distribution


pspicker-0.5.1-py3-none-any.whl (70.9 kB)


File details

Details for the file pspicker-0.5.1.tar.gz.

File metadata

  • Download URL: pspicker-0.5.1.tar.gz
  • Upload date:
  • Size: 63.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/51.0.0.post20201207 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.8.5

File hashes

Hashes for pspicker-0.5.1.tar.gz:

  • SHA256: 5c4ef41cc317180838bdf916ae73c13023a602c1edf47da7e949621ab4c483d1
  • MD5: cc80dba0441a6f0ab6e39880b3438215
  • BLAKE2b-256: 2671b3cf9726e1d2a512d3dd29cfd297bcbbcba6140cad5136bbb18c10bda037


File details

Details for the file pspicker-0.5.1-py3-none-any.whl.

File metadata

  • Download URL: pspicker-0.5.1-py3-none-any.whl
  • Upload date:
  • Size: 70.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/51.0.0.post20201207 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.8.5

File hashes

Hashes for pspicker-0.5.1-py3-none-any.whl:

  • SHA256: f3374da3c45e18cd5eac1f3be1edec5d687a7e0f02f3e02556183318d058c601
  • MD5: 463574f86f8449f236e673fdbeb0fceb
  • BLAKE2b-256: 73faec92e7dfda6e1a45d6db6d1da2d6c48b5d35a3bb269958272054dceaefe5

