
dastools: Tools to work with data generated by DAS systems



Tools to work with data generated by DAS systems.

Overview

This package provides a set of tools to read, manipulate and convert seismic waveforms generated by DAS systems, in particular those produced by Silixa (TDMS format) and OptoDAS/Alcatel (HDF5).

dasconv

This utility lets you convert and manipulate seismic waveforms in TDMS format and export them into MiniSEED.

Data acquired from experiments with DAS systems are usually stored in one folder. Files within this folder have names indicating the experiment and the start time of the waveforms saved. An example of the files generated in a test experiment is shown below.

$ ls -l
total 1577352
-rwxrwxrwx  1 user  staff   49965056 May  8 09:38 default_UTC_20190508_093735.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:38 default_UTC_20190508_093805.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:39 default_UTC_20190508_093835.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:39 default_UTC_20190508_093905.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:40 default_UTC_20190508_093935.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:40 default_UTC_20190508_094005.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:41 default_UTC_20190508_094035.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:41 default_UTC_20190508_094105.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:42 default_UTC_20190508_094135.409.tdms

Here, default is the name of the experiment and the rest encodes the start time of the data, following the format experiment_TZ_YYYYMMDD_HHmmss.fff.tdms.
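
This naming convention can also be parsed programmatically. The sketch below is illustrative only (it is not part of dastools; the helper name is made up) and simply splits such a filename into the experiment name, the timezone and the start time.

from datetime import datetime

def parse_tdms_name(name: str):
    """Split 'experiment_TZ_YYYYMMDD_HHmmss.fff.tdms' into its parts."""
    base = name[:-len('.tdms')]                        # drop the extension
    experiment, tz, date, time = base.rsplit('_', 3)   # experiment may contain '_'
    start = datetime.strptime(date + time, '%Y%m%d%H%M%S.%f')
    return experiment, tz, start

print(parse_tdms_name('default_UTC_20190508_093735.409.tdms'))
# ('default', 'UTC', datetime.datetime(2019, 5, 8, 9, 37, 35, 409000))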

dasconv also provides a TDMS class, which requires one mandatory parameter to be instantiated: filename. This is actually the experiment name, i.e. the prefix with which all file names in the containing folder start. A detailed explanation of how to use it in your own programs can be found in the documentation.
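
As a minimal sketch of how the class could be instantiated (only the mandatory filename parameter is documented above; the import path and the directory keyword below are assumptions, so check the documentation for the actual interface):

# 'default' is the experiment name, i.e. the common prefix of all TDMS
# files in the data folder. The import path and the 'directory' keyword
# are assumptions; see the dastools documentation for the real API.
from dastools.input.tdms import TDMS

data = TDMS('default', directory='/home/user/test')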

A typical help message from dasconv looks like the following:

usage: dasconv [-h] [-l {CRITICAL,ERROR,WARNING,INFO,DEBUG}] [-d DIRECTORY]
               [--start START] [--end END] [--chstart CHSTART]
               [--chstop CHSTOP] [--chstep CHSTEP] [--decimate {1,5}]
               [-N NETWORK] [-C CHANNEL] [-o {SDS,StreamBased,StreamBasedHour}]
               [--metadata] [-V] filename

Read, manipulate and convert seismic waveforms generated by a DAS system.

positional arguments:
  filename              Experiment to read and process. It is usually the
                        first part of the filenames.

optional arguments:
  -h, --help            show this help message and exit
  -l {CRITICAL,ERROR,WARNING,INFO,DEBUG}, --loglevel {CRITICAL,ERROR,WARNING,INFO,DEBUG}
                        Verbosity in the output.
  -d DIRECTORY, --directory DIRECTORY
                        Directory where files are located (default: ".")
  --start START, --starttime START
                        Start of the selected time window. Format:
                        2019-02-01T00:01:02.123456Z
  --end END, --endtime END
                        End of the selected time window. Format:
                        2019-02-01T00:01:02.123456Z
  --chstart CHSTART     First channel to export
  --chstop CHSTOP       Last channel to export
  --chstep CHSTEP       Step between channels in the selection
  --decimate {1,5}      Factor by which the sampling rate is lowered by
                        decimation.
  -N NETWORK, --network NETWORK
                        Network code to store in the miniseed header (default: "XX")
  -C CHANNEL, --channel CHANNEL
                        Channel code to store in the miniseed header (default: "FSF")
  -o {SDS,StreamBased,StreamBasedHour}, --outstruct {SDS,StreamBased,StreamBasedHour}
                        Available options are [SDS, StreamBased, StreamBasedHour]
  --metadata            Read and display the metadata from the TDMS files
  -V, --version         show program's version number and exit

Examples

Export waveforms from channels 800, 802 and 804 starting at 2019-05-08T09:37:35.409000 until 2019-05-08T09:38:05.400000 and save them in MiniSEED format.

dasconv -d /home/user/test/ --start "2019-05-08T09:37:35.409000" --end "2019-05-08T09:38:05.400000" default --chstart 800 --chstop 805 --chstep 2

Export waveforms from channels 0 and 1 from the beginning of the measurements until 2019-05-08T09:32:15 and save them in MiniSEED format.

dasconv -d /home/user/test/ --endtime "2019-05-08T09:32:15" default --chstart 0 --chstop 1

Export waveforms from channels 0 to 4 from the beginning of the measurements until 2019-05-08T09:32:15. The waveforms will be exported to MiniSEED format after being decimated by a factor of 5 (e.g. from 1000Hz to 200Hz).

dasconv -d /home/user/test/ --endtime "2019-05-08T09:32:15" default --chstart 0 --chstop 4 --decimate 5

tdmsws (experimental)

tdmsws is a stand-alone implementation of the FDSN Dataselect web service that can serve MiniSEED data extracted from a folder with TDMS files.

A typical help message from tdmsws looks like the following:

% tdmsws -h
usage: tdmsws [-h] [-mc] [-l {CRITICAL,ERROR,WARNING,INFO,DEBUG}]

tdmsws is an FDSN Dataselect implementation to read TDMS files

optional arguments:
  -h, --help            show this help message and exit
  -mc, --minimalconfig  Generate a minimal configuration file.
  -l {CRITICAL,ERROR,WARNING,INFO,DEBUG}, --log {CRITICAL,ERROR,WARNING,INFO,DEBUG}
                        Increase the verbosity level.

The “-mc” switch generates a config file, which should be placed in the same folder as the TDMS files. The file includes all options and configuration variables that the software needs to read before it can serve data. The user is expected to edit this file and provide the basic information about the DAS experiment before running the service.

A typical config file is shown below.

[General]
experiment = default
loglevel = INFO

[NSLC]
network = XX
location =
channel = FSF

The “experiment” variable refers to the first part of the filenames in the folder. For instance, in the listing below all files start with “default”, followed by the timezone (e.g. UTC) and a timestamp.

$ ls -l
total 1577352
-rwxrwxrwx  1 user  staff   49965056 May  8 09:38 default_UTC_20190508_093735.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:38 default_UTC_20190508_093805.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:39 default_UTC_20190508_093835.409.tdms

The variables “network”, “location” and “channel” are fixed and define the N.S.L.C. code. Only the station varies: it is always a number referring to the stream number within the experiment. With the configuration above, the valid codes would be “XX.00001..FSF”, “XX.00002..FSF”, …, and so on up to the number of available streams (e.g. “XX.00123..FSF”).
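
A small sketch (illustrative only, not part of dastools) of how these codes are built, with network, location and channel taken from the config file and the station being the zero-padded stream number:

# Values from the [NSLC] section of tdmsws.cfg shown above.
network, location, channel = 'XX', '', 'FSF'

def seed_id(stream: int) -> str:
    # e.g. stream 1 -> 'XX.00001..FSF'
    return f'{network}.{stream:05d}.{location}.{channel}'

print(seed_id(1))    # XX.00001..FSF
print(seed_id(123))  # XX.00123..FSF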

Running the service

To run the service you should “cd” into the folder with the TDMS files and make sure that there is a file called “tdmsws.cfg” with its variables properly configured. Then you can simply call the program, which will start and run as a daemon. The service will listen for requests on port 7000.
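
Once the service is running it can be queried like any FDSN Dataselect endpoint. The sketch below saves the returned data to a file; note that the '/fdsnws/dataselect/1/query' path follows the FDSN standard and is an assumption here, since only the port (7000) and the request parameters are documented.

from urllib.parse import urlencode
from urllib.request import urlopen

# Standard Dataselect parameters (see the methods described below);
# the time window matches the example data listed earlier.
params = urlencode({
    'net': 'XX', 'sta': '00001', 'cha': 'FSF',
    'start': '2019-05-08T09:37:36', 'end': '2019-05-08T09:37:40',
})
url = 'http://localhost:7000/fdsnws/dataselect/1/query?' + params

with urlopen(url) as resp, open('data.mseed', 'wb') as fout:
    fout.write(resp.read())  # MiniSEED payload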

Web service methods

  • query: The six required parameters “net”, “sta”, “loc”, “cha”, “start”, and “end” are supported, including their aliases. Errors are returned as specified in the standard.

  • version: returns the version number in text/plain format

  • application.wadl: returns details about implemented and supported options and parameters

  • queryauth: NOT implemented yet!

Acknowledgments

This work was done as part of the EOSC-Pillar project, which has received funding from the European Union’s Horizon 2020 research and innovation program under Grant Agreement Number 857650, as well as the RISE project, also supported by the European Union’s Horizon 2020 research and innovation program under Grant Agreement Number 821115.
