
dastools: Tools to work with data generated by DAS systems


Overview

This package provides a set of tools to read, manipulate and convert seismic waveforms generated by DAS systems, in particular those generated by Silixa (TDMS format) and OptoDAS/Alcatel (HDF5 format).

dasmetadata

dasmetadata extracts the metadata contained in DAS files.

dasws

dasws is a stand-alone implementation of the FDSN Dataselect web service, which is able to serve miniSEED data extracted from a folder with DAS files.

A typical help message from dasws looks like the following:

% dasws -h
usage: dasws [-h] [-mc] [-l {CRITICAL,ERROR,WARNING,INFO,DEBUG}]

dasws is an FDSN Dataselect implementation to read DAS files

optional arguments:
  -h, --help            show this help message and exit
  -mc, --minimalconfig  Generate a minimal configuration file.
  -l {CRITICAL,ERROR,WARNING,INFO,DEBUG}, --log {CRITICAL,ERROR,WARNING,INFO,DEBUG}
                        Set the verbosity level.

The -mc switch creates a configuration file, which should be placed in the same folder as the DAS files. The file includes all options and configuration variables the software needs to read before it can serve the data. The user is expected to edit this file and provide the basic information about the DAS experiment before running the service.
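
A minimal sketch of bootstrapping the configuration could look like the following (the folder name is a placeholder, and it is assumed here that the generated file ends up as dasws.cfg in the current directory):

$ cd /path/to/das/files
$ dasws -mc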

A typical configuration file is shown below.

[General]
experiment = default
loglevel = INFO

[NSLC]
network = XX
location =
channel = HSF

The “experiment” variable refers to the first part of the filenames in the folder. For instance, with the configuration shown above, all files will start with “default”, followed by a timestamp including the time zone (or UTC), as in the listing below.

$ ls -l
total 1577352
-rwxrwxrwx  1 user  staff   49965056 May  8 09:38 default_UTC_20190508_093735.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:38 default_UTC_20190508_093805.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:39 default_UTC_20190508_093835.409.tdms

The variables “network”, “location” and “channel” are fixed and define the N.S.L.C. code. Only the station varies, and it is always a number referring to the stream number within the experiment. In the example above, the valid codes would be “XX.00001..HSF”, “XX.00002..HSF”, …, up to the number of available streams (e.g. “XX.00123..HSF”).

Running the service

To run the service, “cd” into the folder with the DAS files and make sure that there is a file called “dasws.cfg” with its variables properly configured. Then, simply call the program, which will start and run as a daemon. The service will listen to all requests on port 7000.
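
For example (the folder name is a placeholder):

$ cd /path/to/das/files
$ dasws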

Web service methods

  • query: The six required parameters “net”, “sta”, “loc”, “cha”, “start”, and “end” are supported, including their aliases. Errors are returned as specified in the standard. An example request is shown after this list.

  • version: Returns the version number in text/plain format.

  • application.wadl: Returns details about the implemented and supported options and parameters.

  • queryauth: Not implemented yet.
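
As a sketch, a query request could look like the following, assuming dasws exposes the path defined by the FDSN specification (/fdsnws/dataselect/1/query) on port 7000; the N.S.L.C. code and time window are taken from the examples above:

$ curl -o data.mseed "http://localhost:7000/fdsnws/dataselect/1/query?net=XX&sta=00001&cha=HSF&start=2019-05-08T09:37:36&end=2019-05-08T09:38:05"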

dasconv

This utility lets you convert and manipulate seismic waveforms in TDMS or OptoDAS format and export them to miniSEED.

Data acquired in experiments with DAS systems are usually stored in one folder. Files within this folder have names indicating the experiment and the start time of the saved waveforms. An example of the files generated in a test experiment is shown below.

$ ls -l
total 1577352
-rwxrwxrwx  1 user  staff   49965056 May  8 09:38 default_UTC_20190508_093735.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:38 default_UTC_20190508_093805.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:39 default_UTC_20190508_093835.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:39 default_UTC_20190508_093905.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:40 default_UTC_20190508_093935.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:40 default_UTC_20190508_094005.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:41 default_UTC_20190508_094035.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:41 default_UTC_20190508_094105.409.tdms
-rwxrwxrwx  1 user  staff   49965056 May  8 09:42 default_UTC_20190508_094135.409.tdms

There, “default” is the name of the experiment and the rest is the start time, following the pattern experiment_TZ_YYYYMMDD_HHmmss.fff.tdms.
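
For instance, to list all files belonging to the “default” experiment one could run:

$ ls default_*.tdms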

dasconv also provides a TDMS class, which needs one mandatory parameter, filename, to be instantiated. This is actually the experiment name, i.e. the string with which all file names in the containing folder start. A detailed explanation of how to use it in your own programs can be found in the documentation.

A typical help message from dasconv looks like the following:

usage: dasconv [-h] [-l {CRITICAL,ERROR,WARNING,INFO,DEBUG}] [--logout LOGOUT] [-d DIRECTORY] [--start START] [--end END] [--chstart CHSTART] [--chstop CHSTOP] [--chstep CHSTEP]
               [--decimate {1,5}] [-N NETWORK] [-C CHANNEL] [-f {OptoDAS,TDMS}] [-p {1,2,4,8,16,32}] [-o {SDS,StreamBased,StreamBasedHour}] [-V]
               filename

Read, manipulate and convert seismic waveforms generated by a DAS system.

positional arguments:
  filename              Experiment to read and process. It is usually the first part of the filenames.

options:
  -h, --help            show this help message and exit
  -l {CRITICAL,ERROR,WARNING,INFO,DEBUG}, --loglevel {CRITICAL,ERROR,WARNING,INFO,DEBUG}
                        Verbosity in the output (default: INFO)
  --logout LOGOUT       Name of the log file (default: output.log)
  -d DIRECTORY, --directory DIRECTORY
                        Directory where files are located (default: ".")
  --start START, --starttime START
                        Start of the selected time window. Format: 2019-02-01T00:01:02.123456Z
  --end END, --endtime END
                        End of the selected time window. Format: 2019-02-01T00:01:02.123456Z
  --chstart CHSTART     First channel to export (default: 0)
  --chstop CHSTOP       Last channel to export (default: last channel available)
  --chstep CHSTEP       Step between channels in the selection (default: 1)
  --decimate {1,5}      Factor by which the sampling rate is lowered by decimation (default: 1)
  -N NETWORK, --network NETWORK
                        Network code to store in the miniseed header (default: "XX")
  -C CHANNEL, --channel CHANNEL
                        Channel code to store in the miniseed header (default: "HSF")
  -f {OptoDAS,TDMS}, --inputfmt {OptoDAS,TDMS}
                        Format of the input files (default: auto detect)
  -p {1,2,4,8,16,32}, --processes {1,2,4,8,16,32}
                        Number of threads to spawn when parallelizing the conversion (default: 1)
  -o {SDS,StreamBased,StreamBasedHour}, --outstruct {SDS,StreamBased,StreamBasedHour}
                        Structure to be used when saving the converted data. SDS: SeisComP Data Structure; StreamBased: one file per stream; StreamBasedHour: one file per stream per hour.
                        Available options are [SDS, StreamBased, StreamBasedHour] (default: StreamBased)
  -V, --version         show program's version number and exit

Examples

Export waveforms from channels 800, 802 and 804, starting at 2019-05-08T09:37:35.409000 until 2019-05-08T09:38:05.400000. The waveforms will be exported to miniSEED format.

dasconv -d /home/user/test/ --start "2019-05-08T09:37:35.409000" --end "2019-05-08T09:38:05.400000" --chstart 800 --chstop 805 --chstep 2 default

Export waveforms from channels 0 and 1 from the beginning of the measurements until 2019-05-08T09:32:15. The waveforms will be exported to miniSEED format.

dasconv -d /home/user/test/ --endtime "2019-05-08T09:32:15" --chstart 0 --chstop 1 default

Export waveforms from channels 0 to 4 from the beginning of the measurements until 2019-05-08T09:32:15. The waveforms will be exported to miniSEED format after being decimated by a factor of 5 (e.g. from 1000 Hz to 200 Hz).

dasconv -d /home/user/test/ --endtime "2019-05-08T09:32:15" --chstart 0 --chstop 4 --decimate 5 default
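
These flags can be combined. As a purely illustrative sketch, a longer conversion could run on several processes in parallel and store the result in an SDS structure:

dasconv -d /home/user/test/ --decimate 5 -p 8 -o SDS default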

Acknowledgments

This work was done as part of the EOSC-Pillar project, which has received funding from the European Union’s Horizon 2020 research and innovation program under Grant Agreement Number 857650, as well as the RISE project, also supported by the European Union’s Horizon 2020 research and innovation program under Grant Agreement Number 821115.
