dastools: Tools to work with data generated by DAS systems
Overview
This package provides a set of tools to read, manipulate and convert seismic waveforms generated by DAS systems. In particular, those generated by Silixa (TDMS format) and OptoDAS/Alcatel (HDF5).
dasconv
This utility lets you convert and manipulate seismic waveforms in TDMS format and export them into MiniSEED.
Data acquired from experiments with DAS systems are usually stored in one folder. Files within this folder have names indicating the experiment and the start time of the waveforms saved. An example of the files generated in a test experiment is shown below.
```
$ ls -l
total 1577352
-rwxrwxrwx 1 user staff 49965056 May 8 09:38 default_UTC_20190508_093735.409.tdms
-rwxrwxrwx 1 user staff 49965056 May 8 09:38 default_UTC_20190508_093805.409.tdms
-rwxrwxrwx 1 user staff 49965056 May 8 09:39 default_UTC_20190508_093835.409.tdms
-rwxrwxrwx 1 user staff 49965056 May 8 09:39 default_UTC_20190508_093905.409.tdms
-rwxrwxrwx 1 user staff 49965056 May 8 09:40 default_UTC_20190508_093935.409.tdms
-rwxrwxrwx 1 user staff 49965056 May 8 09:40 default_UTC_20190508_094005.409.tdms
-rwxrwxrwx 1 user staff 49965056 May 8 09:41 default_UTC_20190508_094035.409.tdms
-rwxrwxrwx 1 user staff 49965056 May 8 09:41 default_UTC_20190508_094105.409.tdms
-rwxrwxrwx 1 user staff 49965056 May 8 09:42 default_UTC_20190508_094135.409.tdms
```
Here, `default` is the name of the experiment and the rest is the start time of the file, following the pattern `experiment_TZ_YYYYMMDD_HHmmss.fff.tdms`.
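The start time can be recovered from such a file name with standard-library tools alone. The helper below is an illustrative sketch (it is not part of dastools) and assumes the experiment name itself contains no underscore:

```python
from datetime import datetime, timezone

def parse_das_filename(name: str):
    """Split a DAS file name like 'default_UTC_20190508_093735.409.tdms'
    into the experiment name and the start time of the waveforms."""
    stem = name[:-len(".tdms")]                   # drop the extension
    experiment, tz, date, time = stem.split("_")  # 'default', 'UTC', '20190508', '093735.409'
    start = datetime.strptime(date + time, "%Y%m%d%H%M%S.%f")
    if tz == "UTC":
        start = start.replace(tzinfo=timezone.utc)
    return experiment, start

exp, start = parse_das_filename("default_UTC_20190508_093735.409.tdms")
# exp == 'default'; start == 2019-05-08 09:37:35.409000+00:00
```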
dasconv also provides a TDMS class, which needs one mandatory parameter to be instantiated: filename. This is actually the experiment name, i.e. the prefix shared by all file names in the containing folder. A detailed explanation of how to use it in your own programs can be found in the documentation.
A typical help message from dasconv looks like the following:
```
usage: dasconv [-h] [-l {CRITICAL,ERROR,WARNING,INFO,DEBUG}] [--logout LOGOUT]
               [-d DIRECTORY] [--start START] [--end END] [--chstart CHSTART]
               [--chstop CHSTOP] [--chstep CHSTEP] [--decimate {1,5}]
               [-N NETWORK] [-C CHANNEL] [-f {OptoDAS,TDMS}]
               [-p {1,2,4,8,16,32}] [-o {SDS,StreamBased,StreamBasedHour}]
               [-V]
               filename

Read, manipulate and convert seismic waveforms generated by a DAS system.

positional arguments:
  filename              Experiment to read and process. It is usually the
                        first part of the filenames.

options:
  -h, --help            show this help message and exit
  -l {CRITICAL,ERROR,WARNING,INFO,DEBUG}, --loglevel {CRITICAL,ERROR,WARNING,INFO,DEBUG}
                        Verbosity in the output (default: INFO)
  --logout LOGOUT       Name of the log file (default: output.log)
  -d DIRECTORY, --directory DIRECTORY
                        Directory where files are located (default: ".")
  --start START, --starttime START
                        Start of the selected time window.
                        Format: 2019-02-01T00:01:02.123456Z
  --end END, --endtime END
                        End of the selected time window.
                        Format: 2019-02-01T00:01:02.123456Z
  --chstart CHSTART     First channel to export (default: 0)
  --chstop CHSTOP       Last channel to export (default: last channel available)
  --chstep CHSTEP       Step between channels in the selection (default: 1)
  --decimate {1,5}      Factor by which the sampling rate is lowered by
                        decimation (default: 1)
  -N NETWORK, --network NETWORK
                        Network code to store in the miniseed header
                        (default: "XX")
  -C CHANNEL, --channel CHANNEL
                        Channel code to store in the miniseed header
                        (default: "HSF")
  -f {OptoDAS,TDMS}, --inputfmt {OptoDAS,TDMS}
                        Format of the input files (default: auto detect)
  -p {1,2,4,8,16,32}, --processes {1,2,4,8,16,32}
                        Number of threads to spawn when parallelizing the
                        conversion (default: 1)
  -o {SDS,StreamBased,StreamBasedHour}, --outstruct {SDS,StreamBased,StreamBasedHour}
                        Structure to be used when saving the converted data.
                        SDS: SeisComP Data Structure; StreamBased: one file
                        per stream; StreamBasedHour: one file per stream per
                        hour. Available options are [SDS, StreamBased,
                        StreamBasedHour] (default: StreamBased)
  -V, --version         show program's version number and exit
```
Examples
Export waveforms from channels 800, 802 and 804, starting at 2019-05-08T09:37:35.409000 until 2019-05-08T09:38:05.400000. The waveforms will be exported to MiniSEED format.
```
dasconv -d /home/user/test/ --start "2019-05-08T09:37:35.409000" --end "2019-05-08T09:38:05.400000" --chstart 800 --chstop 805 --chstep 2 default
```
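The channel selection in the example above can be reproduced with a Python range. Note that treating `chstop` as inclusive is an inference from the help text ("Last channel to export") rather than documented behaviour:

```python
chstart, chstop, chstep = 800, 805, 2

# Taking chstop as inclusive, as "Last channel to export" suggests
channels = list(range(chstart, chstop + 1, chstep))
print(channels)  # [800, 802, 804]
```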
Export waveforms from channels 0 and 1, from the beginning of the measurements until 2019-05-08T09:32:15. The waveforms will be exported to MiniSEED format.
```
dasconv -d /home/user/test/ --endtime "2019-05-08T09:32:15" --chstart 0 --chstop 1 default
```
Export waveforms from channels 0 to 4 from the beginning of the measurements until 2019-05-08T09:32:15. The waveforms will be exported to MiniSEED format after being decimated by a factor of 5 (e.g. from 1000Hz to 200Hz).
```
dasconv -d /home/user/test/ --endtime "2019-05-08T09:32:15" --chstart 0 --chstop 4 --decimate 5 default
```
dasws (experimental)
dasws is a stand-alone implementation of the FDSN Dataselect web service, able to serve MiniSEED data extracted from a folder with DAS files.
A typical help message from dasws looks like the following:
```
% dasws -h
usage: dasws [-h] [-mc] [-l {DEBUG,WARNING,INFO,DEBUG}]

dasws is an FDSN Dataselect implementation to read DAS files

optional arguments:
  -h, --help            show this help message and exit
  -mc, --minimalconfig  Generate a minimal configuration file.
  -l {DEBUG,WARNING,INFO,DEBUG}, --log {DEBUG,WARNING,INFO,DEBUG}
                        Increase the verbosity level.
```
The “-mc” switch creates a config file, which should be placed in the same folder as the DAS files. The file includes all the options and configuration variables that the software reads before it can serve the data. The user is expected to edit this file and provide the basic information about the DAS experiment before running the service.
A typical config file is shown below.
```
[General]
experiment = default
loglevel = INFO

[NSLC]
network = XX
location =
channel = HSF
```
The “experiment” variable refers to the first part of the filenames in the folder, i.e. their common prefix. For instance, in the example below all files start with “default”, followed by a timestamp including the timezone (or “UTC”).
```
$ ls -l
total 1577352
-rwxrwxrwx 1 user staff 49965056 May 8 09:38 default_UTC_20190508_093735.409.tdms
-rwxrwxrwx 1 user staff 49965056 May 8 09:38 default_UTC_20190508_093805.409.tdms
-rwxrwxrwx 1 user staff 49965056 May 8 09:39 default_UTC_20190508_093835.409.tdms
```
The variables “network”, “location” and “channel” are fixed and, together with the station, define the N.S.L.C code. Only the station varies: it is always a number referring to the stream number within the experiment. In the example above, the valid codes would be “XX.00001..HSF”, “XX.00002..HSF”, …, “XX.00123..HSF”, up to the number of available streams.
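The mapping from stream number to N.S.L.C code can be sketched as follows; the five-digit zero-padding is inferred from the codes above:

```python
def nslc(stream: int, network: str = "XX", location: str = "", channel: str = "HSF") -> str:
    """Build the N.S.L.C code for a given stream number; the station
    is the stream number zero-padded to five digits."""
    station = f"{stream:05d}"
    return ".".join((network, station, location, channel))

print(nslc(1))    # XX.00001..HSF
print(nslc(123))  # XX.00123..HSF
```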
Running the service
To run the service, “cd” into the folder with the DAS files and make sure that there is a file called “dasws.cfg” with its variables properly configured. Then simply call the program, which will start and run as a daemon. The service listens for all requests on port 7000.
Web service methods
query: The six required parameters “net”, “sta”, “loc”, “cha”, “start”, and “end” are supported including their aliases. Errors are returned as specified in the standard.
version: returns the version number in text/plain format
application.wadl: returns details about implemented and supported options and parameters
queryauth: NOT implemented yet!
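A request against the query method can be assembled with the Python standard library. The host and port follow the defaults described above (localhost:7000); the path /fdsnws/dataselect/1/query is the standard FDSN Dataselect endpoint and is assumed here, not taken from the dasws documentation:

```python
from urllib.parse import urlencode

# The six parameters of the Dataselect "query" method
params = {
    "net": "XX",
    "sta": "00001",
    "loc": "--",       # FDSN convention for an empty location code
    "cha": "HSF",
    "start": "2019-05-08T09:37:36",
    "end": "2019-05-08T09:38:00",
}
url = "http://localhost:7000/fdsnws/dataselect/1/query?" + urlencode(params)
print(url)
```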
Acknowledgments
This work was done as part of the EOSC-Pillar project, which has received funding from the European Union’s Horizon 2020 research and innovation program under Grant Agreement Number 857650, as well as the RISE project, also supported by the European Union’s Horizon 2020 research and innovation program under Grant Agreement Number 821115.