
Tools for documenting ocean bottom seismometer experiments and creating metadata

Project description

A system for creating FDSN-standard data and metadata for ocean bottom seismometers using standardized, easy-to-read information files.

Current goal

To release a first version (v1.x) of the information-file schema. We would like input from seismologists and ocean bottom seismometer manufacturers/operators about what information/capabilities are missing. Existing questions can be found and modified in QUESTIONS_infofiles.rst.

Information files

The system is based on “information files” in JSON or YAML format, which can be used to create StationXML files and to record data-preparation steps. The files duplicate the StationXML format where possible, deviating where necessary to reduce redundancy and to add functionality (see “information files”).

There are two main file types:

Name                     Description                              Filled by                  When filled
network                  Deployed stations, their instruments     OBS operator               after a campaign
                         and parameters
instrumentation or       Instrument and component descriptions    OBS operator and/or        new/changed instruments,
instrument_components                                             component manufacturers    components, or calibrations

Each of these files can have subfiles referenced using the JSONref protocol. This allows one, for example, to put responses and filters in separate files and avoid repetition.
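As a purely illustrative sketch (the field names here are invented, not the actual obsinfo schema), a YAML instrument fragment could pull in a shared response subfile like this:

    datalogger:
        # Hypothetical fields, for illustration only.
        model: "GENERIC-DL"
        sample_rate: 62.5
        response:
            # JSONref: reuse a response defined once in a subfile,
            # instead of repeating it in every instrument description.
            $ref: "responses/generic_62.5sps.response.yaml"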

In principle (not yet implemented), the instrument_components files could be replaced by RESP files or by references to the NRL (Nominal Response Library), but obsinfo provides a simpler and more standards-compliant way to specify the components, and it can automatically calculate response sensitivities based on gains and filter characteristics. obsinfo should also be able to generate RESP files and NRL directories from instrumentation files (not yet implemented).
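The idea behind the sensitivity calculation is simple: the overall sensitivity of a channel is the product of its stage gains, referred to a common reference frequency. A minimal sketch of the arithmetic (illustrative only, not obsinfo's actual code):

    # Illustrative sketch, not obsinfo's implementation: overall channel
    # sensitivity is the product of the individual stage gains.
    def total_sensitivity(stage_gains):
        total = 1.0
        for gain in stage_gains:
            total *= gain
        return total

    # Example: geophone (V/(m/s)) x preamp (V/V) x digitizer (counts/V)
    print(total_sensitivity([1500.0, 16.0, 1.67e6]))  # counts/(m/s)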

A third type of information file is the campaign file, which allows the chief scientist to specify all of the stations and OBS operators used in a given experiment, as well as the periods of data they would like to see in order to validate the data preparation. For the moment, obsinfo does not do anything with these files beyond validating them.
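As with the sketch above, the following campaign fragment is purely illustrative (field names invented, not the actual schema):

    # Hypothetical campaign fragment -- see _examples/Information_Files
    # for real, schema-valid files.
    campaign:
        reference_name: "MYCRUISE"
        OBS_facilities:
            facility_A:
                stations: ["STA01", "STA02"]
        validation_data_periods:
            - station: "STA01"
              start: "2019-01-15T00:00:00Z"
              end: "2019-01-16T00:00:00Z"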

Python code

The package name is obsinfo.

obsinfo.network, obsinfo.instrumentation and obsinfo.instrument_components contain code to process the corresponding information files. obsinfo.misc contains code common to the above modules.

obsinfo.addons contains modules specific to proprietary systems:

  • obsinfo.addons.LCHEAPO creates scripts to convert LCHEAPO OBS data to miniSEED using the lc2ms software

  • obsinfo.addons.SDPCHAIN creates scripts to convert basic miniSEED data to OBS-aware miniSEED using the SDPCHAIN software suite

  • obsinfo.addons.OCA creates JSON metadata in the format used by the Observatoire de la Côte d’Azur to create StationXML

Executables

The following command-line executables perform useful tasks:

  • obsinfo-validate: validates an information file against its schema

  • obsinfo-print: prints a summary of an information file

  • obsinfo-makeSTATIONXML: generates StationXML files from network and instrumentation information files (example below)
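For illustration, invocations might look like the following (the file name is hypothetical; the bash scripts in _examples/scripts show real, working calls):

    # File name is hypothetical -- see _examples/scripts for real usage.
    obsinfo-validate my_network.network.yaml        # check against the schema
    obsinfo-print my_network.network.yaml           # print a summary
    obsinfo-makeSTATIONXML my_network.network.yaml  # write StationXML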

The following command-line executables make scripts to run specific data conversion software:

  • obsinfo-make_LCHEAPO_scripts: Makes scripts to convert LCHEAPO data to miniSEED

  • obsinfo-make_SDPCHAIN_scripts: Makes scripts to drift-correct miniSEED data and package them for FDSN-compatible data centers

Other subdirectories

obsinfo/data/

Contains information used by the program:

data/schema contains JSON Schema for each file type.
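For a flavor of what these schemas contain, here is a generic JSON Schema fragment (illustrative only; the actual obsinfo schemas live in data/schema):

    {
      "$schema": "http://json-schema.org/draft-07/schema#",
      "title": "network information file (illustrative sketch, not the real schema)",
      "type": "object",
      "required": ["network"],
      "properties": {
        "network": { "type": "object" }
      }
    }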

obsinfo/_examples/

Contains example information files and scripts:

  • _examples/Information_Files contains a complete set of information files

    • .../campaigns contains network and campaign files

    • .../instrumentation contains instrumentation, instrument_components, response and filter files.

  • _examples/scripts contains bash scripts to look at and manipulate these files using the executables. Running these scripts is a good way to make sure your installation works; looking at the files they operate on is a good way to start making your own information files.

Comments

We use standard MAJOR.MINOR.MAINTENANCE version numbering but, while the system is in prerelease:

  • MAJOR==0

  • MINOR increments every time the information file structure changes in a non-backwards-compatible way

  • MAINTENANCE increments when the code changes or the file structure changes in a backwards-compatible way

More information

information files

TO DO

Use reStructuredText to modify this file.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

obsinfo-0.110.tar.gz (126.8 kB)

Uploaded Source

Built Distribution

obsinfo-0.110-py3-none-any.whl (204.9 kB)

Uploaded Python 3

File details

Details for the file obsinfo-0.110.tar.gz.

File metadata

  • Download URL: obsinfo-0.110.tar.gz
  • Size: 126.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.5.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.19.5 CPython/3.6.9

File hashes

Hashes for obsinfo-0.110.tar.gz:

  • SHA256: e4bfde90854408b1cf9cb164d5d23ca8640a55a89ad8fb3f6bf3141312a88204
  • MD5: 971e0258c6fd62edddffeb2ca3ea453e
  • BLAKE2b-256: f354c05c53a2adc3d7503391473df097aadeed8630cba5da8ca65785bc2ec5f6


File details

Details for the file obsinfo-0.110-py3-none-any.whl.

File metadata

  • Download URL: obsinfo-0.110-py3-none-any.whl
  • Size: 204.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.5.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.19.5 CPython/3.6.9

File hashes

Hashes for obsinfo-0.110-py3-none-any.whl:

  • SHA256: 96165d9d0766a6fe8ada119738b97f875d3bb59834baa97aa09663dc9cb492f9
  • MD5: 8b2fd764c32518457d8ca44a33d41fba
  • BLAKE2b-256: a62fa08bef6be4608c157c377ab5a4f867dbfbcaaef585b8a906a03b217c7d95

