Client library for interacting with the QuakeSaver sensor fleet.

Project description

QuakeSaver Client

This is the client for the QuakeSaver Sensor services.

You can find the documentation here.

Getting Started

Setting up the client

EMAIL and PASSWORD correspond to the credentials you use to log in at https://network.quakesaver.net.

from quakesaver_client import QSClient

EMAIL = "user@yourorganisation.net"
PASSWORD = "!verystrongpassword1"

client = QSClient(email=EMAIL, password=PASSWORD)
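Hardcoding credentials in a script makes them easy to leak; a minimal sketch of reading them from environment variables instead (the `QS_EMAIL`/`QS_PASSWORD` variable names are our own convention, not part of the library):

```python
import os


def load_credentials() -> tuple[str, str]:
    """Read QuakeSaver login credentials from the environment."""
    email = os.environ["QS_EMAIL"]
    password = os.environ["QS_PASSWORD"]
    return email, password
```

The client can then be created with `QSClient(email=email, password=password)` without committing secrets to version control.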

Full example script

Authenticate against the QuakeSaver server and download raw as well as processed data.

Please note that, for security reasons, each login session is only valid for 15 minutes. The client is therefore designed for repeated short queries rather than long-lived connections.
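Because each session lapses after 15 minutes, a long-running workflow can track when the client was created and log in again once that window has passed. A minimal sketch (the `session_expired` helper is our own, not part of `quakesaver_client`):

```python
from datetime import datetime, timedelta
from typing import Optional

SESSION_TTL = timedelta(minutes=15)


def session_expired(created_at: datetime, now: Optional[datetime] = None) -> bool:
    """Return True once a login session created at `created_at` has lapsed."""
    now = now if now is not None else datetime.utcnow()
    return now - created_at >= SESSION_TTL
```

When `session_expired(...)` returns `True`, simply construct a fresh `QSClient(email=EMAIL, password=PASSWORD)` before issuing the next query.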

"""Example script for quakesaver_client usage."""

import sys
from datetime import datetime, timedelta
from pprint import pp

import obspy
from obspy import Stream

from quakesaver_client import QSClient
from quakesaver_client.models.data_product_query import DataProductQuery
from quakesaver_client.models.measurement import MeasurementQuery

EMAIL = "user@yourorganisation.net"
PASSWORD = "!verystrongpassword1"
DATA_PATH = "./data"

client = QSClient(email=EMAIL, password=PASSWORD)

# Get a list of all available sensor IDs:
sensor_ids = client.get_sensor_ids()
pp(sensor_ids)

if len(sensor_ids) == 0:
    print("No sensors available")
    sys.exit()

# For demonstration, we use the first sensor in the list
sensor_uid_to_get = sensor_ids[0]

# Get the sensor from the client
sensor = client.get_sensor(sensor_uid_to_get)
pp(sensor.dict())

# Queries such as waveforms, station metadata, and measurements (data products
# calculated on the sensor) require a time window. Here we use the last
# 5 hours of data.
end_time = datetime.utcnow()
start_time = end_time - timedelta(hours=5)

# Query various measurements. In this case we calculate a rolling `mean` over
# 10-minute time windows.
# Other `aggregators` are:
#  * None (default)
#  * max
#  * min
query = MeasurementQuery(
    start_time=start_time,
    end_time=end_time,
    interval=timedelta(minutes=10),
    aggregator="mean",
)
result = sensor.get_jma_intensity(query)
print(result)
result = sensor.get_peak_ground_acceleration(query)
print(result)
result = sensor.get_spectral_intensity(query)
print(result)
result = sensor.get_rms_offset(query)
print(result)

# Query various data products. You can only get 100 results at once, which is
# why there are `limit` and `skip` values. You can get data products from a
# specific time frame by specifying start and end times.
end_time = datetime.utcnow()
start_time = end_time - timedelta(hours=5)
query = DataProductQuery(
    start_time=start_time,
    end_time=end_time,
    limit=100,
    skip=0,
)
result = sensor.get_event_records(query)
print(result)
result = sensor.get_hv_spectra(query)
print(result)
result = sensor.get_noise_autocorrelations(query)
print(result)

# Download station metadata as StationXML and store it in a local directory.
file_path = sensor.get_stationxml(
    starttime=start_time,
    endtime=end_time,
    level="response",
    location_to_store=DATA_PATH,
)
with open(file_path, "r") as file:
    print(file.read())

# Download raw full waveforms from the sensor. Note that you can only query what
# is in the sensor's ringbuffer (usually the last ~48 hours).
file_path = sensor.get_waveform_data(
    start_time=start_time, end_time=end_time, location_to_store=DATA_PATH
)

# Read the file into obspy for further processing...
stream: Stream = obspy.read(file_path)
for trace in stream.traces:
    print(trace.stats)
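Since data-product queries return at most 100 results per call, collecting everything in a window means advancing `skip` until a short page comes back. A generic sketch (the `fetch_all` helper and the list-like page shape are our assumptions, not part of the client API):

```python
def fetch_all(fetch_page, limit=100):
    """Accumulate pages from `fetch_page(limit=..., skip=...)` until a page
    shorter than `limit` signals that no further results remain."""
    results = []
    skip = 0
    while True:
        page = fetch_page(limit=limit, skip=skip)
        results.extend(page)
        if len(page) < limit:
            return results
        skip += limit
```

With a wrapper such as `lambda limit, skip: sensor.get_event_records(DataProductQuery(start_time=start_time, end_time=end_time, limit=limit, skip=skip))`, this would page through all event records in the window.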

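To illustrate what `interval` and `aggregator` in `MeasurementQuery` mean, here is a local re-implementation of bucketed aggregation over timestamped samples (a sketch of the semantics only; the server-side computation may differ):

```python
from datetime import datetime, timedelta


def aggregate(samples, start, interval, aggregator=None):
    """Group (timestamp, value) samples into `interval`-wide buckets counted
    from `start`, then reduce each bucket with `aggregator`
    ('mean', 'min', 'max', or None)."""
    reducers = {
        "mean": lambda values: sum(values) / len(values),
        "min": min,
        "max": max,
        None: list,  # no aggregation: keep the raw values per bucket
    }
    buckets = {}
    for timestamp, value in samples:
        index = (timestamp - start) // interval
        buckets.setdefault(index, []).append(value)
    return {start + index * interval: reducers[aggregator](values)
            for index, values in sorted(buckets.items())}
```

For example, samples at minutes 1 and 2 fall into the first 10-minute bucket and a sample at minute 12 into the second, mirroring how the server returns one aggregated value per interval.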
Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

quakesaver_client-1.1.1.tar.gz (21.2 kB)

Uploaded Source

Built Distribution

quakesaver_client-1.1.1-py3-none-any.whl (23.2 kB)

Uploaded Python 3

File details

Details for the file quakesaver_client-1.1.1.tar.gz.

File metadata

  • Download URL: quakesaver_client-1.1.1.tar.gz
  • Size: 21.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.2 CPython/3.11.3 Linux/5.15.0-1035-azure

File hashes

Hashes for quakesaver_client-1.1.1.tar.gz
Algorithm Hash digest
SHA256 d699b026838b590917d05b17abf748ab37720be38ef6ddd7afefaa3026340e60
MD5 9966fa2143265a35bfe735f110245df7
BLAKE2b-256 fd864549be8997c848bc897aae0c27e0af973306ae48afac822a0b56cd07c8e4

See more details on using hashes here.

File details

Details for the file quakesaver_client-1.1.1-py3-none-any.whl.

File metadata

  • Download URL: quakesaver_client-1.1.1-py3-none-any.whl
  • Size: 23.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.2 CPython/3.11.3 Linux/5.15.0-1035-azure

File hashes

Hashes for quakesaver_client-1.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 a287c525ab8fb8eb4d8815da699a8ab1e199fffff7741e30e6f2865ba466162c
MD5 757406df75323b8795d58c7cdb986caf
BLAKE2b-256 6ab9ef4bb19454e26652278df85a2a1359bef3b66d1341dedc40a6d0ecf6e6c2
