Client library for interacting with the QuakeSaver sensor fleet.

QuakeSaver Client

This is the client for the QuakeSaver sensor services.

You can find the documentation here.

Getting Started

Setting up the client

EMAIL and PASSWORD correspond to the credentials you use to log in at https://network.quakesaver.net.

from quakesaver_client import QSCloudClient

EMAIL = "user@yourorganisation.net"
PASSWORD = "!verystrongpassword1"

client = QSCloudClient(email=EMAIL, password=PASSWORD)
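
Rather than hardcoding credentials in a script, you may prefer to read them from environment variables. The `QS_EMAIL` and `QS_PASSWORD` names below are just an example convention, not something the library defines:

```python
import os

# Example environment variable names -- not defined by the library.
EMAIL = os.environ.get("QS_EMAIL", "user@yourorganisation.net")
PASSWORD = os.environ.get("QS_PASSWORD", "!verystrongpassword1")
print(EMAIL)
```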

Streaming data from the cloud

Authenticate against the QuakeSaver server and download raw as well as processed data.

Please note that, for security reasons, each login session is only valid for 15 minutes. The client is therefore designed for repeated queries rather than long-lived connections.

"""Example script for quakesaver_client usage."""

import sys
from datetime import datetime, timedelta
from pprint import pp

import obspy
from obspy import Stream

from quakesaver_client import QSCloudClient
from quakesaver_client.models.data_product_query import DataProductQuery
from quakesaver_client.models.measurement import MeasurementQuery

EMAIL = "user@yourorganisation.net"
PASSWORD = "!verystrongpassword1"
DATA_PATH = "./data"

client = QSCloudClient(email=EMAIL, password=PASSWORD)

# Get a list of all available sensor IDs:
sensor_ids = client.get_sensor_ids()
pp(sensor_ids)

if len(sensor_ids) == 0:
    print("No sensors available")
    sys.exit()

# For demonstration, we use the first sensor in the list
sensor_uid_to_get = sensor_ids[0]

# Get the sensor from the client
sensor = client.get_sensor(sensor_uid_to_get)
pp(sensor.dict())

# Queries for waveforms, station metadata and measurements (data products calculated
# on the sensor) require a time window. Here we use the last 5 hours of data.
end_time = datetime.utcnow()
start_time = end_time - timedelta(hours=5)

# Query various measurements. In this case we calculate a rolling `mean` over
# 10-minute time windows.
# Other `aggregators` are:
#  * None (default)
#  * max
#  * min
query = MeasurementQuery(
    start_time=start_time,
    end_time=end_time,
    interval=timedelta(minutes=10),
    aggregator="mean",
)
result = sensor.get_jma_intensity(query)
print(result)
result = sensor.get_peak_ground_acceleration(query)
print(result)
result = sensor.get_spectral_intensity(query)
print(result)
result = sensor.get_rms_offset(query)
print(result)
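
As a quick sanity check on the settings above: a 5-hour query window aggregated over 10-minute intervals yields at most 30 values per measurement series. The arithmetic can be done directly with `timedelta` division:

```python
from datetime import datetime, timedelta

end_time = datetime.utcnow()
start_time = end_time - timedelta(hours=5)

# Number of 10-minute aggregation windows that fit in the query span.
n_windows = (end_time - start_time) // timedelta(minutes=10)
print(n_windows)  # → 30
```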

# Query various data products. At most 100 results are returned per request, which is
# why the query has `limit` and `skip` values. You can fetch data products from a
# specific time frame by specifying start and end times.
end_time = datetime.utcnow()
start_time = end_time - timedelta(hours=5)
query = DataProductQuery(
    start_time=start_time,
    end_time=end_time,
    limit=100,
    skip=0,
)
result = sensor.get_event_records(query)
print(result)
result = sensor.get_hv_spectra(query)
print(result)
result = sensor.get_noise_autocorrelations(query)
print(result)
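
Because each data-product query returns at most 100 results, fetching everything in a time frame means paging with `skip`. A minimal, library-agnostic sketch of that loop; the `fetch_page` callable stands in for a call such as `sensor.get_event_records`, and this assumes the returned result behaves like a list:

```python
from typing import Callable, List


def fetch_all(fetch_page: Callable[[int, int], List], page_size: int = 100) -> List:
    """Page through results with limit/skip until a short page signals the end."""
    results: List = []
    skip = 0
    while True:
        page = fetch_page(page_size, skip)
        results.extend(page)
        if len(page) < page_size:
            return results
        skip += page_size


# Demonstrated with a stand-in data source of 250 fake records:
records = list(range(250))
everything = fetch_all(lambda limit, skip: records[skip:skip + limit])
print(len(everything))  # → 250
```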

# Download station metadata as StationXML and store it in a local directory.
file_path = sensor.get_stationxml(
    starttime=start_time,
    endtime=end_time,
    level="response",
    location_to_store=DATA_PATH,
)
with open(file_path, "r") as file:
    print(file.read())

# Download raw full waveforms from the sensor. Note that you can only query what is in
# the sensor's ring buffer (usually the last ~48 hours).
file_path = sensor.get_waveform_data(
    start_time=start_time, end_time=end_time, location_to_store=DATA_PATH
)

# Read the file into obspy for further processing...
stream: Stream = obspy.read(file_path)
for trace in stream.traces:
    print(trace.stats)

QSLocalClient Examples

Interact with sensors on your local network using the QSLocalClient.

Streaming Data

import asyncio
from quakesaver_client import QSLocalClient


async def run():
    client = QSLocalClient()

    sensor = client.get_sensor("qssensor.local")
    stream = sensor.get_waveform_stream()
    async for chunk in stream.start():
        print(chunk)


asyncio.run(run())
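
A stream like the one above runs indefinitely. To consume only a fixed number of chunks, you can wrap any async iterator in a small helper; this helper is not part of the library, and it is demonstrated here with a stand-in async generator instead of a real sensor stream:

```python
import asyncio


async def take(aiter, n):
    """Collect at most n items from an async iterator, then stop."""
    items = []
    async for item in aiter:
        items.append(item)
        if len(items) >= n:
            break
    return items


# Stand-in for sensor.get_waveform_stream().start():
async def fake_stream():
    for i in range(10):
        yield i


chunks = asyncio.run(take(fake_stream(), 3))
print(chunks)  # → [0, 1, 2]
```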

Downloading Data

Download the latest 10 minutes of data from a local sensor and write it to a file:

import datetime
from quakesaver_client import QSLocalClient

client = QSLocalClient()
sensor = client.get_sensor("qssensor.local")

tmax = datetime.datetime.utcnow()
tmin = tmax - datetime.timedelta(minutes=10)
file_path = sensor.get_waveform_data(tmin, tmax)
print(file_path)

Download files

  • Source Distribution: quakesaver_client-1.2.0.tar.gz (24.1 kB)
  • Built Distribution: quakesaver_client-1.2.0-py3-none-any.whl (28.1 kB)
