
The seismological machine learning benchmark collection

Project description



The Seismology Benchmark collection (SeisBench) is an open-source Python toolbox for machine learning in seismology. It provides a unified API for accessing seismic datasets and for both training and applying machine learning algorithms to seismic data. SeisBench has been built to reduce the overhead when applying or developing machine learning techniques for seismological tasks.

Getting started

SeisBench offers three core modules: data, models, and generate. data provides access to benchmark datasets and offers functionality for loading them. models offers a collection of machine learning models for seismology; you can easily create models, load pretrained models, or train models on any dataset. generate contains tools for building data generation pipelines, bridging the gap between data and models.

The easiest way of getting started is through our Colab notebooks.

Examples

  • Dataset basics
  • Model API
  • Generator pipelines
  • Applied picking
  • Using DeepDenoiser
  • Depth phases and earthquake depth
  • Training PhaseNet (advanced)
  • Creating a dataset (advanced)
  • Building an event catalog with GaMMA (advanced)
  • Building an event catalog with PyOcto (advanced)

Alternatively, you can clone the repository and run the same examples locally.

For more detailed information on SeisBench, check out the SeisBench documentation.

Installation

SeisBench can be installed in two ways. In both cases, you might consider installing SeisBench in a virtual environment, for example using conda.

The recommended way is installation through pip. Simply run:

pip install seisbench

Alternatively, you can install the latest version from source. For this approach, clone the repository, switch to the repository root and run:

pip install .

which will install SeisBench in your current Python environment.

CPU only installation

SeisBench is built on PyTorch, which in turn uses CUDA for GPU acceleration. Sometimes it might be preferable to install PyTorch without CUDA, for example because CUDA will not be used and the CUDA binaries are rather large. The easiest way to install such a CPU-only version is a two-step installation: first, install a CPU-only build of PyTorch as explained in the PyTorch installation instructions; second, install SeisBench the regular way through pip. Example instructions:

pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
pip install seisbench

Contributing

There are many ways to contribute to SeisBench, and we always look forward to your contributions. Check out the contribution guidelines for details on how to contribute.

Known issues

  • Some institutions and internet providers block access to our data and model repository because it runs on a non-standard port (2880). This usually manifests as timeouts when trying to download data or model weights. To verify the issue, try accessing https://hifis-storage.desy.de:2880/ directly from the same machine. As a mitigation, you can use our backup repository: just run seisbench.use_backup_repository(). Please note that the backup repository will usually show lower download speeds. As a higher-performance solution, we recommend asking your network administrator to allow outgoing access to TCP port 2880 on our server.
  • We've recently changed the URL of the SeisBench repository. To use the new URL, update to SeisBench 0.4.1 or later. If this is not possible, you can use the following commands within your runtime to update the URL manually:
    import seisbench
    from urllib.parse import urljoin
    
    seisbench.remote_root = "https://hifis-storage.desy.de:2880/Helmholtz/HelmholtzAI/SeisBench/"
    seisbench.remote_data_root = urljoin(seisbench.remote_root, "datasets/")
    seisbench.remote_model_root = urljoin(seisbench.remote_root, "models/v3/")
    
  • On Apple M1 and M2 chips, PyTorch does not always work when installed implicitly through pip install seisbench. As a workaround, follow the instructions at https://pytorch.org/ to install PyTorch first, and then install SeisBench as usual through pip.
  • EQTransformer model weights "original" in versions 1 and 2 are incompatible with SeisBench >=0.2.3. Simply use from_pretrained("original", version="3") or from_pretrained("original", update=True). The weights do not differ in their predictions.

References

Reference publications for SeisBench:




Acknowledgement

The initial version of SeisBench has been developed at GFZ Potsdam and KIT with funding from Helmholtz AI. The SeisBench repository is hosted by HIFIS - Helmholtz Federated IT Services.



Download files


Source Distribution

seisbench-0.8.2.tar.gz (19.8 MB)


Built Distribution

seisbench-0.8.2-py3-none-any.whl (172.2 kB)


File details

Details for the file seisbench-0.8.2.tar.gz.

File metadata

  • Download URL: seisbench-0.8.2.tar.gz
  • Upload date:
  • Size: 19.8 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for seisbench-0.8.2.tar.gz
  • SHA256: 436db0d5eb21033f45b54137396b9741458872a2d54b2c8815af7df086f55163
  • MD5: d7f1a01a924a25fd343792cba978452f
  • BLAKE2b-256: dbecd71a24fff469942ff8ec60327b7f7d30743d21a7cbee993519a73d940aa1


File details

Details for the file seisbench-0.8.2-py3-none-any.whl.

File metadata

  • Download URL: seisbench-0.8.2-py3-none-any.whl
  • Upload date:
  • Size: 172.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for seisbench-0.8.2-py3-none-any.whl
  • SHA256: f58d207b50a3e6626cd3667f59dffec633978af07a92e51245783288cbda07ce
  • MD5: 2188cbe95f9b38be3cdf98b43b33ba3c
  • BLAKE2b-256: f391fdc848aaf9b457609ddad120fd31d067beab24b4419cf00335eabc078ddb

