This package speeds up sparse matrix multiplication followed by top-n selection of the multiplication results.

Project description

sparse_dot_topn

sparse_dot_topn provides a fast way to perform a sparse matrix multiplication followed by top-n selection of the results.

Comparing very large feature vectors and picking the best matches in practice often comes down to a sparse matrix multiplication followed by selecting the top-n results.

sparse_dot_topn provides a (parallelised) sparse matrix multiplication implementation that integrates selecting the top-n values, resulting in a significantly lower memory footprint and improved performance. On an Apple M2 Pro, multiplying two 20k x 193k TF-IDF matrices, sparse_dot_topn can be up to 6 times faster when retaining the top 10 values per row and utilising 8 cores. See the benchmark directory for details.
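
Conceptually, the win comes from never materialising the full product: a naive two-step approach computes the full A @ B and then prunes each row, while sp_matmul_topn keeps at most top_n entries per row during the multiplication. A small illustrative comparison (shapes and density here are made up for the sketch, not taken from the benchmark):

import scipy.sparse as sparse
from sparse_dot_topn import sp_matmul_topn

A = sparse.random(1_000, 10_000, density=0.01, format="csr")
B = sparse.random(10_000, 1_000, density=0.01, format="csr")

# Naive two-step approach: materialise the full product, then prune per row
C_full = A @ B

# Integrated approach: at most top_n entries per row are ever stored
C_top = sp_matmul_topn(A, B, top_n=10)

print(C_full.nnz, C_top.nnz)  # the full product holds far more non-zeros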

Usage

sp_matmul_topn supports {CSR, CSC, COO} matrices with {32, 64}bit {int, float} data. Note that COO and CSC inputs are converted to the CSR format and are therefore slower. Two options to further reduce memory requirements are threshold and density. Optionally, the values can be sorted such that the first column for a given row contains the largest value. Note that sp_matmul_topn(A, B, top_n=B.shape[1]) is equal to sp_matmul(A, B) and A.dot(B).

If you are migrating from v0.* please see the migration guide below for details.

import scipy.sparse as sparse
from sparse_dot_topn import sp_matmul, sp_matmul_topn

A = sparse.random(1000, 100, density=0.1, format="csr")
B = sparse.random(100, 2000, density=0.1, format="csr")

# Compute C and retain the top 10 values per row
C = sp_matmul_topn(A, B, top_n=10)

# or parallelised matrix multiplication without top-n selection
C = sp_matmul(A, B, n_threads=2)
# or with top-n selection
C = sp_matmul_topn(A, B, top_n=10, n_threads=2)

# If you are only interested in values above a certain threshold
C = sp_matmul_topn(A, B, top_n=10, threshold=0.8)

# If you set the threshold we cannot easily determine the number of non-zero
# entries beforehand. Therefore, we allocate memory for `ceil(top_n * A.shape[0] * density)`
# non-zero entries. You can set the expected density to reduce the number of pre-allocated
# entries. Note that if we allocate too little, expensive copies will need to happen.
C = sp_matmul_topn(A, B, top_n=10, threshold=0.8, density=0.1)
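
As a quick check of the equivalence noted above, top_n equal to the number of columns of B reproduces the plain product. A sketch continuing the example (comparing with a tolerance, since summation order can differ in the last float bits):

import numpy as np

C_full = sp_matmul_topn(A, B, top_n=B.shape[1])
assert np.allclose(C_full.toarray(), (A @ B).toarray())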

Installation

sparse_dot_topn provides wheels for CPython 3.8 to 3.12 for:

  • Windows (64bit)
  • Linux (64bit)
  • MacOS (x86 and ARM)

pip install sparse_dot_topn

sparse_dot_topn relies on a C++ extension for the computationally intensive multiplication routine. Note that the wheels vendor OpenMP with the extension to provide parallelisation out-of-the-box. This may cause issues when used in combination with other libraries that ship OpenMP, such as PyTorch. If you run into any issues with OpenMP, see INSTALLATION.md for help or run the function without specifying the n_threads argument.

Installing from source requires a C++17 compatible compiler. If you have a compiler available, it is advised to install from source rather than from the wheel, as this enables architecture-specific optimisations.

You can install from source using:

pip install sparse_dot_topn --no-binary sparse_dot_topn

Build configuration

sparse_dot_topn provides some configuration options when building from source. Building from source can enable architecture specific optimisations and is recommended for those that have a C++ compiler installed. See INSTALLATION.md for details.

Distributing the top-n multiplication of two large O(10M+) sparse matrices over a cluster

The top-n multiplication of two large O(10M+) sparse matrices can be broken down into smaller chunks. For example, one may want to split sparse matrices into matrices with just 1M rows, and do the (top-n) multiplication of all those matrix pairs. Reasons to do this are to reduce the memory footprint of each pair, and to employ available distributed computing power.

The pairs can be distributed and calculated over a cluster (e.g. we use a Spark cluster). The resulting matrix-products are then zipped and stacked in order to reproduce the full matrix product.

Here's an example of how to do this, where we are matching 1000 rows in sparse matrix A against 600 rows in sparse matrix B, and both A and B are split into chunks.

import numpy as np
import scipy.sparse as sparse
from sparse_dot_topn import sp_matmul_topn, zip_sp_matmul_topn

# 1a. Example matching 1000 rows in sparse matrix A against 600 rows in sparse matrix B.
rng = np.random.default_rng()
A = sparse.random(1000, 2000, density=0.1, format="csr", dtype=np.float32, random_state=rng)
B = sparse.random(600, 2000, density=0.1, format="csr", dtype=np.float32, random_state=rng)

# 1b. Reference full matrix product with top-n
C_ref = sp_matmul_topn(A, B.T, top_n=10, threshold=0.01, sort=True)

# 2a. Split the sparse matrices. Here A is split into five parts, and B into three parts.
As = [A[i*200:(i+1)*200] for i in range(5)]
Bs = [B[:100], B[100:300], B[300:]]

# 2b. Perform the top-n multiplication of all sub-matrix pairs, here in a double loop.
# E.g. all sub-matrix pairs could be distributed over a cluster and multiplied there.
Cs = [[sp_matmul_topn(Aj, Bi.T, top_n=10, threshold=0.01, sort=True) for Bi in Bs] for Aj in As]

# 2c. top-n zipping of the C-matrices, done over the index of the B sub-matrices.
Czip = [zip_sp_matmul_topn(top_n=10, C_mats=Cis) for Cis in Cs]

# 2d. stacking over zipped C-matrices, done over the index of the A sub-matrices
# The resulting matrix C equals C_ref.
C = sparse.vstack(Czip, dtype=np.float32)
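
As a sanity check, the zipped-and-stacked result can be compared against the reference product computed in step 1b (a dense comparison is fine at these sketch sizes):

# 3. Verify that the chunked computation reproduces the full product
assert np.allclose(C.toarray(), C_ref.toarray())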

Migrating to v1

sparse_dot_topn v1 is a significant change from v0.*, with new bindings and a new API. The new version adds support for CPython 3.12 and supports both ints and floats. Internally we switched to a max-heap to collect the top-n values, which significantly reduces the memory footprint. The former implementation had O(n_columns) complexity for the top-n selection, where we now have O(top_n) complexity. awesome_cossim_topn has been deprecated and will be removed in a future version.
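
For intuition, the O(top_n) selection can be pictured as a bounded heap that is updated while scanning a row's products. The following toy Python sketch illustrates the idea only; it is not the package's C++ implementation:

import heapq

def topn_select(values, top_n):
    # Keep a bounded min-heap of the top_n largest (value, index) pairs,
    # so memory stays O(top_n) instead of O(n_columns).
    heap = []
    for idx, v in enumerate(values):
        if len(heap) < top_n:
            heapq.heappush(heap, (v, idx))
        elif v > heap[0][0]:
            heapq.heapreplace(heap, (v, idx))
    return sorted(heap, reverse=True)

print(topn_select([0.2, 0.9, 0.1, 0.7, 0.4], top_n=3))  # [(0.9, 1), (0.7, 3), (0.4, 4)]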

Users should switch to sp_matmul_topn which is largely compatible:

For example:

C = awesome_cossim_topn(A, B, ntop=10)

can be replicated using:

C = sp_matmul_topn(A, B, top_n=10, threshold=0.0, sort=True)
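
Note that awesome_cossim_topn was typically fed L2-normalised rows so that the dot products were cosine similarities; that pattern carries over unchanged. A sketch with a hand-rolled normalisation helper (our own, not part of the package):

import numpy as np
import scipy.sparse as sparse
from sparse_dot_topn import sp_matmul_topn

def l2_normalise(m):
    # Scale each row to unit L2 norm so dot products become cosine similarities
    norms = np.sqrt(np.asarray(m.multiply(m).sum(axis=1)).ravel())
    norms[norms == 0] = 1.0
    return sparse.diags(1.0 / norms) @ m

A = sparse.random(1000, 2000, density=0.1, format="csr")
B = sparse.random(600, 2000, density=0.1, format="csr")
C = sp_matmul_topn(l2_normalise(A), l2_normalise(B).T, top_n=10, threshold=0.0, sort=True)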

API changes

  1. ntop has been renamed to top_n
  2. lower_bound has been renamed to threshold
  3. use_threads and n_jobs have been combined into n_threads
  4. return_best_ntop option has been removed
  5. test_nnz_max option has been removed
  6. B is auto-transposed when its shape is not compatible but its transpose is (see the sketch after this list).
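
A minimal sketch of the auto-transpose behaviour from point 6 (shapes are illustrative):

import scipy.sparse as sparse
from sparse_dot_topn import sp_matmul_topn

A = sparse.random(100, 50, density=0.1, format="csr")
B = sparse.random(80, 50, density=0.1, format="csr")

# A @ B is not defined, but A @ B.T is, so B is transposed automatically
C = sp_matmul_topn(A, B, top_n=10)  # same as sp_matmul_topn(A, B.T, top_n=10)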

The output of return_best_ntop can be replicated with:

import numpy as np

C = sp_matmul_topn(A, B, top_n=10)
best_ntop = np.diff(C.indptr).max()

Default changes

  1. threshold no longer 0.0 but disabled by default

This enables proper functioning for matrices that contain negative values. Additionally, a different data-structure is used internally when collecting non-zero results, one with a much lower memory-footprint than previously. This means that the effect of the threshold parameter on performance and memory requirements is negligible. If the threshold is None, we pre-compute the number of non-zero entries; this can significantly reduce the required memory at a mild (~10%) performance penalty.

  2. sort = False, the result matrix is no longer sorted by default

The matrix is returned with the same column order as if no filtering of the top-n results had taken place. This means that when you set top_n equal to the number of columns of B you obtain the same result as normal multiplication, i.e. sp_matmul_topn(A, B, top_n=B.shape[1]) is equal to A.dot(B).
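
A sketch of the difference (values per row are non-increasing only when sort=True):

import numpy as np
import scipy.sparse as sparse
from sparse_dot_topn import sp_matmul_topn

A = sparse.random(100, 50, density=0.2, format="csr")
B = sparse.random(50, 80, density=0.2, format="csr")

C_unsorted = sp_matmul_topn(A, B, top_n=5)           # column order as in A @ B
C_sorted = sp_matmul_topn(A, B, top_n=5, sort=True)  # largest value first per row

row = C_sorted.data[C_sorted.indptr[0]:C_sorted.indptr[1]]
assert np.all(row[:-1] >= row[1:])  # first row's values are non-increasing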

Contributing

Contributions are very welcome, please see CONTRIBUTING for details.

Contributors

This package was developed and is maintained by authors (previously) affiliated with ING Analytics Wholesale Banking Advanced Analytics. The original implementation was based on a modified version of SciPy's CSR multiplication implementation. You can read about it in a blog (mirror) written by Zhe Sun.
