
Hybrid SPARQL query engine for timeseries data

Project description

chrontext: High-performance hybrid query engine for knowledge graphs and analytical data (e.g. time-series)

Chrontext allows you to use your knowledge graph to access large amounts of time-series or other analytical data. It uses a commodity SPARQL triplestore together with your existing data storage infrastructure. It currently supports time-series stored in PostgreSQL-compatible databases such as DuckDB, in Google Cloud BigQuery (SQL), and in OPC UA HA servers, but can easily be extended to other APIs and databases.

Chrontext forms a semantic layer that allows self-service data access, abstracting away technical infrastructure. Users can create query-based inputs for data products that are kept up to date as the knowledge graph is maintained, and that can be deployed across heterogeneous on-premise and cloud infrastructures with the same API.

Chrontext is a high-performance Python library built in Rust using Polars, and relies heavily on packages from the Oxigraph project. Chrontext works with Apache Arrow, prefers time-series transport using Apache Arrow Flight and delivers results as Polars DataFrames.

Please reach out to Data Treehouse if you would like help trying Chrontext, or require support for a different database backend.

Installing

Chrontext is available on PyPI; just use:

pip install chrontext

The API is documented HERE.

Example query in Python

The code below assumes that we have a SPARQL endpoint and BigQuery set up with time-series data; the engine setup is elided.

...
q = """
PREFIX xsd:<http://www.w3.org/2001/XMLSchema#>
PREFIX ct:<https://github.com/DataTreehouse/chrontext#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> 
PREFIX rds: <https://github.com/DataTreehouse/solar_demo/rds_power#> 
SELECT ?path ?t ?ts_pow_value ?ts_irr_value
WHERE {
    ?site a rds:Site;
    rdfs:label "Jonathanland";
    rds:functionalAspect ?block.
    # At the Block level there is an irradiation measurement:
    ?block a rds:A;
    ct:hasTimeseries ?ts_irr.
    ?ts_irr rdfs:label "RefCell1_Wm2".
    
    # At the Inverter level, there is a Power measurement
    ?block rds:functionalAspect+ ?inv.
    ?inv a rds:TBB;
    rds:path ?path;
    ct:hasTimeseries ?ts_pow.
    ?ts_pow rdfs:label "InvPDC_kW".
    
    ?ts_pow ct:hasDataPoint ?ts_pow_datapoint.
    ?ts_pow_datapoint ct:hasValue ?ts_pow_value;
        ct:hasTimestamp ?t.
    ?ts_irr ct:hasDataPoint ?ts_irr_datapoint.
    ?ts_irr_datapoint ct:hasValue ?ts_irr_value;
        ct:hasTimestamp ?t.
    FILTER(
        ?t >= "2018-08-24T12:00:00+00:00"^^xsd:dateTime && 
        ?t <= "2018-08-24T13:00:00+00:00"^^xsd:dateTime)
} ORDER BY ?path ?t 
"""
df = engine.query(q)

This produces the following DataFrame:

path           t                        ts_pow_value  ts_irr_value
str            datetime[ns, UTC]        f64           f64
=.A1.RG1.TBB1  2018-08-24 12:00:00 UTC  39.74         184.0
=.A1.RG1.TBB1  2018-08-24 12:00:01 UTC  39.57         184.0
=.A1.RG1.TBB1  2018-08-24 12:00:02 UTC  40.1          184.0
=.A1.RG1.TBB1  2018-08-24 12:00:03 UTC  40.05         184.0
=.A1.RG1.TBB1  2018-08-24 12:00:04 UTC  40.02         184.0
…              …                        …             …
=.A5.RG9.TBB1  2018-08-24 12:59:56 UTC  105.5         427.5
=.A5.RG9.TBB1  2018-08-24 12:59:57 UTC  104.9         427.6
=.A5.RG9.TBB1  2018-08-24 12:59:58 UTC  105.6         428.0
=.A5.RG9.TBB1  2018-08-24 12:59:59 UTC  105.9         428.0
=.A5.RG9.TBB1  2018-08-24 13:00:00 UTC  105.7         428.5

API

The API is documented HERE.

Tutorial using DuckDB

In the following tutorial, we assume that you have a couple of CSV files on disk that you want to query. We assume that you have DuckDB and chrontext installed; if not, run pip install chrontext duckdb. Installing chrontext will also install sqlalchemy, which we rely on to define the virtualized DuckDB tables.

CSV files

Our CSV files look like this.

ts1.csv :

timestamp,value
2022-06-01T08:46:52,1
2022-06-01T08:46:53,10
..
2022-06-01T08:46:59,105

ts2.csv:

timestamp,value
2022-06-01T08:46:52,2
2022-06-01T08:46:53,20
...
2022-06-01T08:46:59,206
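If you want to follow along without the original data, a small script can create stand-in files. Only the rows shown above are written; the elided middle rows are left out.

```python
import csv

# Write the two example files. Only the rows shown in the tutorial are
# included; the elided middle rows are omitted.
files = {
    "ts1.csv": [("2022-06-01T08:46:52", 1), ("2022-06-01T08:46:53", 10), ("2022-06-01T08:46:59", 105)],
    "ts2.csv": [("2022-06-01T08:46:52", 2), ("2022-06-01T08:46:53", 20), ("2022-06-01T08:46:59", 206)],
}
for name, rows in files.items():
    with open(name, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "value"])  # header expected by the tutorial
        writer.writerows(rows)
```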

DuckDB setup:

We need to create a class with a method query that takes a SQL string as its argument and returns a Polars DataFrame. In this class, we just hard-code the DuckDB setup in the constructor.

import duckdb
import polars as pl

class MyDuckDB():
    def __init__(self):
        con = duckdb.connect()
        con.execute("SET TIME ZONE 'UTC';")
        con.execute("""CREATE TABLE ts1 ("timestamp" TIMESTAMPTZ, "value" INTEGER)""")
        ts_1 = pl.read_csv("ts1.csv", try_parse_dates=True).with_columns(pl.col("timestamp").dt.replace_time_zone("UTC"))
        con.append("ts1", df=ts_1.to_pandas())
        con.execute("""CREATE TABLE ts2 ("timestamp" TIMESTAMPTZ, "value" INTEGER)""")
        ts_2 = pl.read_csv("ts2.csv", try_parse_dates=True).with_columns(pl.col("timestamp").dt.replace_time_zone("UTC"))
        con.append("ts2", df=ts_2.to_pandas())
        self.con = con


    def query(self, sql:str) -> pl.DataFrame:
        # We execute the query and return it as a Polars DataFrame.
        # Chrontext expects this method to exist in the provided class.
        df = self.con.execute(sql).pl()
        return df

my_db = MyDuckDB()

Defining a virtualized SQL

We first define a sqlalchemy select query involving the two tables. It is crucial that we have a column labelled "id" here. Chrontext will modify this query when executing hybrid queries.

from sqlalchemy import MetaData, Table, Column, bindparam
metadata = MetaData()
ts1_table = Table(
    "ts1",
    metadata,
    Column("timestamp"),
    Column("value")
)
ts2_table = Table(
    "ts2",
    metadata,
    Column("timestamp"),
    Column("value")
)
ts1 = ts1_table.select().add_columns(
    bindparam("id1", "ts1").label("id"),
)
ts2 = ts2_table.select().add_columns(
    bindparam("id2", "ts2").label("id"),
)
sql = ts1.union(ts2)
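To see what chrontext will start from, you can render the union to a SQL string with SQLAlchemy's compiler. This is purely a sanity check and is not required by chrontext; the postgres dialect below matches the sql_dialect chosen for the virtualized database further down.

```python
from sqlalchemy import MetaData, Table, Column, bindparam
from sqlalchemy.dialects import postgresql

metadata = MetaData()
ts1_table = Table("ts1", metadata, Column("timestamp"), Column("value"))
ts2_table = Table("ts2", metadata, Column("timestamp"), Column("value"))
ts1 = ts1_table.select().add_columns(bindparam("id1", "ts1").label("id"))
ts2 = ts2_table.select().add_columns(bindparam("id2", "ts2").label("id"))
sql = ts1.union(ts2)

# Render the statement with bound parameters inlined as literals.
compiled = str(sql.compile(dialect=postgresql.dialect(),
                           compile_kwargs={"literal_binds": True}))
print(compiled)
```

The output should show each SELECT carrying its constant id column, joined by a UNION.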

Now, we are ready to define the virtualized backend. We will annotate nodes of the graph with a resource data property. These data properties will be linked to virtualized RDF triples in the DuckDB backend. The resource_sql_map decides which SQL is used for each resource property.

from chrontext import VirtualizedPythonDatabase

vdb = VirtualizedPythonDatabase(
    database=my_db,
    resource_sql_map={"my_resource": sql},
    sql_dialect="postgres"
)

The triple below will link the ex:myWidget1 to triples defined by the above sql.

ex:myWidget1 ct:hasResource "my_resource" . 

However, it will only be linked to those triples corresponding to rows where the identifier column equals the identifier associated with ex:myWidget1. Below, we define that ex:myWidget1 is only linked to those rows where the id column is ts1.

ex:myWidget1 ct:hasIdentifier "ts1" . 

In any such resource sql, the id column is mandatory.

Relating the Database to RDF Triples

Next, we want to relate the rows in this sql, each containing id, timestamp, value to RDF triples, using a template. It is crucial to have the column id.

from chrontext import Prefix, Variable, Template, Parameter, RDFType, Triple, XSD
ct = Prefix("ct", "https://github.com/DataTreehouse/chrontext#")
xsd = XSD()
id = Variable("id")
timestamp = Variable("timestamp")
value = Variable("value")
dp = Variable("dp")
resources = {
    "my_resource": Template(
        iri=ct.suf("my_resource"),
        parameters=[
            Parameter(id, rdf_type=RDFType.Literal(xsd.string)),
            Parameter(timestamp, rdf_type=RDFType.Literal(xsd.dateTime)),
            Parameter(value, rdf_type=RDFType.Literal(xsd.double)),
        ],
        instances=[
            Triple(id, ct.suf("hasDataPoint"), dp),
            Triple(dp, ct.suf("hasValue"), value),
            Triple(dp, ct.suf("hasTimestamp"), timestamp)
        ]
)}

This means that our instance ex:myWidget1 will be associated with a value and a timestamp (and a blank data point) for each row in ts1.csv. For instance, the first row gives us:

ex:myWidget1 ct:hasDataPoint _:b1 .
_:b1 ct:hasTimestamp "2022-06-01T08:46:52Z"^^xsd:dateTime .
_:b1 ct:hasValue 1 .

Chrontext is built for the cases where this would be infeasibly many triples to materialize, so instead of materializing them, we query them.
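The tutorial assumes a context graph in my_graph.ttl, whose contents are not shown in this text. A minimal sketch that would satisfy the hybrid query in the next section might look like the following; the node names case:mySensor1 and case:myTimeseries1, and the placement of ct:hasResource / ct:hasIdentifier on the time-series node, are illustrative assumptions, not the actual tutorial file.

```turtle
@prefix ct:    <https://github.com/DataTreehouse/chrontext#> .
@prefix types: <http://example.org/types#> .
@prefix case:  <http://example.org/case#> .

case:myWidget1 types:hasSensor case:mySensor1 .
case:mySensor1 a types:ThingCounter ;
    ct:hasTimeseries case:myTimeseries1 .
# The time-series node is linked to the virtualized rows with id "ts1".
case:myTimeseries1 ct:hasResource "my_resource" ;
    ct:hasIdentifier "ts1" .
```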

Creating the engine and querying:

The context for our analytical data (e.g. a model of an industrial asset) has to be stored in a SPARQL endpoint. In this case, we use an embedded Oxigraph engine that comes with chrontext. Now we assemble the pieces and create the engine.

from chrontext import Engine, SparqlEmbeddedOxigraph
oxigraph_store = SparqlEmbeddedOxigraph(rdf_file="my_graph.ttl", path="oxigraph_db_tutorial")
engine = Engine(
    resources,
    virtualized_python_database=vdb,
    sparql_embedded_oxigraph=oxigraph_store)
engine.init()

Now we can use our context to query the dataset. The aggregation below is pushed down into DuckDB. The example is deliberately simple, but arbitrarily complex graph patterns can be used to identify ?w and ?s.

q = """
    PREFIX xsd:<http://www.w3.org/2001/XMLSchema#>
    PREFIX chrontext:<https://github.com/DataTreehouse/chrontext#>
    PREFIX types:<http://example.org/types#>
    SELECT ?w (SUM(?v) as ?sum_v) WHERE {
        ?w types:hasSensor ?s .
        ?s a types:ThingCounter .
        ?s chrontext:hasTimeseries ?ts .
        ?ts chrontext:hasDataPoint ?dp .
        ?dp chrontext:hasTimestamp ?t .
        ?dp chrontext:hasValue ?v .
        FILTER(?t > "2022-06-01T08:46:53Z"^^xsd:dateTime) .
    } GROUP BY ?w
    """
df = engine.query(q)
print(df)

This produces the following result:

w                                  sum_v
str                                decimal[38,0]
http://example.org/case#myWidget1  1215
http://example.org/case#myWidget2  1216

Roadmap in brief

Let us know if you have suggestions!

Stabilization

Chrontext is being put into use in the energy industry, and will be stabilized as part of this process. We are very interested in your bug reports!

Support for Azure Data Explorer / KustoQL

We are likely adding support for ADX/KustoQL. Let us know if this is something that would be useful for you.

Support for Databricks SQL

We are likely adding support for Databricks SQL as the virtualization backend.

Generalization to analytical data (not just time series!)

While chrontext is currently focused on time series data, we are incrementally adding support for contextualization of arbitrary analytical data.

Support for multiple databases

Currently, we only support one database backend at a given time. We plan to support hybrid queries across multiple virtualized databases.

References

Chrontext is joint work by Magnus Bakken and Professor Ahmet Soylu at OsloMet. To read more about Chrontext, see the article Chrontext: Portable SPARQL Queries Over Contextualised Time Series Data in Industrial Settings.

License

All code produced since August 1st, 2023 is copyright Data Treehouse AS, licensed under Apache 2.0 unless otherwise noted.

All code produced before August 1st, 2023 is copyright Prediktor AS, licensed under Apache 2.0 unless otherwise noted, and was financed by The Research Council of Norway (grant no. 316656) and Prediktor AS as part of a PhD degree. The code at that state is archived in the repository at https://github.com/DataTreehouse/chrontext.
