
A package for time series data processing and modeling using ARIMA and GARCH models


Timeseries Compute


Overview

████████╗██╗███╗   ███╗███████╗███████╗███████╗██████╗ ██╗███████╗███████╗
╚══██╔══╝██║████╗ ████║██╔════╝██╔════╝██╔════╝██╔══██╗██║██╔════╝██╔════╝
   ██║   ██║██╔████╔██║█████╗  ███████╗█████╗  ██████╔╝██║█████╗  ███████╗
   ██║   ██║██║╚██╔╝██║██╔══╝  ╚════██║██╔══╝  ██╔══██╗██║██╔══╝  ╚════██║
   ██║   ██║██║ ╚═╝ ██║███████╗███████║███████╗██║  ██║██║███████╗███████║
   ╚═╝   ╚═╝╚═╝     ╚═╝╚══════╝╚══════╝╚══════╝╚═╝  ╚═╝╚═╝╚══════╝╚══════╝
             ██████╗ ██████╗ ███╗   ███╗██████╗ ██╗   ██╗████████╗███████╗
            ██╔════╝██╔═══██╗████╗ ████║██╔══██╗██║   ██║╚══██╔══╝██╔════╝
            ██║     ██║   ██║██╔████╔██║██████╔╝██║   ██║   ██║   █████╗
            ██║     ██║   ██║██║╚██╔╝██║██╔═══╝ ██║   ██║   ██║   ██╔══╝
            ╚██████╗╚██████╔╝██║ ╚═╝ ██║██║     ╚██████╔╝   ██║   ███████╗
             ╚═════╝ ╚═════╝ ╚═╝     ╚═╝╚═╝      ╚═════╝    ╚═╝   ╚══════╝

Implementation hosted at www.spilloverlab.com.

A Python package for time series data processing and modeling using ARIMA and GARCH models, with both univariate and multivariate capabilities.

Features

  • Price series generation for single and multiple assets
  • Data preprocessing with configurable missing data handling and scaling options
  • Stationarity testing and transformation for time series analysis
  • ARIMA modeling for time series forecasting
  • GARCH modeling for volatility forecasting and risk assessment
  • Bivariate GARCH modeling with both Constant Conditional Correlation (CCC) and Dynamic Conditional Correlation (DCC) methods
  • EWMA covariance calculation for dynamic correlation analysis
  • Portfolio risk assessment using volatility and correlation matrices
  • Market spillover effects analysis with Granger causality testing and shock transmission modeling
  • Visualization tools for interpreting complex market interactions and spillover relationships
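To give a flavor of the risk-assessment features, portfolio volatility can be built from per-asset volatilities and a correlation matrix. This is a generic numpy sketch of the math (Sigma = D R D, then sqrt(w' Sigma w)), with illustrative numbers, not the package's API:

```python
import numpy as np

# Illustrative annualized volatilities of two assets and their correlation
vols = np.array([0.20, 0.30])
corr = np.array([[1.0, 0.4],
                 [0.4, 1.0]])

# Covariance matrix: Sigma = D @ R @ D, where D = diag(vols)
D = np.diag(vols)
cov = D @ corr @ D

# Portfolio volatility for a 60/40 weighting: sqrt(w' Sigma w)
w = np.array([0.6, 0.4])
port_var = w @ cov @ w
port_vol = np.sqrt(port_var)
print(round(port_vol, 4))  # → 0.2008
```

Note the diversification effect: the portfolio volatility (about 20.1%) is below the weighted average of the individual volatilities (24%) because the correlation is below 1.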

Integration Overview

flowchart TB
    %% Styling
    classDef person fill:#7B4B94,color:#fff,stroke:#5D2B6D,stroke-width:1px
    classDef agent fill:#7B4B94,color:#fff,stroke:#5D2B6D,stroke-width:1px
    classDef system fill:#1168BD,color:#fff,stroke:#0B4884,stroke-width:1px
    classDef external fill:#999999,color:#fff,stroke:#6B6B6B,stroke-width:1px
    classDef database fill:#2E7C8F,color:#fff,stroke:#1D4E5E,stroke-width:1px
    classDef publishing fill:#E67E22,color:#fff,stroke:#D35400,stroke-width:1px
    
    %% Actors and Systems
    User((User)):::person
    AIAgent((AI Agent)):::agent
    
    %% Main Systems
    TimeSeriesFrontend["Frontend App"]:::system
    TimeSeriesPipeline["RESTful Pipeline"]:::system
    MCPServer["MCP Server"]:::system
    TimeseriesCompute["Timeseries-Compute 
    Python Package"]:::system
    
    %% Database
    TimeSeriesDB[("Relational database")]:::database
    
    %% External Systems
    ExternalDataSource[(Yahoo Finance / Stooq)]:::external
    
    %% Publishing Platforms
    PublishingPlatforms["
    GitHub
    Docker Hub
    Google Cloud Run
    PyPI
    Read the Docs"]:::publishing
    
    %% Relationships
    User -- "Uses UI" --> TimeSeriesFrontend
    AIAgent -- "Natural language requests" --> MCPServer
    TimeSeriesFrontend -- "Makes API calls to" --> TimeSeriesPipeline
    MCPServer -- "Makes API calls to" --> TimeSeriesPipeline
    TimeSeriesPipeline -- "Inserts results into" --> TimeSeriesDB
    TimeSeriesPipeline -- "imports" --> TimeseriesCompute
    User -- "pip install" --> TimeseriesCompute
    AIAgent -- "pip install" --> TimeseriesCompute
    ExternalDataSource -- "Provides time series data" --> TimeSeriesPipeline
    
    %% Publishing relationships (simplified)
    TimeSeriesFrontend  --> PublishingPlatforms
    TimeSeriesPipeline --> PublishingPlatforms
    TimeseriesCompute --> PublishingPlatforms

Quick Start

Installation

Using uv (fastest):

# Install uv
pip install uv
# Create and activate a virtual environment
uv venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
# Install the package
uv pip install timeseries-compute

Using venv (classic):

python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install timeseries-compute

Install from GitHub using venv (latest development version):

python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install git+https://github.com/garthmortensen/timeseries-compute.git

Example Usage

For univariate time series analysis:

python -m timeseries_compute.examples.example_univariate_garch
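The univariate example fits an ARIMA mean model and a GARCH volatility model. The variance recursion at the heart of GARCH(1,1) can be sketched in plain numpy (a simulation with made-up parameters, not the package's fitted values):

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative GARCH(1,1) parameters:
# sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}
omega, alpha, beta = 0.05, 0.10, 0.85

n = 500
returns = np.zeros(n)
sigma2 = np.zeros(n)
sigma2[0] = omega / (1 - alpha - beta)  # unconditional variance (= 1.0 here)

for t in range(1, n):
    sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    returns[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# The sample variance of the simulated returns hovers near the
# unconditional variance omega / (1 - alpha - beta)
sample_var = returns.var()
```

Because alpha + beta is close to 1, the simulated volatility is persistent: calm and turbulent stretches cluster, which is exactly the behavior GARCH is designed to capture.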

For multivariate GARCH analysis (correlation between two assets):

python -m timeseries_compute.examples.example_multivariate_garch
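In the CCC approach that the multivariate example demonstrates, each series gets its own GARCH volatility, and a single constant correlation matrix is then estimated from the standardized residuals. A numpy sketch of that correlation step, with simulated inputs standing in for GARCH output:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1000

# Simulate two correlated return series (true correlation 0.5, illustrative)
L = np.linalg.cholesky(np.array([[1.0, 0.5],
                                 [0.5, 1.0]]))
z = rng.standard_normal((n, 2)) @ L.T

# Stand-in for per-series GARCH fits: divide each series by its
# (here constant) volatility estimate to get standardized residuals
vol = z.std(axis=0)
std_resid = z / vol

# CCC: the constant conditional correlation matrix is the sample
# correlation of the standardized residuals
R = np.corrcoef(std_resid, rowvar=False)
```

With 1,000 observations, the off-diagonal entry of `R` lands close to the true 0.5; in the DCC variant, this matrix is instead allowed to evolve over time.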

Docker Support

Run with Docker for isolated environments:

# Build the image
docker build -t timeseries-compute:latest ./

# Run the univariate example
docker run -it timeseries-compute:latest /app/timeseries_compute/examples/example_univariate_garch.py

# Run the multivariate example
docker run -it timeseries-compute:latest /app/timeseries_compute/examples/example_multivariate_garch.py

# Get into interactive shell
docker run -it --entrypoint /bin/bash timeseries-compute:latest

Project Structure

timeseries_compute/......................
├── __init__.py                         # Package initialization and public API
├── data_generator.py                   # Synthetic price data generation with random walks and statistical properties
├── data_processor.py                   # Data transformation, missing value handling, scaling, and stationarity testing
├── export_util.py                      # Data export utilities for tracking analysis lineage
├── spillover_processor.py              # Diebold-Yilmaz spillover analysis and Granger causality testing
├── stats_model.py                      # ARIMA, GARCH, and multivariate GARCH model implementations
├── examples/............................
│   ├── __init__.py                     # Makes examples importable as a module
│   ├── example_multivariate_garch.py   # Correlation analysis between multiple markets with CC-GARCH and DCC-GARCH
│   └── example_univariate_garch.py     # Basic ARIMA and GARCH modeling for single-series forecasting
└── tests/...............................
    ├── __init__.py                     # Makes tests discoverable by pytest
    ├── test_data_generator_advanced.py # Advanced data generation features and statistical property testing
    ├── test_data_generator.py          # Basic price generation functionality testing
    ├── test_data_processor.py          # Data transformation, scaling, and stationarity testing
    ├── test_spillover_processor.py     # Spillover analysis and Granger causality testing
    ├── test_stats_model_arima.py       # ARIMA modeling with specialized fixtures and edge cases
    └── test_stats_model_garch.py       # GARCH volatility modeling with different distributions

Architectural Diagrams

Level 2: Container Diagram

flowchart TB
    %% Styling
    classDef person fill:#08427B,color:#fff,stroke:#052E56,stroke-width:1px
    classDef container fill:#438DD5,color:#fff,stroke:#2E6295,stroke-width:1px
    classDef external fill:#999999,color:#fff,stroke:#6B6B6B,stroke-width:1px
    classDef system fill:#1168BD,color:#fff,stroke:#0B4884,stroke-width:1px
    
    %% Person
    User((User)):::person
    
    %% System boundary
    subgraph TimeseriesComputeSystem["Timeseries Compute System"]
        PythonPackage["Python Package<br>[Library]<br>Core functions for analysis"]:::container
        Dockerized["Docker Container<br>[Linux]<br>Containerized deployment"]:::container
        ExampleScripts["Example Scripts<br>[Python]<br>Demonstration use cases"]:::container
        TestSuite["Test Suite<br>[pytest]<br>Validates package functionality"]:::container
        CIpipeline["CI/CD Pipeline<br>[GitHub Actions]<br>Automates testing/deployment"]:::container
        Documentation["Documentation<br>[ReadTheDocs]<br>API and usage docs"]:::container
    end
    
    %% External Systems
    ExternalDataSource[(External Data Source)]:::external
    AnalysisTool[Analysis & Visualization Tools]:::external
    PyPI[PyPI Repository]:::external
    DockerHub[Docker Hub Repository]:::external
    
    %% Relationships
    User -- "Imports [Python]" --> PythonPackage
    User -- "Runs [CLI]" --> ExampleScripts
    User -- "Reads [Web]" --> Documentation
    ExampleScripts -- "Uses" --> PythonPackage
    TestSuite -- "Tests" --> PythonPackage
    PythonPackage -- "Packaged into" --> Dockerized
    CIpipeline -- "Builds and tests" --> Dockerized
    CIpipeline -- "Runs" --> TestSuite
    CIpipeline -- "Publishes" --> PyPI
    CIpipeline -- "Publishes" --> DockerHub
    CIpipeline -- "Updates" --> Documentation
    ExternalDataSource -- "Provides data to" --> PythonPackage
    PythonPackage -- "Exports analysis to" --> AnalysisTool
    User -- "Downloads from" --> PyPI
    User -- "Runs with" --> DockerHub

Level 3: Component Diagram

flowchart TB
    %% Styling
    classDef person fill:#08427B,color:#fff,stroke:#052E56,stroke-width:1px
    classDef component fill:#85BBF0,color:#000,stroke:#5D82A8,stroke-width:1px
    classDef container fill:#438DD5,color:#fff,stroke:#2E6295,stroke-width:1px
    classDef external fill:#999999,color:#fff,stroke:#6B6B6B,stroke-width:1px
    
    %% Person
    User((User)):::person
    
    %% Package Container
    subgraph PythonPackage["Python Package"]
        DataGenerator["Data Generator<br>[Python]<br>Creates synthetic time series"]:::component
        DataProcessor["Data Processor<br>[Python]<br>Transforms and tests data"]:::component
        StatsModels["Statistical Models<br>[Python]<br>ARIMA and GARCH models"]:::component
        SpilloverProcessor["Spillover Processor<br>[Python]<br>Market interaction analysis"]:::component
        ExportUtil["Export Utility<br>[Python]<br>Data export functions"]:::component
        ExampleScripts["Example Scripts<br>[Python]<br>Usage demonstrations"]:::component
        TestSuite["Test Suite<br>[pytest]<br>Validates functionality"]:::component
        
        %% Component relationships
        ExampleScripts --> DataGenerator
        ExampleScripts --> DataProcessor
        ExampleScripts --> StatsModels
        ExampleScripts --> SpilloverProcessor
        ExampleScripts --> ExportUtil
        StatsModels --> DataProcessor
        SpilloverProcessor --> StatsModels
        SpilloverProcessor --> DataProcessor
        TestSuite --> DataGenerator
        TestSuite --> DataProcessor
        TestSuite --> StatsModels
        TestSuite --> SpilloverProcessor
        TestSuite --> ExportUtil
    end

    %% External Components
    StatsLibraries[(Statistical Libraries<br>statsmodels, arch)]:::external
    DataLibraries[(Data Libraries<br>pandas, numpy)]:::external
    VisualizationLibraries[(Visualization<br>matplotlib)]:::external
    
    %% Relationships
    User -- "Uses" --> ExampleScripts
    User -- "Uses directly" --> DataGenerator
    User -- "Uses directly" --> DataProcessor
    User -- "Uses directly" --> StatsModels
    User -- "Uses directly" --> SpilloverProcessor
    DataGenerator -- "Uses" --> DataLibraries
    DataProcessor -- "Uses" --> DataLibraries
    StatsModels -- "Uses" --> StatsLibraries
    StatsModels -- "Uses" --> DataLibraries
    ExampleScripts -- "Uses" --> VisualizationLibraries
    SpilloverProcessor -- "Uses" --> VisualizationLibraries
    ExportUtil -- "Uses" --> DataLibraries

Level 4: Code/Class Diagram

classDiagram
    %% Main Classes (actual)
    class PriceSeriesGenerator {
        +start_date: str
        +end_date: str
        +dates: pd.DatetimeIndex
        +__init__(start_date, end_date)
        +generate_correlated_prices(anchor_prices, correlation_matrix): Dict[str, list]
    }
    
    class MissingDataHandler {
        +__init__()
        +drop_na(data): pd.DataFrame
        +forward_fill(data): pd.DataFrame
    }
    
    class DataScaler {
        +scale_data_standardize(data): pd.DataFrame
        +scale_data_minmax(data): pd.DataFrame
    }
    
    class StationaryReturnsProcessor {
        +make_stationary(data, method): pd.DataFrame
        +test_stationarity(data, test): Dict
        +log_adf_results(data, p_value_threshold): None
    }
    
    class ModelARIMA {
        +data: pd.DataFrame
        +order: Tuple[int, int, int]
        +steps: int
        +models: Dict[str, ARIMA]
        +fits: Dict[str, ARIMA]
        +__init__(data, order, steps)
        +fit(): Dict[str, ARIMA]
        +summary(): Dict[str, str]
        +forecast(): Dict[str, Union[float, list]]
    }
    
    class ModelGARCH {
        +data: pd.DataFrame
        +p: int
        +q: int
        +dist: str
        +models: Dict[str, arch_model]
        +fits: Dict[str, arch_model]
        +__init__(data, p, q, dist)
        +fit(): Dict[str, arch_model]
        +summary(): Dict[str, str]
        +forecast(steps): Dict[str, float]
    }
    
    class ModelMultivariateGARCH {
        +data: pd.DataFrame
        +p: int
        +q: int
        +model_type: str
        +fits: Dict
        +cc_results: Dict
        +dcc_results: Dict
        +__init__(data, p, q, model_type)
        +fit_cc_garch(): Dict[str, Any]
        +fit_dcc_garch(lambda_val): Dict[str, Any]
    }
    
    %% Factory Classes
    class ModelFactory {
        <<static>>
        +create_model(model_type, data, order, steps, p, q, dist, mv_model_type): Union[ModelARIMA, ModelGARCH, ModelMultivariateGARCH]
    }
    
    class MissingDataHandlerFactory {
        <<static>>
        +create_handler(strategy): Callable[[pd.DataFrame], pd.DataFrame]
    }
    
    class DataScalerFactory {
        <<static>>
        +create_handler(strategy): Callable[[pd.DataFrame], pd.DataFrame]
    }
    
    class StationaryReturnsProcessorFactory {
        <<static>>
        +create_handler(strategy): StationaryReturnsProcessor
    }
    
    %% Module-level Functions (actual implementation structure)
    class DataGeneratorModule {
        <<module>>
        +set_random_seed(seed): None
        +generate_price_series(start_date, end_date, anchor_prices, random_seed, correlations): Tuple[Dict, pd.DataFrame]
    }
    
    class DataProcessorModule {
        <<module>>
        +fill_data(df, strategy): pd.DataFrame
        +scale_data(df, method): pd.DataFrame
        +scale_for_garch(df, target_scale): pd.DataFrame
        +stationarize_data(df, method): pd.DataFrame
        +test_stationarity(df, method): Dict
        +log_stationarity(adf_results, p_value_threshold): None
        +price_to_returns(prices): pd.DataFrame
        +prepare_timeseries_data(df): pd.DataFrame
        +calculate_ewma_covariance(series1, series2, lambda_val): pd.Series
        +calculate_ewma_volatility(series, lambda_val): pd.Series
    }
    
    class StatsModelModule {
        <<module>>
        +run_arima(df_stationary, p, d, q, forecast_steps): Tuple[Dict, Dict]
        +run_garch(df_stationary, p, q, dist, forecast_steps): Tuple[Dict, Dict]
        +run_multivariate_garch(df_stationary, arima_fits, garch_fits, lambda_val): Dict
        +calculate_correlation_matrix(standardized_residuals): pd.DataFrame
        +calculate_dynamic_correlation(ewma_cov, ewma_vol1, ewma_vol2): pd.Series
        +construct_covariance_matrix(volatilities, correlation): np.ndarray
        +calculate_portfolio_risk(weights, cov_matrix): Tuple[float, float]
        +calculate_stats(series): Dict
    }

    class SpilloverProcessorModule {
        <<module>>
        +test_granger_causality(series1, series2, max_lag, significance_level): Dict
        +analyze_shock_spillover(residuals1, volatility2, max_lag): Dict
        +run_spillover_analysis(df_stationary, arima_fits, garch_fits, lambda_val, max_lag, significance_level): Dict
    }
    
    class ExportUtilModule {
        <<module>>
        +export_data(data, folder, name): Any
    }
    
    %% Example Scripts
    class ExampleUnivariateGARCH {
        <<script>>
        +main(): None
    }
    
    class ExampleMultivariateGARCH {
        <<script>>
        +main(): None
    }
    
    %% Relationships - Factory patterns
    MissingDataHandlerFactory --> MissingDataHandler: creates
    DataScalerFactory --> DataScaler: creates
    StationaryReturnsProcessorFactory --> StationaryReturnsProcessor: creates
    ModelFactory --> ModelARIMA: creates
    ModelFactory --> ModelGARCH: creates
    ModelFactory --> ModelMultivariateGARCH: creates
    
    %% Module dependencies
    DataProcessorModule --> MissingDataHandler: uses
    DataProcessorModule --> DataScaler: uses
    DataProcessorModule --> StationaryReturnsProcessor: uses
    DataProcessorModule --> MissingDataHandlerFactory: uses
    DataProcessorModule --> DataScalerFactory: uses
    DataProcessorModule --> StationaryReturnsProcessorFactory: uses
    
    StatsModelModule --> ModelARIMA: uses
    StatsModelModule --> ModelGARCH: uses
    StatsModelModule --> ModelMultivariateGARCH: uses
    StatsModelModule --> ModelFactory: uses
    StatsModelModule --> DataProcessorModule: uses
    
    SpilloverProcessorModule --> StatsModelModule: uses
    SpilloverProcessorModule --> DataProcessorModule: uses
    
    %% Example script dependencies
    ExampleUnivariateGARCH --> DataGeneratorModule: uses
    ExampleUnivariateGARCH --> DataProcessorModule: uses
    ExampleUnivariateGARCH --> StatsModelModule: uses
    
    ExampleMultivariateGARCH --> DataGeneratorModule: uses
    ExampleMultivariateGARCH --> DataProcessorModule: uses
    ExampleMultivariateGARCH --> StatsModelModule: uses
    ExampleMultivariateGARCH --> ExportUtilModule: uses
    
    %% Core class usage
    DataGeneratorModule --> PriceSeriesGenerator: uses
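The `calculate_ewma_covariance(series1, series2, lambda_val)` function shown above follows the standard RiskMetrics-style recursion, cov_t = lambda * cov_{t-1} + (1 - lambda) * x_t * y_t. A minimal pandas sketch of that recursion (an illustration of the formula, not the package's actual implementation):

```python
import numpy as np
import pandas as pd

def ewma_covariance(series1: pd.Series, series2: pd.Series,
                    lambda_val: float = 0.94) -> pd.Series:
    """EWMA covariance: cov_t = lambda * cov_{t-1} + (1 - lambda) * x_t * y_t."""
    x = series1.to_numpy()
    y = series2.to_numpy()
    cov = np.empty(len(x))
    cov[0] = x[0] * y[0]  # seed the recursion with the first cross-product
    for t in range(1, len(x)):
        cov[t] = lambda_val * cov[t - 1] + (1 - lambda_val) * x[t] * y[t]
    return pd.Series(cov, index=series1.index)

# Demo on two simulated, positively related return series
rng = np.random.default_rng(0)
r1 = pd.Series(rng.standard_normal(250) * 0.01)
r2 = pd.Series(0.5 * r1 + rng.standard_normal(250) * 0.01)
ewma = ewma_covariance(r1, r2)
```

A higher `lambda_val` (e.g. the RiskMetrics default 0.94 for daily data) puts more weight on history and yields a smoother covariance path; dividing this series by the two EWMA volatilities gives the dynamic correlation used in the DCC workflow.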

CI/CD Process

  • Triggers: Runs when code is pushed to the main or dev branches
  • pytest: Validates code across multiple Python versions and operating systems
  • Building: Creates package distributions and documentation
  • Publishing: Deploys to PyPI, Docker Hub, and ReadTheDocs

flowchart TB
    %% Styling
    classDef person fill:#08427B,color:#fff,stroke:#052E56,stroke-width:1px
    classDef system fill:#1168BD,color:#fff,stroke:#0B4884,stroke-width:1px
    classDef external fill:#999999,color:#fff,stroke:#6B6B6B,stroke-width:1px
    classDef pipeline fill:#ff9900,color:#fff,stroke:#cc7700,stroke-width:1px
    
    %% Actors
    Developer((Developer)):::person
    
    %% Main Systems
    TimeseriesCompute["Timeseries Compute\nPython Package"]:::system
    
    %% Source Control
    GitHub["GitHub\nSource Repository"]:::external
    
    %% CI/CD Pipeline and Tools
    GitHubActions["GitHub Actions\nCI/CD Pipeline"]:::pipeline
    
    %% Distribution Platforms
    PyPI["PyPI Registry"]:::external
    DockerHub["Docker Hub"]:::external
    ReadTheDocs["ReadTheDocs"]:::external
    
    %% Code Quality Services
    Codecov["Codecov\nCode Coverage"]:::external
    
    %% Flow
    Developer -- "Commits code to" --> GitHub
    GitHub -- "Triggers on push\nto main/dev" --> GitHubActions
    
    %% Primary Jobs
    subgraph TestJob["Test Job"]
        Test["Run Tests\nPytest"]:::pipeline
        Lint["Lint with Flake8"]:::pipeline
        
        Lint --> Test
    end
    
    subgraph DockerJob["Docker Job"]
        BuildDocker["Build Docker Image"]:::pipeline
    end
    
    subgraph BuildJob["Build Job"]
        BuildPackage["Build Package\nSDist & Wheel"]:::pipeline
        VerifyPackage["Verify with Twine"]:::pipeline
        
        BuildPackage --> VerifyPackage
    end
    
    subgraph DocsJob["Docs Job"]
        BuildDocs["Generate Docs\nSphinx"]:::pipeline
        BuildUML["Generate UML\nDiagrams"]:::pipeline
        
        BuildDocs --> BuildUML
    end
    
    subgraph PublishJob["Publish Job"]
        PublishPyPI["Publish to PyPI"]:::pipeline
    end
    
    %% Job Dependencies
    GitHubActions --> TestJob
    
    TestJob --> DockerJob
    TestJob --> BuildJob
    TestJob --> DocsJob
    
    BuildJob --> PublishJob
    DocsJob --> PublishJob
    
    %% External Services Connections
    Test -- "Upload Results" --> Codecov
    BuildDocker -- "Push Image" --> DockerHub
    DocsJob -- "Deploy Documentation" --> ReadTheDocs
    PublishPyPI -- "Deploy Package" --> PyPI
    
    %% Final Products
    PyPI --> TimeseriesCompute
    DockerHub --> TimeseriesCompute
    ReadTheDocs -- "Documents" --> TimeseriesCompute

Development

Environment Setup

Option 1 (install from PyPI, recommended):

mkdir timeseries-compute
cd timeseries-compute

# create and activate virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

pip install timeseries-compute

Option 2 (editable install from source):

# clone the repository
git clone https://github.com/garthmortensen/timeseries-compute.git
cd timeseries-compute

# create and activate virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

pip install -e ".[dev]"

Testing

pytest --cov=timeseries_compute

Tag & Publish

Bump the version in pyproject.toml and README.md, then:

git add pyproject.toml README.md
git commit -m "version bump"
git tag v0.2.41
git push && git push --tags

Documentation

Full documentation is available at timeseries-compute.readthedocs.io.
