

APEX: Alloy Property EXplorer using simulations

APEX (Alloy Property EXplorer using simulations) is a component of the AI Square project that restructures the DP-Gen auto_test module into a versatile and extensible Python package for general alloy property testing. The package enables users to conveniently establish a wide range of property-test workflows using various computational approaches, with support for LAMMPS, VASP, and ABACUS.

New Features Update (v1.0.0)

  • Decouple property calculations into individual sub-workflows to facilitate the customization of complex property calculations
  • Support one-click parallel submission of multiple workflows (see the sketch after this list)
  • Support the Run step in single-step test mode (interaction similar to auto_test)
  • Allow users to adjust the concurrency of task submission via group_size and pool_size
  • Allow users to customize the suffix of property calculation directories, so that multiple tests with identical property templates but different settings can run within one workflow
  • Refactor and optimize the command-line interaction
  • Enhance robustness across diverse use scenarios, especially for the local debug mode
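
For instance, the one-click parallel submission maps directly onto the command line; a minimal sketch, where the parameter file and the two work-directory names are illustrative:

apex submit param_joint.json -w dp_demo eam_demo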

1. Overview

APEX inherits the functionality of the second-generation alloy property calculations and is developed on the dflow framework. By integrating the benefits of cloud-native workflows, APEX streamlines the intricate procedure of automatically testing various configurations and properties. Owing to its cloud-native nature, APEX provides a more intuitive and user-friendly interaction, improving the overall user experience by freeing users from concerns about process control, task scheduling, observability, and disaster tolerance.

The comprehensive architecture of APEX is demonstrated below:

Figure 1. APEX schematic diagram

APEX consists of three types of pre-defined workflows that users can submit: relaxation, property, and joint. The relaxation and property workflows each comprise three sequential steps: Make, Run, and Post, while the joint workflow essentially combines the relaxation and property workflows into one comprehensive workflow.

The relaxation process begins with the initial POSCAR supplied by the user, which is used to generate crucial data such as the final relaxed structure and its corresponding energy, forces, and virial tensor. This equilibrium state information is essential for input into the property workflow, enabling further calculations of alloy properties. Upon completion, the final results are automatically retrieved and downloaded to the original working directory.

In both the relaxation and property workflows, the Make step prepares the corresponding computational tasks. These tasks are then transferred to the Run step that is responsible for task dispatch, calculation monitoring, and retrieval of completed tasks (implemented through the DPDispatcher plugin). Upon completion of all tasks, the Post step is initiated to collect data and obtain the desired property results.

APEX currently offers computation methods for the following alloy properties:

  • Equation of State (EOS)
  • Elastic constants
  • Surface energy
  • Interstitial formation energy
  • Vacancy formation energy
  • Generalized stacking fault energy (Gamma line)

Moreover, APEX supports three types of calculators: LAMMPS for molecular dynamics simulations, and VASP and ABACUS for first-principles calculations.

2. Easy Install

Install easily via pip:

pip install apex-flow

Alternatively, clone the repository first:

git clone https://github.com/deepmodeling/APEX.git

then install APEX by

cd APEX
pip install .
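
To confirm that the installation succeeded, print the command-line help:

apex -h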

3. User Guide

3.1. Before submission

In APEX, three vital elements need to be prepared before submitting a workflow:

  • A global JSON file containing parameters that configure dflow and other global settings (default: "./global.json")
  • A calculation JSON file containing parameters related to the calculation (relaxation and property tests)
  • A work directory containing the necessary files indicated in the above JSON files, together with the initial structures (default: "./")

3.1.1. Global Setting

The global configuration, dflow, and DPDispatcher settings must be stored in a JSON file. The tables below describe the crucial keywords, classified into several categories:

  • Basic config

    | Key word | Data structure | Default | Description |
    | --- | --- | --- | --- |
    | apex_image_name | String | zhuoyli/apex_amd64 | Image for steps other than Run. One can build this Docker image via the prepared Dockerfile. |
    | run_image_name | String | None | Image of the calculator for the Run step. Use {calculator}_image_name to indicate the corresponding image with higher priority. |
    | run_command | String | None | Shell command for the Run step. Use {calculator}_run_command to indicate the corresponding command with higher priority. |
    | group_size | Int | 1 | Number of tasks per parallel run group |
    | pool_size | Int | 1 | For multiple tasks per parallel group, the size of the multiprocessing pool that handles each task (1 for serial, -1 for infinity) |
    | upload_python_package | Optional[List] | None | Extra Python packages needed in the container |
    | debug_pool_workers | Int | 1 | Pool size of parallel tasks running in debug mode |

  • Dflow config

    | Key word | Data structure | Default | Description |
    | --- | --- | --- | --- |
    | dflow_host | String | https://127.0.0.1:2746 | URL of the dflow server |
    | k8s_api_server | String | https://127.0.0.1:2746 | URL of the Kubernetes API server |
    | dflow_config | Optional[Dict] | None | More detailed dflow config in a nested dictionary, with higher priority (see the dflow documentation for details) |
    | dflow_s3_config | Optional[Dict] | None | dflow S3 repository config in a nested dictionary, with higher priority (see the dflow documentation for details) |

  • Dispatcher config (one may refer to DPDispatcher's documentation for details of the following parameters)

    | Key word | Data structure | Default | Description |
    | --- | --- | --- | --- |
    | context_type | String | None | Context type for connecting to the remote server |
    | batch_type | String | None | Batch system that dispatches tasks |
    | local_root | String | "./" | Local root path |
    | remote_root | String | None | Remote root path |
    | remote_host | String | None | Remote host address |
    | remote_username | String | None | Remote user name |
    | remote_password | String | None | Remote user password |
    | port | Int | 22 | Remote port |
    | machine | Optional[Dict] | None | Complete machine settings dictionary as defined in DPDispatcher, with higher priority |
    | resources | Optional[Dict] | None | Complete resources settings dictionary as defined in DPDispatcher, with higher priority |
    | task | Optional[Dict] | None | Complete task settings dictionary as defined in DPDispatcher, with higher priority |

  • Bohrium (additional dispatcher config to be specified when you want to quickly adopt the pre-built dflow service or scientific computing resources on the Bohrium platform)

    | Key word | Data structure | Default | Description |
    | --- | --- | --- | --- |
    | email | String | None | Email of your Bohrium account |
    | phone | String | None | Phone number of your Bohrium account |
    | password | String | None | Password of your Bohrium account |
    | program_id | Int | None | Program ID of your Bohrium account |
    | scass_type | String | None | Node type provided by Bohrium |
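
As a concrete illustration of the image- and command-priority rule in the basic config, here is a minimal sketch of that portion of a global JSON (all image names and commands are placeholders); lammps_image_name and lammps_run_command take precedence over run_image_name and run_command for LAMMPS Run steps:

{
    "apex_image_name": "zhuoyli/apex_amd64",
    "run_image_name": "calculator/general:latest",
    "run_command": "generic_run_command",
    "lammps_image_name": "lammps/calculator:latest",
    "lammps_run_command": "lmp -in in.lammps",
    "group_size": 2,
    "pool_size": 1
}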

Please refer to the Quick Start section for examples of global JSON files in different situations.

3.1.2. Calculation Parameters

The method for indicating parameters in alloy property calculations is akin to the previous dpgen.autotest approach. There are three categories of JSON files that determine the parameters to be passed to APEX, based on their contents.

Categories of calculation parameter files:

| Type | File format | Dictionaries contained | Usage |
| --- | --- | --- | --- |
| Relaxation | json | structures; interaction; relaxation | For the relaxation workflow |
| Property | json | structures; interaction; properties | For the property workflow |
| Joint | json | structures; interaction; relaxation; properties | For the relaxation, property, and joint workflows |

It should be noted that files such as POSCAR, located within the structure directory, or any other files specified within the JSON file, should be given as paths relative to the working directory and prepared in advance.

Below are three examples (for detailed explanations of each parameter, please refer to the Hands-on_auto-test documentation):

  • Relaxation parameter file
    {
      "structures":            ["confs/std-*"],
      "interaction": {
              "type":           "deepmd",
              "model":          "frozen_model.pb",
              "type_map":       {"Mo": 0}
        },
      "relaxation": {
              "cal_setting":   {"etol":       0,
                                "ftol":     1e-10,
                                "maxiter":   5000,
                                "maximal":  500000}
        }
    }
    
  • Property parameter file
    {
      "structures":    ["confs/std-*"],
      "interaction": {
          "type":          "deepmd",
          "model":         "frozen_model.pb",
          "type_map":      {"Mo": 0}
      },
      "properties": [
          {
            "type":         "eos",
            "skip":         false,
            "vol_start":    0.6,
            "vol_end":      1.4,
            "vol_step":     0.1,
            "cal_setting":  {"etol": 0,
                            "ftol": 1e-10}
          },
          {
            "type":         "elastic",
            "skip":         false,
            "norm_deform":  1e-2,
            "shear_deform": 1e-2,
            "cal_setting":  {"etol": 0,
                            "ftol": 1e-10}
          }
          ]
    }
    
  • Joint parameter file
    {
      "structures":            ["confs/std-*"],
      "interaction": {
            "type":           "deepmd",
            "model":          "frozen_model.pb",
            "type_map":       {"Mo": 0}
        },
      "relaxation": {
              "cal_setting":   {"etol":       0,
                              "ftol":     1e-10,
                              "maxiter":   5000,
                              "maximal":  500000}
        },
      "properties": [
        {
          "type":         "eos",
          "skip":         false,
          "vol_start":    0.6,
          "vol_end":      1.4,
          "vol_step":     0.1,
          "cal_setting":  {"etol": 0,
                          "ftol": 1e-10}
        },
        {
          "type":         "elastic",
          "skip":         false,
          "norm_deform":  1e-2,
          "shear_deform": 1e-2,
          "cal_setting":  {"etol": 0,
                          "ftol": 1e-10}
        }
        ]
    }
    
3.1.2.1. EOS

| Key word | Data structure | Example | Description |
| --- | --- | --- | --- |
| vol_start | Float | 0.9 | The starting volume relative to the equilibrium structure |
| vol_end | Float | 1.1 | The maximum volume relative to the equilibrium structure |
| vol_step | Float | 0.01 | The volume increment relative to the equilibrium structure |

3.1.2.2. Elastic

| Key word | Data structure | Example | Description |
| --- | --- | --- | --- |
| norm_deform | Float | 1e-2 | Magnitude of the normal deformation applied to the equilibrium structure |
| shear_deform | Float | 1e-2 | Magnitude of the shear deformation applied to the equilibrium structure |

3.1.2.3. Surface

| Key word | Data structure | Example | Description |
| --- | --- | --- | --- |
| min_slab_size | Int | 10 | Minimum slab thickness |
| min_vacuum_size | Int | 11 | Minimum vacuum width |
| pert_xz | Float | 0.01 | Perturbation along the xz direction used to compute the surface energy (default = 0.01) |
| max_miller | Int | 2 | The maximum Miller index of the generated surfaces |

3.1.2.4. Vacancy

| Key word | Data structure | Example | Description |
| --- | --- | --- | --- |
| supercell | List[Int] | [3, 3, 3] | The supercell to be constructed (default = [1, 1, 1]) |

3.1.2.5. Interstitial

| Key word | Data structure | Example | Description |
| --- | --- | --- | --- |
| insert_ele | List[String] | ["Al"] | The element to be inserted |
| supercell | List[Int] | [3, 3, 3] | The supercell to be constructed (default = [1, 1, 1]) |
| conf_filters | Dict | "min_dist": 1.5 | Filters out undesirable configurations |
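
These keys enter the properties list of the calculation JSON just like the eos and elastic examples above; a hedged sketch, assuming the type tags follow the auto_test naming convention (surface, vacancy, interstitial):

{
  "properties": [
      {
        "type":            "surface",
        "min_slab_size":   10,
        "min_vacuum_size": 11,
        "max_miller":      2
      },
      {
        "type":       "vacancy",
        "supercell":  [3, 3, 3]
      },
      {
        "type":         "interstitial",
        "insert_ele":   ["Al"],
        "supercell":    [3, 3, 3],
        "conf_filters": {"min_dist": 1.5}
      }
  ]
}
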
3.1.2.6. Gamma Line
Figure 2. Schematic diagram of Gamma line calculation

The Gamma line (generalized stacking fault energy) function of APEX calculates the energies of a series of slab structures of a specific crystal plane, displaced in the middle along a slip vector as illustrated in Figure 2. In APEX, the slab structures are defined by a plane Miller index and two orthogonal directions (primary and secondary) within that plane. The slip vector always lies along the primary direction, with the slip length defined by the user or by default settings. Thus, a slip system can be defined by indicating plane_miller and slip_direction (i.e., the primary direction).

For the most common slip systems of the FCC, BCC, and HCP crystal structures, the slip direction, secondary direction, and default fractional slip lengths are already documented and listed below. Users are strongly advised to follow these pre-defined slip systems, or else to double-check the generated slab structures, as unexpected results may occur, especially for systems like HCP:

  • FCC

    | Plane Miller index | Slip direction | Secondary direction | Default slip length |
    | --- | --- | --- | --- |
    | $(001)$ | $[100]$ | $[010]$ | $a$ |
    | $(110)$ | $[\bar{1}10]$ | $[001]$ | $\sqrt{2}a$ |
    | $(111)$ | $[11\bar{2}]$ | $[\bar{1}10]$ | $\sqrt{6}a$ |
    | $(111)$ | $[\bar{1}\bar{1}2]$ | $[1\bar{1}0]$ | $\sqrt{6}a$ |
    | $(111)$ | $[\bar{1}10]$ | $[\bar{1}\bar{1}2]$ | $\sqrt{2}a$ |
    | $(111)$ | $[1\bar{1}0]$ | $[11\bar{2}]$ | $\sqrt{2}a$ |
  • BCC

    | Plane Miller index | Slip direction | Secondary direction | Default slip length |
    | --- | --- | --- | --- |
    | $(001)$ | $[100]$ | $[010]$ | $a$ |
    | $(111)$ | $[\bar{1}10]$ | $[\bar{1}\bar{1}2]$ | $\frac{\sqrt{2}}{2}a$ |
    | $(110)$ | $[\bar{1}11]$ | $[00\bar{1}]$ | $\frac{\sqrt{3}}{2}a$ |
    | $(110)$ | $[1\bar{1}\bar{1}]$ | $[001]$ | $\frac{\sqrt{3}}{2}a$ |
    | $(112)$ | $[11\bar{1}]$ | $[\bar{1}10]$ | $\frac{\sqrt{3}}{2}a$ |
    | $(112)$ | $[\bar{1}\bar{1}1]$ | $[1\bar{1}0]$ | $\frac{\sqrt{3}}{2}a$ |
    | $(123)$ | $[11\bar{1}]$ | $[\bar{2}10]$ | $\frac{\sqrt{3}}{2}a$ |
    | $(123)$ | $[\bar{1}\bar{1}1]$ | $[2\bar{1}0]$ | $\frac{\sqrt{3}}{2}a$ |
  • HCP (Miller-Bravais indices)

    | Plane Miller index | Slip direction | Secondary direction | Default slip length |
    | --- | --- | --- | --- |
    | $(0001)$ | $[2\bar{1}\bar{1}0]$ | $[01\bar{1}0]$ | $a$ |
    | $(0001)$ | $[1\bar{1}00]$ | $[01\bar{1}0]$ | $\sqrt{3}a$ |
    | $(0001)$ | $[10\bar{1}0]$ | $[01\bar{1}0]$ | $\sqrt{3}a$ |
    | $(01\bar{1}0)$ | $[\bar{2}110]$ | $[000\bar{1}]$ | $a$ |
    | $(01\bar{1}0)$ | $[0001]$ | $[\bar{2}110]$ | $c$ |
    | $(01\bar{1}0)$ | $[\bar{2}113]$ | $[000\bar{1}]$ | $\sqrt{a^2+c^2}$ |
    | $(\bar{1}2\bar{1}0)$ | $[\bar{1}010]$ | $[000\bar{1}]$ | $\sqrt{3}a$ |
    | $(\bar{1}2\bar{1}0)$ | $[0001]$ | $[\bar{1}010]$ | $c$ |
    | $(01\bar{1}1)$ | $[\bar{2}110]$ | $[\bar{1}2\bar{1}\bar{3}]$ | $a$ |
    | $(01\bar{1}1)$ | $[\bar{1}2\bar{1}\bar{3}]$ | $[2\bar{1}\bar{1}0]$ | $\sqrt{a^2+c^2}$ |
    | $(01\bar{1}1)$ | $[0\bar{1}12]$ | $[\bar{1}2\bar{1}\bar{3}]$ | $\sqrt{3a^2+4c^2}$ |
    | $(\bar{1}2\bar{1}2)$ | $[10\bar{1}0]$ | $[1\bar{2}13]$ | $\sqrt{3}a$ |
    | $(\bar{1}2\bar{1}2)$ | $[1\bar{2}13]$ | $[\bar{1}010]$ | $\sqrt{a^2+c^2}$ |

The parameters related to the Gamma line calculation are listed below:

| Key word | Data structure | Default | Description |
| --- | --- | --- | --- |
| plane_miller | Sequence[Int] | None | Miller index of the target slab |
| slip_direction | Sequence[Int] | None | Miller index of the slip (primary) direction of the slab |
| slip_length | Int or Float; Sequence[Int or Float, Int or Float, Int or Float] | Refer to the specific slip system in the tables above, or 1 if not indicated | Slip length along the primary direction, in the unit set by the user or by default. For the format [x, y, z], the length equals $\sqrt{(xa)^2+(yb)^2+(zc)^2}$ |
| plane_shift | Int or Float | 0 | Shift of the displacement plane in units of the lattice parameter $c$ (positive for upwards). This allows creating a slip plane within narrowly-spaced planes (see ref). |
| n_steps | Int | 10 | Number of steps to displace the slab along the slip vector |
| vacuum_size | Int or Float | 0 | Thickness of the vacuum layer added around the slab, in Angstroms |
| supercell_size | Sequence[Int, Int, Int] | [1, 1, 5] | Size of the supercell generated from the slab structure |
| add_fix | Sequence[Str, Str, Str] | ["true", "true", "false"] | Whether to add a fixed-position constraint along the x, y, and z directions during the calculation |

Here is an example:

{
    "type":            "gamma",
    "skip":            true,
    "plane_miller":    [0, 0, 1],
    "slip_direction":  [1, 0, 0],
    "hcp": {
        "plane_miller":    [0, 1, -1, 1],
        "slip_direction":  [-2, 1, 1, 0],
        "slip_length":     [1, 0, 1],
        "plane_shift":     0.25
    },
    "supercell_size":  [1, 1, 6],
    "vacuum_size":     10,
    "add_fix":         ["true", "true", "false"],
    "n_steps":         10
}

It should be noted that, for various crystal structures, users can further define slip parameters within the respective nested dictionaries, which are prioritized for adoption. In the above example, the slip-system configuration within the "hcp" dictionary will be used.

3.2. Command

APEX currently supports two separate run modes: workflow submission (running via dflow) and single-step test (running without dflow).

3.2.1. Workflow Submission

APEX executes a dflow workflow upon each invocation of a command in the format apex submit [-h] [-c [CONFIG]] [-w WORK [WORK ...]] [-d] [-f {relax,props,joint}] parameter [parameter ...]. The workflow type and calculation method are automatically determined by APEX from the parameter files provided by the user. Additionally, users can specify the workflow type, the configuration JSON file, and the work directories through the optional arguments (run apex submit -h for help). Here is an example of submitting a joint workflow:

apex submit param_relax.json param_props.json -c ./global_bohrium.json -w 'dp_demo_0?' 'eam_demo'

If no config JSON or work directory is specified, ./global.json and ./ are passed as the default values, respectively.
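
Because the workflow type is inferred from the parameter file, the shortest submission relies on these defaults alone; for example, with ./global.json and the initial structures already in the current directory:

apex submit param_joint.json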

3.2.2. Single-Step Test

APEX also provides a single-step test mode, which can run the Make, Run, and Post steps individually in a local environment. Please note that in this mode, commands must be run from within the work directory. Users can invoke the steps in the format apex test [-h] [-m [MACHINE]] parameter {make_relax,run_relax,post_relax,make_props,run_props,post_props} (run apex test -h for help). Here is an example of doing a relaxation in this mode:

  1. Firstly, generate relaxation tasks by
    apex test param_relax.json make_relax
    
  2. Then dispatch tasks by
    apex test param_relax.json run_relax -m machine.json
    
where machine.json is a JSON file defining the dispatch method, containing the machine, resources, and task dictionaries, as well as run_command, as listed in DPDispatcher's documentation. Here is an example of submitting tasks to a Slurm-managed remote HPC:
     {
       "run_command": "lmp -i in.lammps -v restart 0",
       "machine": {
           "batch_type": "Slurm",
           "context_type": "SSHContext",
           "local_root" : "./",
           "remote_root": "/hpc/home/hku/zyl/Downloads/remote_tasks",
           "remote_profile":{
               "hostname": "***.**.**.**",
               "username": "USERNAME",
               "password": "PASSWD",
               "port": 22,
               "timeout": 10
           }
       },
       "resources":{
           "number_node": 1,
           "cpu_per_node": 4,
           "gpu_per_node": 0,
           "queue_name": "apex_test",
           "group_size": 1,
           "module_list": ["deepmd-kit/2.1.0/cpu_binary_release"],
           "custom_flags": [
                 "#SBATCH --partition=xlong",
                 "#SBATCH --ntasks=1",
                 "#SBATCH --mem=10G",
                 "#SBATCH --nodes=1",
                 "#SBATCH --time=1-00:00:00"
           ]
       }
     }
    
  3. Finally, once all tasks are finished, post-process by
    apex test param_relax.json post_relax
    

The property test follows a similar sequence, as sketched below.
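
A minimal sketch of the property counterpart, assuming the same machine.json and a prepared property parameter file:

apex test param_props.json make_props
apex test param_props.json run_props -m machine.json
apex test param_props.json post_props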

4. Quick Start

We present several case studies as introductory illustrations of APEX, tailored to distinct user scenarios. For our demonstration, we will utilize a LAMMPS_example to compute the Equation of State (EOS) and elastic constants of molybdenum in both Body-Centered Cubic (BCC) and Face-Centered Cubic (FCC) phases. To begin, we will examine the files prepared within the working directory for this specific case.

lammps_demo
├── confs
│   ├── std-bcc
│   │   └── POSCAR
│   └── std-fcc
│       └── POSCAR
├── frozen_model.pb
├── global_bohrium.json
├── global_hpc.json
├── param_joint.json
├── param_props.json
└── param_relax.json

There are three types of parameter files and two types of global config files, as well as a force-field potential file of molybdenum, frozen_model.pb. Under the confs directory, the structure files POSCAR for both phases have been prepared.

4.1. On the Bohrium Platform

The most efficient way to submit an APEX workflow is through the preconfigured dflow execution environment on the Bohrium platform. To do this, it may be necessary to create an account on Bohrium. Below is an example of a global.json file for this approach.

{
    "dflow_host": "https://workflows.deepmodeling.com",
    "k8s_api_server": "https://workflows.deepmodeling.com",
    "batch_type": "Bohrium",
    "context_type": "Bohrium",
    "email": "YOUR_EMAIL",
    "password": "YOUR_PASSWD",
    "program_id": 1234,
    "apex_image_name":"registry.dp.tech/dptech/prod-11045/apex-dependencies:0.0.3",
    "lammps_image_name": "registry.dp.tech/dptech/prod-11045/deepmd-kit:deepmd-kit2.1.1_cuda11.6_gpu",
    "lammps_run_command":"lmp -in in.lammps",
    "scass_type":"c8_m31_1 * NVIDIA T4"
}

Then, one can submit a relaxation workflow via:

apex submit param_relax.json -c global_bohrium.json

Remember to replace the values of email, password, and program_id with your own before submitting. As for the images used, you can either build your own or use public images from Bohrium or Docker Hub. Once the workflow is submitted, it can be monitored at https://workflows.deepmodeling.com.

4.2. In a Local Argo Service

Additionally, a dflow environment can be installed on a local computer by executing the installation scripts located in the dflow repository (users can also refer to the dflow service setup manual for more details). For instance, to install on a Linux system without root access:

bash install-linux-cn.sh

This process automatically configures the required local tools, including Docker, Minikube, and the Argo service, listening on 127.0.0.1:2746 by default. Consequently, one can modify the global_hpc.json file to submit a workflow to this local service without needing a Bohrium account. Here is an example:

{
    "apex_image_name":"zhuoyli/apex_amd64",
    "run_image_name": "zhuoyli/apex_amd64",
    "run_command":"lmp -in in.lammps",
    "batch_type": "Slurm",
    "context_type": "SSHContext",
    "local_root" : "./",
    "remote_root": "/hpc/home/zyl/Downloads/remote_tasks",
    "remote_host": "123.12.12.12",
    "remote_username": "USERNAME",
    "remote_password": "PASSWD",
    "resources":{
        "number_node": 1,
        "cpu_per_node": 4,
        "gpu_per_node": 0,
        "queue_name": "apex_test",
        "group_size": 1,
        "module_list": ["deepmd-kit/2.1.0/cpu_binary_release"],
        "custom_flags": [
            "#SBATCH --partition=xlong",
            "#SBATCH --ntasks=4",
            "#SBATCH --mem=10G",
            "#SBATCH --nodes=1",
            "#SBATCH --time=1-00:00:00"
            ]
       }
}

In this example, we attempt to distribute tasks to a remote node managed by Slurm. Users can replace the relevant parameters within the machine dictionary or specify resources and tasks according to DPDispatcher rules.
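
For instance, the same dispatch target can instead be expressed through the nested machine dictionary, which takes priority over the flat keys (the host and paths are placeholders):

{
    "apex_image_name": "zhuoyli/apex_amd64",
    "run_image_name": "zhuoyli/apex_amd64",
    "run_command": "lmp -in in.lammps",
    "machine": {
        "batch_type": "Slurm",
        "context_type": "SSHContext",
        "local_root": "./",
        "remote_root": "/path/to/remote_tasks",
        "remote_profile": {
            "hostname": "123.12.12.12",
            "username": "USERNAME",
            "password": "PASSWD",
            "port": 22
        }
    }
}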

The APEX image is publicly available on Docker Hub and can be pulled automatically. Users may also pull the image beforehand, or build their own Docker image locally in the Minikube environment from a Dockerfile (please refer to Docker's documentation for build instructions), to expedite pod initialization.

Then, one can submit a relaxation workflow via:

apex submit param_relax.json -c global_hpc.json

Upon submission of the workflow, progress can be monitored at https://127.0.0.1:2746.

4.3. In a Local Environment

If your local computer has difficulty connecting to the internet, APEX offers a local debug mode that allows the workflow to run in a basic Python3 environment, independent of the Docker container. However, users will not be able to monitor the workflow through the Argo UI.

To enable this feature, add the optional argument -d to the original submission command, as demonstrated below:

apex submit -d param_relax.json -c global_hpc.json

In this approach, the user is not required to specify an image for executing APEX. Rather, APEX should be pre-installed in the default Python3 environment to ensure proper functioning.
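
For example, preparing for the debug mode amounts to installing APEX into the interpreter that will run the flow:

python3 -m pip install apex-flow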

