Tokenizers

Provides an implementation of today's most used tokenizers, with a focus on performance and versatility.

These are bindings over the Rust implementation. If you are interested in the high-level design, you can check it out in the main tokenizers repository.

Otherwise, let's dive in!

Main features:

  • Train new vocabularies and tokenize using 4 pre-made tokenizers (Bert WordPiece and the 3 most common BPE versions).
  • Extremely fast (both training and tokenization), thanks to the Rust implementation. Takes less than 20 seconds to tokenize a GB of text on a server's CPU.
  • Easy to use, but also extremely versatile.
  • Designed for research and production.
  • Normalization comes with alignments tracking. It's always possible to get the part of the original sentence that corresponds to a given token.
  • Does all the pre-processing: truncation, padding, and adding the special tokens your model needs.
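The BPE training mentioned above boils down to repeatedly merging the most frequent adjacent pair of symbols. As a rough illustration (a toy pure-Python sketch, not the library's Rust implementation), one training step looks like this:

```python
# Toy sketch of one BPE training step (illustration only; the real
# library implements this in Rust with many optimizations).
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across a corpus of symbol sequences."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return max(pairs, key=pairs.get)

def merge_pair(words, pair):
    """Replace every occurrence of `pair` with a single merged symbol."""
    a, b = pair
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)   # merge the pair into one symbol
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# "low" occurs 5 times and "lower" twice, each split into characters
corpus = {("l", "o", "w"): 5, ("l", "o", "w", "e", "r"): 2}
best = most_frequent_pair(corpus)       # ("l", "o"), seen 7 times
corpus = merge_pair(corpus, best)
```

A real trainer repeats this loop until the vocabulary reaches the requested size, recording each merge in order.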

Installation

With pip:

pip install tokenizers

From sources:

To use this method, you need to have Rust installed:

# Install with:
curl https://sh.rustup.rs -sSf | sh -s -- -y
export PATH="$HOME/.cargo/bin:$PATH"

Once Rust is installed, you can compile by running the following:

git clone https://github.com/huggingface/tokenizers
cd tokenizers/bindings/python

# Create a virtual env (you can use yours as well)
python -m venv .env
source .env/bin/activate

# Install `tokenizers` in the current virtual env
pip install -e .

Load a pretrained tokenizer from the Hub

from tokenizers import Tokenizer

tokenizer = Tokenizer.from_pretrained("bert-base-cased")

Using the provided Tokenizers

We provide some pre-built tokenizers to cover the most common cases. You can easily load one of these using some vocab.json and merges.txt files:

from tokenizers import CharBPETokenizer

# Initialize a tokenizer
vocab = "./path/to/vocab.json"
merges = "./path/to/merges.txt"
tokenizer = CharBPETokenizer(vocab, merges)

# And then encode:
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.ids)
print(encoded.tokens)
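The alignment tracking mentioned in the features list means each token in the encoding carries the character span it came from, so you can always slice the original text back out. The idea can be sketched in pure Python with a hypothetical whitespace tokenizer (illustration only, not the library's API):

```python
# Pure-Python sketch of offset tracking: each token carries the
# (start, end) character span it came from, so slicing the original
# text with a token's offsets recovers the exact source substring.
def whitespace_tokenize_with_offsets(text):
    tokens, start = [], None
    for i, ch in enumerate(text):
        if ch.isspace():
            if start is not None:
                tokens.append((text[start:i], (start, i)))
                start = None
        elif start is None:
            start = i
    if start is not None:
        tokens.append((text[start:], (start, len(text))))
    return tokens

text = "I can feel the magic"
for tok, (s, e) in whitespace_tokenize_with_offsets(text):
    assert text[s:e] == tok   # offsets always map back to the source
```

The library applies the same principle through every normalization and pre-tokenization step, so the spans stay correct even after lowercasing or unicode normalization changes the text.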

And you can train them just as simply:

from tokenizers import CharBPETokenizer

# Initialize a tokenizer
tokenizer = CharBPETokenizer()

# Then train it!
tokenizer.train([ "./path/to/files/1.txt", "./path/to/files/2.txt" ])

# Now, let's use it:
encoded = tokenizer.encode("I can feel the magic, can you?")

# And finally save it somewhere
tokenizer.save("./path/to/directory/my-bpe.tokenizer.json")

Provided Tokenizers

  • CharBPETokenizer: The original BPE
  • ByteLevelBPETokenizer: The byte level version of the BPE
  • SentencePieceBPETokenizer: A BPE implementation compatible with the one used by SentencePiece
  • BertWordPieceTokenizer: The famous Bert tokenizer, using WordPiece

All of these can be used and trained as explained above!
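The WordPiece model used by BertWordPieceTokenizer differs from BPE at inference time: instead of replaying merges, it segments each word by greedy longest-prefix matching against the vocabulary, prefixing continuation pieces with "##". A toy sketch of that matching loop (illustration only, with a made-up vocabulary):

```python
# Toy greedy longest-match WordPiece segmentation (illustration only).
def wordpiece(word, vocab, unk="[UNK]"):
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        while end > start:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub       # continuation pieces get the ## prefix
            if sub in vocab:
                piece = sub            # longest matching prefix wins
                break
            end -= 1
        if piece is None:
            return [unk]               # no matching prefix: whole word unknown
        pieces.append(piece)
        start = end
    return pieces

vocab = {"play", "##ing", "##ed", "un", "##play"}
print(wordpiece("playing", vocab))     # ['play', '##ing']
```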

Build your own

Whenever these provided tokenizers don't give you enough freedom, you can build your own tokenizer by putting all the different parts you need together. You can check how we implemented the provided tokenizers and easily adapt them to your own needs.

Building a byte-level BPE

Here is an example showing how to build your own byte-level BPE by putting all the different pieces together, and then saving it to a single file:

from tokenizers import Tokenizer, models, pre_tokenizers, decoders, trainers, processors

# Initialize a tokenizer
tokenizer = Tokenizer(models.BPE())

# Customize pre-tokenization and decoding
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=True)
tokenizer.decoder = decoders.ByteLevel()
tokenizer.post_processor = processors.ByteLevel(trim_offsets=True)

# And then train
trainer = trainers.BpeTrainer(
    vocab_size=20000,
    min_frequency=2,
    initial_alphabet=pre_tokenizers.ByteLevel.alphabet()
)
tokenizer.train([
    "./path/to/dataset/1.txt",
    "./path/to/dataset/2.txt",
    "./path/to/dataset/3.txt"
], trainer=trainer)

# And Save it
tokenizer.save("byte-level-bpe.tokenizer.json", pretty=True)

Now, when you want to use this tokenizer, this is as simple as:

from tokenizers import Tokenizer

tokenizer = Tokenizer.from_file("byte-level-bpe.tokenizer.json")

encoded = tokenizer.encode("I can feel the magic, can you?")
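The saved file is a single JSON document describing the whole pipeline. A trimmed sketch of its top-level layout (field names as seen in typical tokenizer.json files; exact contents vary by version and configuration):

```json
{
  "version": "1.0",
  "normalizer": null,
  "pre_tokenizer": { "type": "ByteLevel", "add_prefix_space": true },
  "post_processor": { "type": "ByteLevel", "trim_offsets": true },
  "decoder": { "type": "ByteLevel" },
  "model": {
    "type": "BPE",
    "vocab": { "<token>": 0 },
    "merges": []
  }
}
```

Because everything lives in one file, shipping a trained tokenizer is just copying this JSON alongside your model.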

Typing support and stub.py

The compiled PyO3 extension does not expose type annotations, so editors and type checkers would otherwise see most objects as Any. The stub.py helper walks the loaded extension modules, renders .pyi stub files (plus minimal forwarding __init__.py shims), and formats them so that tools like mypy/pyright can understand the public API. Run python stub.py whenever you change the Python-visible surface to keep the generated stubs in sync.
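The core idea behind such a generator can be sketched with the standard-library inspect module (a hypothetical minimal helper, not the actual stub.py, which handles classes, methods, and docstrings as well):

```python
# Minimal sketch of the stub-generation idea: introspect a module and
# render a .pyi-style declaration for each public function.
# (Hypothetical helper; the real stub.py does much more.)
import inspect

def render_stub(module):
    lines = []
    for name, obj in sorted(vars(module).items()):
        if name.startswith("_"):
            continue                       # skip private names
        if inspect.isfunction(obj) or inspect.isbuiltin(obj):
            try:
                sig = str(inspect.signature(obj))
            except (ValueError, TypeError):
                sig = "(*args, **kwargs)"  # C functions may hide their signature
            lines.append(f"def {name}{sig}: ...")
    return "\n".join(lines)

import json
print(render_stub(json))   # .pyi-style lines for json's public functions
```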
