
Fast and Customizable Tokenizers

Tokenizers

Provides an implementation of today's most used tokenizers, with a focus on performance and versatility.

These are Python bindings over the Rust implementation. If you are interested in the high-level design, take a look at the Rust implementation itself.

Otherwise, let's dive in!

Main features:

  • Train new vocabularies and tokenize using four pre-made tokenizers (BERT's WordPiece and the three most common BPE versions).
  • Extremely fast (both training and tokenization), thanks to the Rust implementation: tokenizing a GB of text takes less than 20 seconds on a server's CPU.
  • Easy to use, but also extremely versatile.
  • Designed for both research and production.
  • Normalization comes with alignment tracking: it is always possible to recover the part of the original sentence that corresponds to a given token.
  • Does all the pre-processing: truncate, pad, and add the special tokens your model needs.
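To illustrate the alignment tracking mentioned above: every token in an encoding carries character offsets into the original input, so slicing the input with them recovers the exact source span. A minimal pure-Python sketch (the offsets below are hand-written for illustration, not produced by the library):

```python
# Each token keeps (start, end) character offsets into the original text,
# so the source span behind any token can always be recovered by slicing.
# These offsets are illustrative, not library output.
text = "I can feel the magic, can you?"
offsets = [(0, 1), (2, 5), (6, 10), (11, 14), (15, 20)]

spans = [text[start:end] for start, end in offsets]
assert spans == ["I", "can", "feel", "the", "magic"]
```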

Installation

With pip:

pip install tokenizers

From sources:

To use this method, you need to have Rust installed:

# Install with:
curl https://sh.rustup.rs -sSf | sh -s -- -y
export PATH="$HOME/.cargo/bin:$PATH"

Once Rust is installed, you can compile the bindings as follows:

git clone https://github.com/huggingface/tokenizers
cd tokenizers/bindings/python

# Create a virtual env (you can use yours as well)
python -m venv .env
source .env/bin/activate

# Install `tokenizers` in the current virtual env
pip install setuptools_rust
python setup.py install

Using the provided Tokenizers

Using a pre-trained tokenizer is really simple:

from tokenizers import CharBPETokenizer

# Initialize a tokenizer
vocab = "./path/to/vocab.json"
merges = "./path/to/merges.txt"
tokenizer = CharBPETokenizer(vocab, merges)

# And then encode:
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.ids)
print(encoded.tokens)

And you can train yours just as simply:

from tokenizers import CharBPETokenizer

# Initialize a tokenizer
tokenizer = CharBPETokenizer()

# Then train it!
tokenizer.train([ "./path/to/files/1.txt", "./path/to/files/2.txt" ])

# And you can use it
encoded = tokenizer.encode("I can feel the magic, can you?")

# And finally save it somewhere
tokenizer.save("./path/to/directory", "my-bpe")

Provided Tokenizers

  • CharBPETokenizer: The original BPE
  • ByteLevelBPETokenizer: The byte level version of the BPE
  • SentencePieceBPETokenizer: A BPE implementation compatible with the one used by SentencePiece
  • BertWordPieceTokenizer: The famous Bert tokenizer, using WordPiece

All of these can be used and trained as explained above!
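As a rough intuition for the WordPiece model behind BertWordPieceTokenizer: each word is split greedily into the longest matching vocabulary entries, with non-initial pieces prefixed by `##`. A toy pure-Python sketch with a hand-written vocabulary (not the library's implementation):

```python
def toy_wordpiece(word, vocab):
    """Greedy longest-match-first WordPiece split (toy version)."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        while end > start:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # continuation pieces get the ## prefix
            if piece in vocab:
                pieces.append(piece)
                break
            end -= 1  # no match: try a shorter piece
        else:
            return ["[UNK]"]  # nothing in the vocabulary matched
        start = end
    return pieces

vocab = {"un", "##aff", "##able", "##a", "##ff"}
pieces = toy_wordpiece("unaffable", vocab)
```

The real model also handles lowercasing, unicode normalization, and a maximum characters-per-word cutoff; the sketch only shows the greedy matching idea.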

Build your own

You can also easily build your own tokenizer by putting together all the different parts you need:

Use a pre-trained tokenizer

from tokenizers import Tokenizer, models, pre_tokenizers, decoders, processors

# Load a BPE Model
vocab = "./path/to/vocab.json"
merges = "./path/to/merges.txt"
bpe = models.BPE.from_files(vocab, merges)

# Initialize a tokenizer
tokenizer = Tokenizer(bpe)

# Customize pre-tokenization and decoding
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=True)
tokenizer.decoder = decoders.ByteLevel()
tokenizer.post_processor = processors.ByteLevel(trim_offsets=True)

# And then encode:
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.ids)
print(encoded.tokens)

# Or tokenize multiple sentences at once:
encoded = tokenizer.encode_batch([
	"I can feel the magic, can you?",
	"The quick brown fox jumps over the lazy dog"
])
print(encoded)
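The ByteLevel components configured above operate on the UTF-8 bytes of the input rather than on characters, so a base alphabet of just 256 symbols covers any string and unknown tokens never occur. A minimal pure-Python sketch of the idea (the byte-to-character mapping here is a simplified stand-in, not the one the library uses):

```python
# Map every UTF-8 byte to a distinct visible character and back.
# Stand-in mapping for illustration: byte b -> chr(0x100 + b).
def to_byte_symbols(text: str) -> list:
    return [chr(0x100 + b) for b in text.encode("utf-8")]

def from_byte_symbols(symbols: list) -> str:
    return bytes(ord(s) - 0x100 for s in symbols).decode("utf-8")

# Any input round-trips, even characters outside the training data.
symbols = to_byte_symbols("magic ✨")
assert from_byte_symbols(symbols) == "magic ✨"
```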

Train a new tokenizer

from tokenizers import Tokenizer, models, pre_tokenizers, decoders, trainers, processors

# Initialize a tokenizer
tokenizer = Tokenizer(models.BPE.empty())

# Customize pre-tokenization and decoding
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=True)
tokenizer.decoder = decoders.ByteLevel()
tokenizer.post_processor = processors.ByteLevel(trim_offsets=True)

# And then train
trainer = trainers.BpeTrainer(vocab_size=20000, min_frequency=2)
tokenizer.train(trainer, [
	"./path/to/dataset/1.txt",
	"./path/to/dataset/2.txt",
	"./path/to/dataset/3.txt"
])

# Now we can encode
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded)
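What the BpeTrainer does above can be sketched in miniature: repeatedly count adjacent symbol pairs across the corpus, merge the most frequent pair into a new symbol, and stop once the merge budget is spent or no pair passes min_frequency. A toy pure-Python version (not the library's algorithm, which is heavily optimized):

```python
from collections import Counter

def toy_bpe_train(words, num_merges, min_frequency=2):
    """Learn BPE merges from a list of words (toy version)."""
    corpus = [list(w) for w in words]  # start from character sequences
    merges = []
    for _ in range(num_merges):
        # Count adjacent pairs across the whole corpus.
        pairs = Counter()
        for symbols in corpus:
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += 1
        if not pairs:
            break
        best, count = pairs.most_common(1)[0]
        if count < min_frequency:
            break  # mirrors the min_frequency cutoff
        merges.append(best)
        # Apply the merge everywhere it occurs.
        merged_corpus = []
        for symbols in corpus:
            out, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            merged_corpus.append(out)
        corpus = merged_corpus
    return merges, corpus

merges, corpus = toy_bpe_train(["low", "lower", "lowest"], num_merges=2)
```

After two merges on this tiny corpus, "low" collapses into a single symbol, which is the same greedy frequency-driven process the real trainer applies at scale.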
