
Python binding to Omikuji, an efficient implementation of Partitioned Label Trees and its variations for extreme multi-label classification

Project description

Omikuji


An efficient implementation of Partitioned Label Trees (Prabhu et al., 2018) and its variations for extreme multi-label classification, written in Rust🦀 with love💖.

Features & Performance

Omikuji has been tested on datasets from the Extreme Classification Repository. All tests below were run on a quad-core Intel® Core™ i7-6700 CPU, allowing as many cores as possible to be utilized. We measured training time and calculated precision at 1, 3, and 5. (Note that, due to randomness, results might vary from run to run, especially for smaller datasets.)

Parabel, better parallelized

Omikuji provides a more parallelized implementation of Parabel (Prabhu et al., 2018) that trains faster when more CPU cores are available. Compared to the original implementation written in C++, which can only utilize the same number of CPU cores as the number of trees (3 by default), Omikuji maintains the same level of precision but trains 1.3x to 1.7x faster on our quad-core machine. Further speed-up is possible if more CPU cores are available.

Dataset          Metric      Parabel   Omikuji (balanced, cluster.k=2)
EURLex-4K        P@1         82.2      82.1
                 P@3         68.8      68.8
                 P@5         57.6      57.7
                 Train Time  18s       14s
Amazon-670K      P@1         44.9      44.8
                 P@3         39.8      39.8
                 P@5         36.0      36.0
                 Train Time  404s      234s
WikiLSHTC-325K   P@1         65.0      64.8
                 P@3         43.2      43.1
                 P@5         32.0      32.1
                 Train Time  959s      659s
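
The Parabel-style results above correspond to Omikuji's default settings; the flags used here are documented in the Usage section below. For example, --n_threads 0 (the default) selects the thread count automatically, so all available cores are used:

omikuji train eurlex_train.txt --n_trees 3 --n_threads 0 --model_path ./model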

Regular k-means for shallow trees

Following Bonsai (Khandagale et al., 2019), Omikuji supports using regular k-means instead of balanced 2-means clustering for tree construction, which results in wider, shallower, and unbalanced trees that train more slowly but achieve better precision. Compared with the original Bonsai implementation, Omikuji achieves the same precision while training 2.6x to 4.6x faster on our quad-core machine. (Similarly, further speed-up is possible if more CPU cores are available.)

Dataset          Metric      Bonsai    Omikuji (unbalanced, cluster.k=100, max_depth=3)
EURLex-4K        P@1         82.8      83.0
                 P@3         69.4      69.5
                 P@5         58.1      58.3
                 Train Time  87s       19s
Amazon-670K      P@1         45.5*     45.6
                 P@3         40.3*     40.4
                 P@5         36.5*     36.6
                 Train Time  5,759s    1,753s
WikiLSHTC-325K   P@1         66.6*     66.6
                 P@3         44.5*     44.4
                 P@5         33.0*     33.0
                 Train Time  11,156s   4,259s

*Precision numbers as reported in the paper; our machine doesn't have enough memory to run the full prediction with their implementation.
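
The Bonsai-style configuration in the table above can be reproduced with the CLI by combining the corresponding flags documented in the Usage section below, for example:

omikuji train eurlex_train.txt --cluster.unbalanced --cluster.k 100 --max_depth 3 --model_path ./model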

Balanced k-means for balanced shallow trees

Sometimes it's desirable to have shallow and wide trees that are also balanced, in which case Omikuji supports the balanced k-means algorithm used by HOMER (Tsoumakas et al., 2008) for clustering as well.

Dataset          Metric      Omikuji (balanced, cluster.k=100)
EURLex-4K        P@1         82.1
                 P@3         69.4
                 P@5         58.1
                 Train Time  19s
Amazon-670K      P@1         45.4
                 P@3         40.3
                 P@5         36.5
                 Train Time  1,153s
WikiLSHTC-325K   P@1         65.6
                 P@3         43.6
                 P@5         32.5
                 Train Time  3,028s
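
Balanced clustering is the default, so the configuration in the table above only changes the cluster count, for example:

omikuji train eurlex_train.txt --cluster.k 100 --model_path ./model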

Layer collapsing for balanced shallow trees

An alternative way to build balanced, shallow, and wide trees is to collapse adjacent layers, similar to the tree compression step used in AttentionXML (You et al., 2019): intermediate layers are removed, and their children replace them as the children of their parents. For example, with balanced 2-means clustering, collapsing 5 layers after each layer increases the tree arity from 2 to 2⁵⁺¹ = 64.

Dataset          Metric      Omikuji (balanced, cluster.k=2, collapse 5 layers)
EURLex-4K        P@1         82.4
                 P@3         69.3
                 P@5         58.0
                 Train Time  16s
Amazon-670K      P@1         45.3
                 P@3         40.2
                 P@5         36.4
                 Train Time  460s
WikiLSHTC-325K   P@1         64.9
                 P@3         43.3
                 P@5         32.3
                 Train Time  1,649s
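
The configuration in the table above keeps the default balanced 2-means clustering and collapses 5 layers after each layer, for example:

omikuji train eurlex_train.txt --cluster.k 2 --collapse_every_n_layers 5 --model_path ./model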

Build & Install

Omikuji can be easily built & installed with Cargo as a CLI app:

cargo install omikuji --features cli --locked

Or install from the latest source:

cargo install --git https://github.com/tomtung/omikuji.git --features cli --locked

The CLI app will be available as omikuji. For example, to reproduce the results on the EURLex-4K dataset:

omikuji train eurlex_train.txt --model_path ./model
omikuji test ./model eurlex_test.txt --out_path predictions.txt

Python Binding

A simple Python binding is also available for training and prediction. It can be installed via pip:

pip install omikuji

Note that you might still need to install Cargo should compilation become necessary.

You can also install from the latest source:

pip install git+https://github.com/tomtung/omikuji.git -v

The following script demonstrates how to use the Python binding to train a model and make predictions:

import omikuji

# Train
hyper_param = omikuji.Model.default_hyper_param()
# Adjust hyper-parameters as needed
hyper_param.n_trees = 5
model = omikuji.Model.train_on_data("./eurlex_train.txt", hyper_param)

# Serialize & de-serialize
model.save("./model")
model = omikuji.Model.load("./model")
# Optionally densify model weights to trade off between prediction speed and memory usage
model.densify_weights(0.05)

# Predict
feature_value_pairs = [
    (0, 0.101468),
    (1, 0.554374),
    (2, 0.235760),
    (3, 0.065255),
    (8, 0.152305),
    (10, 0.155051),
    # ...
]
label_score_pairs = model.predict(feature_value_pairs)

Usage

$ omikuji train --help
Train a new omikuji model

USAGE:
    omikuji train [OPTIONS] <TRAINING_DATA_PATH>

ARGS:
    <TRAINING_DATA_PATH>
            Path to training dataset file

            The dataset file is expected to be in the format of the Extreme Classification
            Repository.

OPTIONS:
        --centroid_threshold <THRESHOLD>
            Threshold for pruning label centroid vectors

            [default: 0]

        --cluster.eps <CLUSTER_EPS>
            Epsilon value for determining clustering convergence

            [default: 0.0001]

        --cluster.k <K>
            Number of clusters

            [default: 2]

        --cluster.min_size <MIN_SIZE>
            Labels in clusters with sizes smaller than this threshold are reassigned to other
            clusters instead

            [default: 2]

        --cluster.unbalanced
            Perform regular k-means clustering instead of balanced k-means clustering

        --collapse_every_n_layers <N_LAYERS>
            Number of adjacent layers to collapse

            This increases tree arity and decreases tree depth.

            [default: 0]

    -h, --help
            Print help information

        --linear.c <C>
            Cost coefficient for regularizing linear classifiers

            [default: 1]

        --linear.eps <LINEAR_EPS>
            Epsilon value for determining linear classifier convergence

            [default: 0.1]

        --linear.loss <LOSS>
            Loss function used by linear classifiers

            [default: hinge]
            [possible values: hinge, log]

        --linear.max_iter <M>
            Max number of iterations for training each linear classifier

            [default: 20]

        --linear.weight_threshold <MIN_WEIGHT>
            Threshold for pruning weight vectors of linear classifiers

            [default: 0.1]

        --max_depth <DEPTH>
            Maximum tree depth

            [default: 20]

        --min_branch_size <SIZE>
            Number of labels below which no further clustering & branching is done

            [default: 100]

        --model_path <MODEL_PATH>
            Optional path of the directory where the trained model will be saved, if provided

            If a model with compatible settings is already saved in the given directory, the newly
            trained trees will be added to the existing model

        --n_threads <N_THREADS>
            Number of worker threads

            If 0, the number is selected automatically.

            [default: 0]

        --n_trees <N_TREES>
            Number of trees

            [default: 3]

        --train_trees_1_by_1
            Finish training each tree before starting to train the next

            This limits initial parallelization but saves memory.

        --tree_structure_only
            Build the trees without training classifiers

            Might be useful when a downstream user needs the tree structures only.
$ omikuji test --help
Test an existing omikuji model

USAGE:
    omikuji test [OPTIONS] <MODEL_PATH> <TEST_DATA_PATH>

ARGS:
    <MODEL_PATH>
            Path of the directory where the trained model is saved

    <TEST_DATA_PATH>
            Path to test dataset file

            The dataset file is expected to be in the format of the Extreme Classification
            Repository.

OPTIONS:
        --beam_size <BEAM_SIZE>
            Beam size for beam search

            [default: 10]

    -h, --help
            Print help information

        --k_top <K>
            Number of top predictions to write out for each test example

            [default: 5]

        --max_sparse_density <DENSITY>
            Density threshold above which sparse weight vectors are converted to dense format

            Lower values speed up prediction at the cost of more memory usage.

            [default: 0.1]

        --n_threads <N_THREADS>
            Number of worker threads

            If 0, the number is selected automatically.

            [default: 0]

        --out_path <OUT_PATH>
            Path to the file to which predictions will be written, if provided
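
For example, a run that trains trees one at a time to limit memory usage and then writes the top 5 predictions per test example could combine the options documented above (a sketch; adjust paths as needed):

omikuji train eurlex_train.txt --train_trees_1_by_1 --model_path ./model
omikuji test ./model eurlex_test.txt --beam_size 10 --k_top 5 --out_path predictions.txt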

Data format

Our implementation takes dataset files formatted like those provided in the Extreme Classification Repository. A data file starts with a header line containing three space-separated integers: the total number of examples, the number of features, and the number of labels. After the header, there is one line per example, starting with comma-separated labels, followed by space-separated feature:value pairs:

label1,label2,...labelk ft1:ft1_val ft2:ft2_val ft3:ft3_val .. ftd:ftd_val
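
For example, a hypothetical toy file with 2 examples, 5 features, and 3 labels could look like this (values made up for illustration):

2 5 3
0,2 0:0.5 3:1.2 4:0.7
1 1:0.8 2:0.3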

Trivia

The project name comes from o-mikuji (御神籤), which are predictions about one's future written on strips of paper (labels?) at jinjas and temples in Japan, often tied to branches of pine trees after they are read.

References

  • Y. Prabhu, A. Kag, S. Harsola, R. Agrawal, and M. Varma, “Parabel: Partitioned Label Trees for Extreme Classification with Application to Dynamic Search Advertising,” in Proceedings of the 2018 World Wide Web Conference, 2018, pp. 993–1002.
  • S. Khandagale, H. Xiao, and R. Babbar, “Bonsai - Diverse and Shallow Trees for Extreme Multi-label Classification,” Apr. 2019.
  • G. Tsoumakas, I. Katakis, and I. Vlahavas, “Effective and efficient multilabel classification in domains with large number of labels,” ECML, 2008.
  • R. You, S. Dai, Z. Zhang, H. Mamitsuka, and S. Zhu, “AttentionXML: Extreme Multi-Label Text Classification with Multi-Label Attention Based Recurrent Neural Networks,” Jun. 2019.

License

Omikuji is licensed under the MIT License.

Download files

Download the file for your platform.

Source Distribution

  • omikuji-0.5.1.tar.gz (54.3 kB, source)

Built Distributions

  • omikuji-0.5.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.0 MB, CPython 3.12, manylinux: glibc 2.17+ x86-64)
  • omikuji-0.5.1-cp312-cp312-macosx_11_0_universal2.whl (527.1 kB, CPython 3.12, macOS 11.0+ universal2 (ARM64, x86-64))
  • omikuji-0.5.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.0 MB, CPython 3.11, manylinux: glibc 2.17+ x86-64)
  • omikuji-0.5.1-cp311-cp311-macosx_11_0_universal2.whl (527.1 kB, CPython 3.11, macOS 11.0+ universal2 (ARM64, x86-64))
  • omikuji-0.5.1-cp310-cp310-win_amd64.whl (390.4 kB, CPython 3.10, Windows x86-64)
  • omikuji-0.5.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.0 MB, CPython 3.10, manylinux: glibc 2.17+ x86-64)
  • omikuji-0.5.1-cp310-cp310-macosx_11_0_universal2.whl (527.1 kB, CPython 3.10, macOS 11.0+ universal2 (ARM64, x86-64))
  • omikuji-0.5.1-cp39-cp39-win_amd64.whl (390.4 kB, CPython 3.9, Windows x86-64)
  • omikuji-0.5.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.0 MB, CPython 3.9, manylinux: glibc 2.17+ x86-64)
  • omikuji-0.5.1-cp39-cp39-macosx_11_0_x86_64.whl (527.1 kB, CPython 3.9, macOS 11.0+ x86-64)
  • omikuji-0.5.1-cp38-cp38-win_amd64.whl (390.4 kB, CPython 3.8, Windows x86-64)
  • omikuji-0.5.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.0 MB, CPython 3.8, manylinux: glibc 2.17+ x86-64)
  • omikuji-0.5.1-cp38-cp38-macosx_11_0_x86_64.whl (527.1 kB, CPython 3.8, macOS 11.0+ x86-64)

File details

Details for the file omikuji-0.5.1.tar.gz.

File metadata

  • Download URL: omikuji-0.5.1.tar.gz
  • Upload date:
  • Size: 54.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.8.2 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.10

File hashes

Hashes for omikuji-0.5.1.tar.gz
Algorithm Hash digest
SHA256 ce93a4f7677e80378c974a9d15462ce76dddd40716b0518738cd5a99cee782ae
MD5 c39d6896f5ddfdd106699d3771225e90
BLAKE2b-256 eed64580f6b668dc0aae354f574eb870a6244487c1df3b419fcb7fb37234184c

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for omikuji-0.5.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 2f84bc9d806b42a57462f6a5d45a22ff0616b0c905674e2a7b7a75cf783759b8
MD5 3d5bd2b02a647d7dbc65b83d9565129c
BLAKE2b-256 fcab3b789ad13c6a965247ea3a83391511e85acbd6ea6b78ae42ad1022961901

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp312-cp312-macosx_11_0_universal2.whl.

File metadata

  • Download URL: omikuji-0.5.1-cp312-cp312-macosx_11_0_universal2.whl
  • Upload date:
  • Size: 527.1 kB
  • Tags: CPython 3.12, macOS 11.0+ universal2 (ARM64, x86-64)
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.8.2 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.10

File hashes

Hashes for omikuji-0.5.1-cp312-cp312-macosx_11_0_universal2.whl
Algorithm Hash digest
SHA256 c7a0a1c3d9f5d159fe298b208a7bf346102b7993be25230f74e50e848c6ca9aa
MD5 adffbe59a6f19351352a418557feab53
BLAKE2b-256 69b64d9e620695d9b2bb6fa25227f85f3f2c4cd3d626bb2a64675fd0308e15cf

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for omikuji-0.5.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 6bdcc8308396be9c5314980c170a287de91e7f8be1505a02bf601a239a9861ba
MD5 7d651190d16aae703599292fad69548a
BLAKE2b-256 1b3760de92d081b72eb9edb296e0e55cb83d3faf0c6ed49e83345829328da2fe

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp311-cp311-macosx_11_0_universal2.whl.

File metadata

  • Download URL: omikuji-0.5.1-cp311-cp311-macosx_11_0_universal2.whl
  • Upload date:
  • Size: 527.1 kB
  • Tags: CPython 3.11, macOS 11.0+ universal2 (ARM64, x86-64)
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.8.2 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.10

File hashes

Hashes for omikuji-0.5.1-cp311-cp311-macosx_11_0_universal2.whl
Algorithm Hash digest
SHA256 7d11bd41d0b5b4e44d7938306dc78d402768b50bf9239b1c5c49ca64230e32fa
MD5 0241d1527346fdb5c5f140aab0164944
BLAKE2b-256 564e2b2e40e2a3eb05c09ceb79c1a8a33d01899977897f9274ddafb29f2043a9

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp310-cp310-win_amd64.whl.

File metadata

  • Download URL: omikuji-0.5.1-cp310-cp310-win_amd64.whl
  • Upload date:
  • Size: 390.4 kB
  • Tags: CPython 3.10, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.8.2 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.10

File hashes

Hashes for omikuji-0.5.1-cp310-cp310-win_amd64.whl
Algorithm Hash digest
SHA256 dae6a978d26eadb0f0567f614c08dff021ba2f27a21cdca998d4da390bd93f5d
MD5 ee0ed87948c7514ddb6d89dbc669dfff
BLAKE2b-256 54dd62d1ea3e004085c66508c0a067445c76e7b40c3237f1a246964b63b4b20e

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for omikuji-0.5.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 ae132fce5a6a50e06306ab8b1624458d55445b75f85240387202f87a40afa2c4
MD5 ab755ce5a958618ea051364369f1372f
BLAKE2b-256 67c76381a95c964675af3a87e612fb4601c6025e76af8d436253e6327c59a86b

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp310-cp310-macosx_11_0_universal2.whl.

File metadata

  • Download URL: omikuji-0.5.1-cp310-cp310-macosx_11_0_universal2.whl
  • Upload date:
  • Size: 527.1 kB
  • Tags: CPython 3.10, macOS 11.0+ universal2 (ARM64, x86-64)
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.8.2 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.10

File hashes

Hashes for omikuji-0.5.1-cp310-cp310-macosx_11_0_universal2.whl
Algorithm Hash digest
SHA256 c8688009056d9e862f13245eb2856e8eca5470601e4016ef5a3865a32bc91224
MD5 6d3f6c8aadbf673a26d5417d02b89510
BLAKE2b-256 c2cf14083c24fc2f99ef3a1907354d69bf472c22a8a1e9a8c7de613384e0f310

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp39-cp39-win_amd64.whl.

File metadata

  • Download URL: omikuji-0.5.1-cp39-cp39-win_amd64.whl
  • Upload date:
  • Size: 390.4 kB
  • Tags: CPython 3.9, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.8.2 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.10

File hashes

Hashes for omikuji-0.5.1-cp39-cp39-win_amd64.whl
Algorithm Hash digest
SHA256 81fc93dcbdc652c915919999fe50c50c38f9a35678078e098155b0a23fa508a6
MD5 c020ee7f9553fece856550d5beeec34f
BLAKE2b-256 e08f3edf2e13463a03d02b96ad64dbe3cf8d34638ddb2ce0439063b07c41729c

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for omikuji-0.5.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 fc801b7902a2f7223cd109e6ed6ff0a5a88da58f3d956edcd0a33ae367cdadc8
MD5 666dca865044ebd92b9fc26bcb6d0aed
BLAKE2b-256 587bd1ae5406381a7a18f83bcfc5eeb4229aa14089958c4d074cf4a369532b85

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp39-cp39-macosx_11_0_x86_64.whl.

File metadata

  • Download URL: omikuji-0.5.1-cp39-cp39-macosx_11_0_x86_64.whl
  • Upload date:
  • Size: 527.1 kB
  • Tags: CPython 3.9, macOS 11.0+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.8.2 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.10

File hashes

Hashes for omikuji-0.5.1-cp39-cp39-macosx_11_0_x86_64.whl
Algorithm Hash digest
SHA256 f8b1cc769dca14219d2af0bfd7450a179222a6b711cac32fd9a97c1ca8703f3c
MD5 f805f96e34a09d8b2d61af31f3af3c28
BLAKE2b-256 87fc2a4f1565b8756e418f43cdbc1bdba861071331e2d6e0fe5d138e284804d1

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp38-cp38-win_amd64.whl.

File metadata

  • Download URL: omikuji-0.5.1-cp38-cp38-win_amd64.whl
  • Upload date:
  • Size: 390.4 kB
  • Tags: CPython 3.8, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.8.2 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.10

File hashes

Hashes for omikuji-0.5.1-cp38-cp38-win_amd64.whl
Algorithm Hash digest
SHA256 6dee828ffee8ed5d08e41a503653d9361fcdd9f287f8c32e87de18d296056363
MD5 f2df24c2103ae1884978f967b012f9a6
BLAKE2b-256 be0f823244e8bcfcb5ea8c152baa94b11c57718f11a0d09a4e5aa2f6c5ffbf32

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for omikuji-0.5.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 e614e0b803ad4b26473dd92acc8215364f511d8690b4022c02dbf8682e67066a
MD5 63422f6b5e1a1663917b0453f7b7ba86
BLAKE2b-256 802220b77d655dafccc1f72a9d4b1e51c68265b22758f0b012321020a4168d21

See more details on using hashes here.

File details

Details for the file omikuji-0.5.1-cp38-cp38-macosx_11_0_x86_64.whl.

File metadata

  • Download URL: omikuji-0.5.1-cp38-cp38-macosx_11_0_x86_64.whl
  • Upload date:
  • Size: 527.1 kB
  • Tags: CPython 3.8, macOS 11.0+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.8.2 requests/2.22.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.10

File hashes

Hashes for omikuji-0.5.1-cp38-cp38-macosx_11_0_x86_64.whl
Algorithm Hash digest
SHA256 7a7339bc420d2ad97fea39ffc411257caa496c33b31b7be393dfbf9e573916f4
MD5 623e483cddc7e85ad9329a909091d30b
BLAKE2b-256 460f61c09e4a1aa6e31d9c0b2a60107c040514adf7c55b392a3edcae99262505

See more details on using hashes here.
