Thinc: Practical Machine Learning for NLP in Python

Thinc is the machine learning library powering spaCy. It features a battle-tested linear model designed for large sparse learning problems, and a flexible neural network model under development for spaCy v2.0.

Thinc is a practical toolkit for implementing models that follow the "Embed, encode, attend, predict" architecture. It's designed to be easy to install, efficient for CPU usage and optimised for NLP and deep learning with text – in particular, hierarchically structured input and variable-length sequences.

🔮 Read the release notes here.

What's where (as of v7.0.0)

Module                 Description
thinc.v2v.Model        Base class.
thinc.v2v              Layers transforming vectors to vectors.
thinc.i2v              Layers embedding IDs to vectors.
thinc.t2v              Layers pooling tensors to vectors.
thinc.t2t              Layers transforming tensors to tensors (e.g. CNN, LSTM).
thinc.api              Higher-order functions, for building networks. Will be renamed.
thinc.extra            Datasets and utilities.
thinc.neural.ops       Container classes for mathematical operations. Will be reorganized.
thinc.linear.avgtron   Legacy efficient Averaged Perceptron implementation.

Development status

Thinc's deep learning functionality is still under active development: APIs are unstable, and we're not yet ready to provide usage support. However, if you're already quite familiar with neural networks, there's a lot here you might find interesting. Thinc's conceptual model is quite different from TensorFlow's. Thinc also implements some novel features, such as a small DSL for concisely wiring up models, embedding tables that support pre-computation and the hashing trick, dynamic batch sizes, a concatenation-based approach to variable-length sequences, and support for model averaging for the Adam solver (which performs very well).

No computational graph – just higher-order functions

The central problem for a neural network implementation is this: during the forward pass, you compute results that will later be useful during the backward pass. How do you keep track of this arbitrary state, while making sure that layers can be cleanly composed?

Most libraries solve this problem by having you declare the forward computations, which are then compiled into a graph somewhere behind the scenes. Thinc doesn't have a "computational graph". Instead, we just use the Python call stack: the state from the forward pass is captured in closures, which serve as callbacks for the backward pass.

All nodes in the network have a simple signature:

f(inputs) -> {outputs, f(d_outputs)->d_inputs}

To make this less abstract, here's a ReLU activation, following this signature:

def relu(inputs):
    # Remember which inputs were positive; the same mask gates the gradient.
    mask = inputs > 0
    def backprop_relu(d_outputs, optimizer):
        # There are no weights to update, so the optimizer argument is unused;
        # it's accepted only to keep the callback signature uniform across layers.
        return d_outputs * mask
    return inputs * mask, backprop_relu

When you call the relu function, you get back an output variable, and a callback. This lets you calculate a gradient using the output, and then pass it into the callback to perform the backward pass.
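
For example, with a plain numpy array (an illustrative snippet of ours, not from the library):

import numpy

X = numpy.array([[-1.0, 2.0, -3.0, 4.0]])
Y, backprop = relu(X)                     # Y is [[0., 2., 0., 4.]]
dX = backprop(numpy.ones_like(Y), None)   # no weights, so no optimizer needed
# dX is [[0., 1., 0., 1.]]: gradient flows only where the input was positive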

This signature makes it easy to build a complex network out of smaller pieces, using arbitrary higher-order functions you can write yourself. To make this clearer, we need a function for a weights layer. Usually this will be implemented as a class — but let's continue using closures, to keep things concise, and to keep the simplicity of the interface explicit.

The main complication for the weights layer is that we now have a side-effect to manage: we would like to update the weights. There are a few ways to handle this. In Thinc we currently pass a callable into the backward pass. (I'm not convinced this is best.)

import numpy

def create_linear_layer(n_out, n_in):
    W = numpy.zeros((n_out, n_in))
    b = numpy.zeros((n_out, 1))

    def forward(X):
        # X: (n_in, batch_size) -> Y: (n_out, batch_size)
        Y = W @ X + b
        def backward(dY, optimizer):
            dX = W.T @ dY
            dW = numpy.einsum('ik,jk->ij', dY, X)
            db = dY.sum(axis=1, keepdims=True)  # sum over the batch, keeping shape (n_out, 1)

            optimizer(W, dW)
            optimizer(b, db)

            return dX
        return Y, backward
    return forward

If we call Wb = create_linear_layer(5, 4), the variable Wb will be the forward() function, implemented inside the body of create_linear_layer(). The Wb instance will have access to the W and b variables defined in its outer scope. If we invoke create_linear_layer() again, we get a new instance, with its own internal state.
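
A quick check of that independence (an illustration of ours; the names are arbitrary):

layer_a = create_linear_layer(5, 4)
layer_b = create_linear_layer(5, 4)

X = numpy.ones((4, 3))                  # (n_in, batch_size)

def sgd(weights, gradient):
    weights -= 0.001 * gradient        # simple in-place update

Y_a, backprop_a = layer_a(X)
backprop_a(numpy.ones((5, 3)), sgd)    # updates layer_a's weights only

Y_a2, _ = layer_a(X)
Y_b, _ = layer_b(X)
assert not numpy.allclose(Y_a2, Y_b)   # layer_a changed; layer_b did not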

The Wb instance and the relu function have exactly the same signature. This makes it easy to write higher-order functions to compose them. The most obvious thing to do is chain them together:

def chain(*layers):
    def forward(X):
        backprops = []
        Y = X
        for layer in layers:
            Y, backprop = layer(Y)
            backprops.append(backprop)
        def backward(dY, optimizer):
            for backprop in reversed(backprops):
                dY = backprop(dY, optimizer)
            return dY
        return Y, backward
    return forward

We could now chain our linear layer together with the relu activation, to create a simple feed-forward network:

Wb1 = create_linear_layer(10, 5)
Wb2 = create_linear_layer(3, 10)

model = chain(Wb1, relu, Wb2)

X = numpy.random.uniform(size=(5, 4))   # (n_in, batch_size)

y, bp_y = model(X)

# Placeholders, so the example runs: a target shaped like y, and plain SGD.
truth = numpy.zeros((3, 4))
def optimizer(weights, gradient):
    weights -= 0.001 * gradient

dY = y - truth
dX = bp_y(dY, optimizer)

This conceptual model makes Thinc very flexible. The trade-off is that Thinc is less convenient and efficient for workloads that fit exactly into what TensorFlow etc. are designed for. If your graph really is static, and your inputs are homogeneous in size and shape, Keras will likely be faster and simpler. But if you want to pass normal Python objects through your network, or handle sequences and recursions of arbitrary length or complexity, you might find Thinc's design a better fit for your problem.

Quickstart

Thinc should install cleanly with both pip and conda, for Python 2.7+ and 3.5+, on Linux, macOS / OSX and Windows. Its only system dependencies are a compiler tool-chain (e.g. build-essential) and the Python development headers (e.g. python-dev).

pip install -U pip setuptools wheel
pip install thinc

For GPU support, we're grateful to use the work of Chainer's cupy module, which provides a numpy-compatible interface for GPU arrays. However, installing Chainer when no GPU is available currently causes an error, so we don't list cupy as an explicit dependency. Instead, install cupy via the extra option that matches your CUDA version:

pip install thinc[cuda102]

Alternatively, to build cupy from source, use thinc[cuda], or install cupy directly from its source package with pip install cupy.

The rest of this section describes how to build Thinc from source. If you have Fabric installed, you can use the shortcut:

git clone https://github.com/explosion/thinc
cd thinc
fab clean env make test

You can then run the examples as follows:

fab eg.mnist
fab eg.basic_tagger
fab eg.cnn_tagger

Otherwise, you can build and test explicitly with:

git clone https://github.com/explosion/thinc
cd thinc

virtualenv .env
source .env/bin/activate

pip install -r requirements.txt
python setup.py build_ext --inplace
py.test thinc/

And then run the examples as follows:

python examples/mnist.py
python examples/basic_tagger.py
python examples/cnn_tagger.py

Usage

The Neural Network API is still subject to change, even within minor versions. You can get a feel for the current API by checking out the examples. Here are a few quick highlights.

1. Shape inference

Models can be created with some dimensions unspecified. Missing dimensions are inferred when pre-trained weights are loaded or when training begins. This eliminates a common source of programmer error:

# Invalid network — shape mismatch
model = chain(ReLu(512, 748), ReLu(512, 784), Softmax(10))

# Leave the dimensions unspecified, and you can't be wrong.
model = chain(ReLu(512), ReLu(512), Softmax())

2. Operator overloading

The Model.define_operators() classmethod allows you to bind arbitrary binary functions to Python operators, for use in any Model instance. The method can (and should) be used as a context-manager, so that the overloading is limited to the immediate block. This allows concise and expressive model definition:

with Model.define_operators({'>>': chain}):
    model = ReLu(512) >> ReLu(512) >> Softmax()

The overloading is cleaned up at the end of the block. A fairly arbitrary zoo of functions is currently implemented. Some of the most useful:

  • chain(model1, model2): Compose two models f(x) and g(x) into a single model computing g(f(x)).
  • clone(model1, n): Create n copies of a model, each with distinct weights, and chain them together.
  • concatenate(model1, model2): Given two models with output dimensions (n,) and (m,), construct a model with output dimensions (m+n,). A closure-style sketch of this combinator follows the list.
  • add(model1, model2): add(f(x), g(x)) = f(x)+g(x)
  • make_tuple(model1, model2): Construct tuples of the outputs of two models, at the batch level. The backward pass expects to receive a tuple of gradients, which are routed through the appropriate model, and summed.
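
To make the pattern concrete, here's how a combinator like concatenate could be written in the closure style from earlier (our sketch, not Thinc's actual implementation; it assumes outputs are arrays with features on the first axis, as in the linear-layer example):

import numpy

def concatenate(layer1, layer2):
    def forward(X):
        Y1, backprop1 = layer1(X)
        Y2, backprop2 = layer2(X)
        def backward(dY, optimizer):
            # Split the gradient at the boundary between the two outputs,
            # route each piece through its own layer, and sum the input gradients.
            dY1, dY2 = dY[:Y1.shape[0]], dY[Y1.shape[0]:]
            return backprop1(dY1, optimizer) + backprop2(dY2, optimizer)
        return numpy.vstack((Y1, Y2)), backward
    return forward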

Putting these things together, here's the sort of tagging model that Thinc is designed to make easy.

with Model.define_operators({'>>': chain, '**': clone, '|': concatenate}):
    model = (
        add_eol_markers('EOL')
        >> flatten
        >> memoize(
            CharLSTM(char_width)
            | (normalize >> str2int >> Embed(word_width)))
        >> ExtractWindow(nW=2)
        >> BatchNorm(ReLu(hidden_width)) ** 3
        >> Softmax()
    )

Not all of these pieces are implemented yet, but hopefully this shows where we're going. The memoize function will be particularly important: in any batch of text, the common words will be very common. It's therefore important to evaluate models such as the CharLSTM once per word type per minibatch, rather than once per token.
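
Here's one way memoize could work, in the same closure style (a simplified sketch of ours, under assumed conventions: tokens are hashable keys and the wrapped layer returns one row per token; the real implementation may differ):

import numpy

def memoize(layer):
    def forward(tokens):
        # Evaluate the wrapped layer once per unique token (word type)...
        uniq, index = [], {}
        for tok in tokens:
            if tok not in index:
                index[tok] = len(uniq)
                uniq.append(tok)
        positions = [index[tok] for tok in tokens]
        Y_uniq, backprop_uniq = layer(uniq)          # (n_types, width)
        Y = Y_uniq[positions]                        # ...then scatter rows back per token.
        def backward(dY, optimizer):
            # Sum the gradients of repeated tokens before the wrapped
            # layer's backward pass.
            dY_uniq = numpy.zeros_like(Y_uniq)
            numpy.add.at(dY_uniq, positions, dY)
            return backprop_uniq(dY_uniq, optimizer)
        return Y, backward
    return forward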

3. Callback-based backpropagation

Most neural network libraries use a computational graph abstraction. This takes the execution away from you, so that gradients can be computed automatically. Thinc follows a style more like the autograd library, but with larger operations. Usage is as follows:

def explicit_sgd_update(X, y):
    def sgd(weights, gradient):
        weights -= gradient * 0.001   # update in place; the return value is discarded
    yh, finish_update = model.begin_update(X, drop=0.2)
    finish_update(y - yh, sgd)

Separating the backpropagation into three parts like this (the forward pass, the gradient calculation, and the update callback) has many advantages. The interface to all models is completely uniform — there is no distinction between the top-level model you use as a predictor and the internal models for the layers. We also make concurrency simple, by making the begin_update() step a pure function, and separating the accumulation of the gradient from the action of the optimizer.
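
Because the optimizer is just a callable, you can, for instance, substitute one that records gradients instead of applying them, deferring the update (a sketch of ours; model and batches are assumed to exist):

def make_gradient_recorder():
    grads = []                                 # (weights, gradient) pairs
    def record(weights, gradient):
        grads.append((weights, gradient))      # record; don't touch the weights
    return grads, record

grads, record = make_gradient_recorder()
for X, y in batches:
    yh, finish_update = model.begin_update(X, drop=0.2)
    finish_update(y - yh, record)              # weights unchanged so far

for weights, gradient in grads:                # apply the deferred updates in one place
    weights -= 0.001 * gradient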

4. Class annotations

To keep the class hierarchy shallow, Thinc uses class decorators to reuse code for layer definitions. Specifically, the following decorators are available:

  • describe.attributes(): Allows attributes to be specified by keyword argument. Used especially for dimensions and parameters.
  • describe.on_init(): Allows callbacks to be specified, which will be called at the end of the __init__ method.
  • describe.on_data(): Allows callbacks to be specified, which will be called on Model.begin_training().

🛠 Changelog

Version Date Description
v7.3.1 2019-10-30 Relax dependency requirements
v7.3.0 2019-10-28 Mish activation and experimental optimizers
v7.2.0 2019-10-20 Simpler GPU install and bug fixes
v7.1.1 2019-09-10 Support preshed v3.0.0
v7.1.0 2019-08-23 Support other CPUs, read-only arrays
v7.0.8 2019-07-11 Fix version for PyPi
v7.0.7 2019-07-11 Avoid allocating a negative shape for ngrams
v7.0.6 2019-07-11 Fix LinearModel regression
v7.0.5 2019-07-10 Bug fixes for pickle, threading, unflatten and consistency
v7.0.4 2019-03-19 Don't require thinc_gpu_ops
v7.0.3 2019-03-15 Fix pruning in beam search
v7.0.2 2019-02-23 Fix regression in linear model class
v7.0.1 2019-02-16 Fix import errors
v7.0.0 2019-02-15 Overhaul package dependencies
v6.12.1 2018-11-30 Fix msgpack pin
v6.12.0 2018-10-15 Wheels and separate GPU ops
v6.10.3 2018-07-21 Python 3.7 support and dependency updates
v6.11.2 2018-05-21 Improve GPU installation
v6.11.1 2018-05-20 Support direct linkage to BLAS libraries
v6.11.0 2018-03-16 n/a
v6.10.2 2017-12-06 Efficiency improvements and bug fixes
v6.10.1 2017-11-15 Fix GPU install and minor memory leak
v6.10.0 2017-10-28 CPU efficiency improvements, refactoring
v6.9.0 2017-10-03 Reorganize layers, bug fix to Layer Normalization
v6.8.2 2017-09-26 Fix packaging of gpu_ops
v6.8.1 2017-08-23 Fix Windows support
v6.8.0 2017-07-25 SELU layer, attention, improved GPU/CPU compatibility
v6.7.3 2017-06-05 Fix convolution on GPU
v6.7.2 2017-06-02 Bug fixes to serialization
v6.7.1 2017-06-02 Improve serialization
v6.7.0 2017-06-01 Fixes to serialization, hash embeddings and flatten ops
v6.6.0 2017-05-14 Improved GPU usage and examples
v6.5.2 2017-03-20 n/a
v6.5.1 2017-03-20 Improved linear class and Windows fix
v6.5.0 2017-03-11 Supervised similarity, fancier embedding and improvements to linear model
v6.4.0 2017-02-15 n/a
v6.3.0 2017-01-25 Efficiency improvements, argument checking and error messaging
v6.2.0 2017-01-15 Improve API and introduce overloaded operators
v6.1.3 2017-01-10 More neural network functions and training continuation
v6.1.2 2017-01-09 n/a
v6.1.1 2017-01-09 n/a
v6.1.0 2017-01-09 n/a
v6.0.0 2016-12-31 Add thinc.neural for NLP-oriented deep learning
