
Thinc: Practical Machine Learning for NLP in Python

Thinc is the machine learning library powering spaCy. It features a battle-tested linear model designed for large sparse learning problems, and a flexible neural network model under development for spaCy v2.0.

Thinc is a practical toolkit for implementing models that follow the "Embed, encode, attend, predict" architecture. It's designed to be easy to install, efficient for CPU usage and optimised for NLP and deep learning with text – in particular, hierarchically structured input and variable-length sequences.

🔮 Read the release notes here.


What's where (as of v7.0.0)

Module                 Description
thinc.v2v.Model        Base class.
thinc.v2v              Layers transforming vectors to vectors.
thinc.i2v              Layers embedding IDs to vectors.
thinc.t2v              Layers pooling tensors to vectors.
thinc.t2t              Layers transforming tensors to tensors (e.g. CNN, LSTM).
thinc.api              Higher-order functions, for building networks. Will be renamed.
thinc.extra            Datasets and utilities.
thinc.neural.ops       Container classes for mathematical operations. Will be reorganized.
thinc.linear.avgtron   Legacy efficient Averaged Perceptron implementation.

Development status

Thinc's deep learning functionality is still under active development: APIs are unstable, and we're not yet ready to provide usage support. However, if you're already quite familiar with neural networks, there's a lot here you might find interesting. Thinc's conceptual model is quite different from TensorFlow's. Thinc also implements some novel features, such as a small DSL for concisely wiring up models, embedding tables that support pre-computation and the hashing trick, dynamic batch sizes, a concatenation-based approach to variable-length sequences, and support for model averaging for the Adam solver (which performs very well).

No computational graph – just higher order functions

The central problem for a neural network implementation is this: during the forward pass, you compute results that will later be useful during the backward pass. How do you keep track of this arbitrary state, while making sure that layers can be cleanly composed?

Most libraries solve this problem by having you declare the forward computations, which are then compiled into a graph somewhere behind the scenes. Thinc doesn't have a "computational graph". Instead, we just use the stack, because we put the state from the forward pass into callbacks.

All nodes in the network have a simple signature:

f(inputs) -> {outputs, f(d_outputs)->d_inputs}

To make this less abstract, here's a ReLu activation, following this signature:

def relu(inputs):
    mask = inputs > 0
    def backprop_relu(d_outputs, optimizer):
        return d_outputs * mask
    return inputs * mask, backprop_relu

When you call the relu function, you get back an output variable, and a callback. This lets you calculate a gradient using the output, and then pass it into the callback to perform the backward pass.
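For example, a quick sketch with NumPy (the gradient here is just a stand-in for whatever the layer above would pass down):

import numpy

X = numpy.array([[1.0, -2.0], [3.0, -0.5]])
Y, backprop = relu(X)                 # Y == [[1., 0.], [3., 0.]]
d_outputs = numpy.ones_like(Y)        # stand-in gradient from the layer above
d_inputs = backprop(d_outputs, None)  # optimizer is unused: relu has no weights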

This signature makes it easy to build a complex network out of smaller pieces, using arbitrary higher-order functions you can write yourself. To make this clearer, we need a function for a weights layer. Usually this will be implemented as a class — but let's continue using closures, to keep things concise, and to keep the simplicity of the interface explicit.

The main complication for the weights layer is that we now have a side-effect to manage: we would like to update the weights. There are a few ways to handle this. In Thinc we currently pass a callable into the backward pass. (I'm not convinced this is best.)

import numpy

def create_linear_layer(n_out, n_in):
    W = numpy.zeros((n_out, n_in))
    b = numpy.zeros((n_out, 1))

    def forward(X):
        Y = W @ X + b
        def backward(dY, optimizer):
            dX = W.T @ dY
            dW = numpy.einsum('ik,jk->ij', dY, X)
            db = dY.sum(axis=1, keepdims=True)  # sum over the batch axis, to match b's (n_out, 1) shape

            optimizer(W, dW)
            optimizer(b, db)

            return dX
        return Y, backward
    return forward

If we call Wb = create_linear_layer(5, 4), the variable Wb will be the forward() function, implemented inside the body of create_linear_layer(). The Wb instance will have access to the W and b variables defined in its outer scope. If we invoke create_linear_layer() again, we get a new instance, with its own internal state.

The Wb instance and the relu function have exactly the same signature. This makes it easy to write higher order functions to compose them. The most obvious thing to do is chain them together:

def chain(*layers):
    def forward(X):
        backprops = []
        Y = X
        for layer in layers:
            Y, backprop = layer(Y)
            backprops.append(backprop)
        def backward(dY, optimizer):
            for backprop in reversed(backprops):
                dY = backprop(dY, optimizer)
            return dY
        return Y, backward
    return forward

We could now chain our linear layer together with the relu activation, to create a simple feed-forward network:

Wb1 = create_linear_layer(10, 5)
Wb2 = create_linear_layer(3, 10)

model = chain(Wb1, relu, Wb2)

X = numpy.random.uniform(size=(5, 4))
truth = numpy.zeros((3, 4))  # placeholder target, shaped like the output

def optimizer(weights, gradient):
    weights -= 0.001 * gradient  # simple in-place SGD step

y, bp_y = model(X)

dY = y - truth
dX = bp_y(dY, optimizer)

This conceptual model makes Thinc very flexible. The trade-off is that Thinc is less convenient and less efficient for workloads that fit exactly into what TensorFlow etc. are designed for. If your graph really is static and your inputs are homogeneous in size and shape, Keras will likely be faster and simpler. But if you want to pass normal Python objects through your network, or handle sequences and recursions of arbitrary length or complexity, you might find Thinc's design a better fit for your problem.

Quickstart

Thinc should install cleanly with both pip and conda, for Python 2.7+ and 3.5+, on Linux, macOS/OS X and Windows. Its only system dependencies are a compiler tool-chain (e.g. build-essential) and the Python development headers (e.g. python-dev).

pip install -U pip setuptools wheel
pip install thinc

For GPU support, we're grateful to use the work of Chainer's cupy module, which provides a numpy-compatible interface for GPU arrays. However, installing Chainer when no GPU is available currently causes an error, so we don't list cupy as an explicit dependency. Instead, cupy is installed via an extra option specifying the correct CUDA version:

pip install thinc[cuda102]

Alternatively, to install cupy from source use thinc[cuda] or install cupy directly using its source package with pip install cupy.

The rest of this section describes how to build Thinc from source. If you have Fabric installed, you can use the shortcut:

git clone https://github.com/explosion/thinc
cd thinc
fab clean env make test

You can then run the examples as follows:

fab eg.mnist
fab eg.basic_tagger
fab eg.cnn_tagger

Otherwise, you can build and test explicitly with:

git clone https://github.com/explosion/thinc
cd thinc

virtualenv .env
source .env/bin/activate

pip install -r requirements.txt
python setup.py build_ext --inplace
py.test thinc/

And then run the examples as follows:

python examples/mnist.py
python examples/basic_tagger.py
python examples/cnn_tagger.py

Usage

The Neural Network API is still subject to change, even within minor versions. You can get a feel for the current API by checking out the examples. Here are a few quick highlights.

1. Shape inference

Models can be created with some dimensions unspecified. Missing dimensions are inferred when pre-trained weights are loaded or when training begins. This eliminates a common source of programmer error:

# Invalid network — shape mismatch
model = chain(ReLu(512, 748), ReLu(512, 784), Softmax(10))

# Leave the dimensions unspecified, and you can't be wrong.
model = chain(ReLu(512), ReLu(512), Softmax())
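
To see how deferred dimensions can work, here's a sketch in the closure style from earlier (not Thinc's actual implementation): allocation is simply delayed until the first batch reveals the input width.

import numpy

def create_linear_layer_deferred(n_out, n_in=None):
    params = {}
    def forward(X):
        if "W" not in params:
            # Infer the input dimension from the first batch we see.
            params["W"] = numpy.zeros((n_out, n_in or X.shape[0]))
            params["b"] = numpy.zeros((n_out, 1))
        W, b = params["W"], params["b"]
        Y = W @ X + b
        def backward(dY, optimizer):
            optimizer(W, numpy.einsum('ik,jk->ij', dY, X))
            optimizer(b, dY.sum(axis=1, keepdims=True))
            return W.T @ dY
        return Y, backward
    return forward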

2. Operator overloading

The Model.define_operators() classmethod allows you to bind arbitrary binary functions to Python operators, for use in any Model instance. The method can (and should) be used as a context-manager, so that the overloading is limited to the immediate block. This allows concise and expressive model definition:

with Model.define_operators({'>>': chain}):
    model = ReLu(512) >> ReLu(512) >> Softmax()

The overloading is cleaned up at the end of the block. A fairly arbitrary zoo of functions is currently implemented. Some of the most useful:

  • chain(model1, model2): Compose two models f(x) and g(x) into a single model computing g(f(x)).
  • clone(model, n): Create n copies of a model, each with distinct weights, and chain them together.
  • concatenate(model1, model2): Given two models with output dimensions (n,) and (m,), construct a model with output dimensions (m+n,) (see the sketch after this list).
  • add(model1, model2): add(f(x), g(x)) = f(x)+g(x)
  • make_tuple(model1, model2): Construct tuples of the outputs of two models, at the batch level. The backward pass expects to receive a tuple of gradients, which are routed through the appropriate model, and summed.
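
For illustration, here's a minimal sketch of a concatenate combinator in the closure style used earlier (not Thinc's actual implementation). With the (features, batch) layout from the linear-layer example, the outputs are joined along axis 0, and the backward pass splits the gradient back apart and sums the two input gradients:

import numpy

def concatenate(layer1, layer2):
    def forward(X):
        Y1, bp1 = layer1(X)
        Y2, bp2 = layer2(X)
        def backward(dY, optimizer):
            # Give each layer its share of the rows, then combine the
            # gradients with respect to the shared input.
            dY1, dY2 = dY[:Y1.shape[0]], dY[Y1.shape[0]:]
            return bp1(dY1, optimizer) + bp2(dY2, optimizer)
        return numpy.concatenate((Y1, Y2), axis=0), backward
    return forward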

Putting these things together, here's the sort of tagging model that Thinc is designed to make easy.

with Model.define_operators({'>>': chain, '**': clone, '|': concatenate}):
    model = (
        add_eol_markers('EOL')
        >> flatten
        >> memoize(
            CharLSTM(char_width)
            | (normalize >> str2int >> Embed(word_width)))
        >> ExtractWindow(nW=2)
        >> BatchNorm(ReLu(hidden_width)) ** 3
        >> Softmax()
    )

Not all of these pieces are implemented yet, but hopefully this shows where we're going. The memoize function will be particularly important: in any batch of text, the common words will be very common. It's therefore important to evaluate models such as the CharLSTM once per word type per minibatch, rather than once per token.
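
Since memoize isn't implemented yet, here's one sketch of what such a combinator could look like in the closure style from earlier, assuming the wrapped layer maps a 1-D array of n IDs to an (n, width) array (note the row-per-item layout, unlike the linear-layer example):

import numpy

def memoize(layer):
    def forward(inputs):  # inputs: 1-D array of word IDs
        uniq, inverse = numpy.unique(inputs, return_inverse=True)
        uniq_Y, bp_uniq = layer(uniq)  # evaluated once per word type
        def backward(d_outputs, optimizer):
            # Positions that share an input share an output, so their
            # gradients are summed before backpropagating.
            d_uniq = numpy.zeros_like(uniq_Y)
            numpy.add.at(d_uniq, inverse, d_outputs)
            return bp_uniq(d_uniq, optimizer)
        return uniq_Y[inverse], backward
    return forward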

3. Callback-based backpropagation

Most neural network libraries use a computational graph abstraction. This takes the execution away from you, so that gradients can be computed automatically. Thinc follows a style more like the autograd library, but with larger operations. Usage is as follows:

def explicit_sgd_update(X, y):
    def sgd(weights, gradient, key=None):
        # Update the parameters in place; Thinc passes the weights and
        # gradient for each parameter (key identifies the parameter).
        weights -= 0.001 * gradient
    yh, finish_update = model.begin_update(X, drop=0.2)
    finish_update(y - yh, sgd)

Separating the backpropagation into three parts like this (forward pass, loss gradient, backward pass) has many advantages. The interface to all models is completely uniform: there is no distinction between the top-level model you use as a predictor and the internal models for the layers. We also make concurrency simple, by making the begin_update() step a pure function, and separating the accumulation of the gradient from the action of the optimizer.
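
For instance, because the forward pass has no side-effects, you can run it over several batches, keep the callbacks, and apply all the updates afterwards. A sketch using the closure-style model and optimizer defined earlier:

import numpy

# Assumes `model` and `optimizer` from the chain example above.
batches = [(numpy.random.uniform(size=(5, 4)), numpy.zeros((3, 4)))
           for _ in range(10)]

pending = []
for X, truth in batches:
    y, bp_y = model(X)        # forward passes are independent
    pending.append((bp_y, y - truth))

for bp_y, dY in pending:      # the weights only change here
    bp_y(dY, optimizer)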

4. Class annotations

To keep the class hierarchy shallow, Thinc uses class decorators to reuse code for layer definitions. Specifically, the following decorators are available (a sketch of the pattern follows the list):

  • describe.attributes(): Allows attributes to be specified by keyword argument. Used especially for dimensions and parameters.
  • describe.on_init(): Allows callbacks to be specified, which will be called at the end of __init__.
  • describe.on_data(): Allows callbacks to be specified, which will be called on Model.begin_training().
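
To make the pattern concrete, here's a rough sketch modeled on how Thinc's own layers use these decorators. The Dimension, Synapses, Biases and Gradient descriptors are paraphrased from memory of the v7 source rather than documented API, so treat the details as illustrative:

from thinc import describe
from thinc.describe import Dimension, Synapses, Biases, Gradient
from thinc.v2v import Model

@describe.attributes(
    nI=Dimension("Input size"),
    nO=Dimension("Output size"),
    W=Synapses("Weights matrix", lambda obj: (obj.nO, obj.nI)),
    b=Biases("Bias vector", lambda obj: (obj.nO,)),
    d_W=Gradient("W"),
    d_b=Gradient("b"),
)
class MyAffine(Model):
    # begin_update etc. would be defined here, reading W and b from
    # the attributes declared above.
    ...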

🛠 Changelog

Version Date Description
v7.3.1 2019-10-30 Relax dependency requirements
v7.3.0 2019-10-28 Mish activation and experimental optimizers
v7.2.0 2019-10-20 Simpler GPU install and bug fixes
v7.1.1 2019-09-10 Support preshed v3.0.0
v7.1.0 2019-08-23 Support other CPUs, read-only arrays
v7.0.8 2019-07-11 Fix version for PyPi
v7.0.7 2019-07-11 Avoid allocating a negative shape for ngrams
v7.0.6 2019-07-11 Fix LinearModel regression
v7.0.5 2019-07-10 Bug fixes for pickle, threading, unflatten and consistency
v7.0.4 2019-03-19 Don't require thinc_gpu_ops
v7.0.3 2019-03-15 Fix pruning in beam search
v7.0.2 2019-02-23 Fix regression in linear model class
v7.0.1 2019-02-16 Fix import errors
v7.0.0 2019-02-15 Overhaul package dependencies
v6.12.1 2018-11-30 Fix msgpack pin
v6.12.0 2018-10-15 Wheels and separate GPU ops
v6.10.3 2018-07-21 Python 3.7 support and dependency updates
v6.11.2 2018-05-21 Improve GPU installation
v6.11.1 2018-05-20 Support direct linkage to BLAS libraries
v6.11.0 2018-03-16 n/a
v6.10.2 2017-12-06 Efficiency improvements and bug fixes
v6.10.1 2017-11-15 Fix GPU install and minor memory leak
v6.10.0 2017-10-28 CPU efficiency improvements, refactoring
v6.9.0 2017-10-03 Reorganize layers, bug fix to Layer Normalization
v6.8.2 2017-09-26 Fix packaging of gpu_ops
v6.8.1 2017-08-23 Fix Windows support
v6.8.0 2017-07-25 SELU layer, attention, improved GPU/CPU compatibility
v6.7.3 2017-06-05 Fix convolution on GPU
v6.7.2 2017-06-02 Bug fixes to serialization
v6.7.1 2017-06-02 Improve serialization
v6.7.0 2017-06-01 Fixes to serialization, hash embeddings and flatten ops
v6.6.0 2017-05-14 Improved GPU usage and examples
v6.5.2 2017-03-20 n/a
v6.5.1 2017-03-20 Improved linear class and Windows fix
v6.5.0 2017-03-11 Supervised similarity, fancier embedding and improvements to linear model
v6.4.0 2017-02-15 n/a
v6.3.0 2017-01-25 Efficiency improvements, argument checking and error messaging
v6.2.0 2017-01-15 Improve API and introduce overloaded operators
v6.1.3 2017-01-10 More neural network functions and training continuation
v6.1.2 2017-01-09 n/a
v6.1.1 2017-01-09 n/a
v6.1.0 2017-01-09 n/a
v6.0.0 2016-12-31 Add thinc.neural for NLP-oriented deep learning
