Wrappers for including pre-trained transformers in spaCy pipelines

Project description

spaCy-wrap: For Wrapping fine-tuned transformers in spaCy pipelines

spaCy-wrap is a minimal library for wrapping fine-tuned transformers from the Hugging Face model hub in your spaCy pipeline, making it easy to include existing models in spaCy workflows.

As far as possible, it follows the same API as spacy-transformers.

Installation

Installing spacy-wrap is simple using pip:

pip install spacy_wrap
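
Importing spacy_wrap is what registers its components with spaCy, so a quick way to verify the install (a minimal sketch using spaCy's Language.has_factory) is:

import spacy_wrap  # noqa: F401 -- importing registers the component factories
from spacy.language import Language

# if the install worked, the factory used in the examples below is registered
assert Language.has_factory("sequence_classification_transformer")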

Examples

The following shows simple examples of how you can quickly add a fine-tuned transformer model from the Hugging Face model hub for text classification, named entity recognition, or token classification.

Sequence Classification

In this example, we will use a model fine-tuned for sentiment classification on SST2. This model classifies whether a text is positive or negative. We will add this model to a blank English pipeline:

import spacy
import spacy_wrap

nlp = spacy.blank("en")

config = {
    "doc_extension_trf_data": "clf_trf_data",  # document extention for the forward pass
    "doc_extension_prediction": "sentiment",  # document extention for the prediction
    "model": {
        # the model name or path of huggingface model
        "name": "distilbert-base-uncased-finetuned-sst-2-english",  
    },
}

transformer = nlp.add_pipe("sequence_classification_transformer", config=config)

doc = nlp("spaCy is a wonderful tool")

print(doc.cats)
# {'NEGATIVE': 0.001, 'POSITIVE': 0.999}
print(doc._.sentiment)
# 'POSITIVE'
print(doc._.clf_trf_data)
# TransformerData(wordpieces=...
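
The component also exposes the raw class probabilities. The Danish example further down shows a companion "<doc_extension_prediction>_prob" attribute (doc._.hate_speech_prob); assuming the same naming convention applies here, the scores would be available as:

# assumes the "<doc_extension_prediction>_prob" naming convention
# seen in the Danish example below
print(doc._.sentiment_prob)
# {'prob': array([0.001, 0.999], dtype=float32), 'labels': ['NEGATIVE', 'POSITIVE']}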

These pipelines can also easily be applied to multiple documents using nlp.pipe, as one would expect from a spaCy component:

docs = nlp.pipe(
    [
        "I hate wrapping my own models",
        "Isn't there a tool for this?!",
        "spacy-wrap is great for wrapping models",
    ]
)

for doc in docs:
    print(doc._.sentiment)
# 'NEGATIVE'
# 'NEGATIVE'
# 'POSITIVE'
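
nlp.pipe is the standard spaCy method, so the usual batching arguments apply; for larger corpora you can tune batch_size to trade memory for throughput:

texts = [
    "I hate wrapping my own models",
    "Isn't there a tool for this?!",
]

# batch_size is a standard argument of spaCy's Language.pipe
for doc in nlp.pipe(texts, batch_size=32):
    print(doc._.sentiment)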

More Examples

It is always nice to have more than one example. Here is another, where we add a Danish hate speech detection model to a blank Danish pipeline:

import spacy
import spacy_wrap

nlp = spacy.blank("da")

config = {
    "doc_extension_trf_data": "clf_trf_data",  # document extention for the forward pass
    "doc_extension_prediction": "hate_speech",  # document extention for the prediction
    # choose custom labels
    "labels": ["Not hate Speech", "Hate speech"],
    "model": {
        "name": "DaNLP/da-bert-hatespeech-detection",  # the model name or path of huggingface model
    },
}

transformer = nlp.add_pipe("sequence_classification_transformer", config=config)

doc = nlp("Senile gamle idiot") # old senile idiot

doc._.clf_trf_data
# TransformerData(wordpieces=...
doc._.hate_speech
# "Hate speech"
doc._.hate_speech_prob
# {'prob': array([0.013, 0.987], dtype=float32), 'labels': ['Not hate Speech', 'Hate speech']}
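
Since the _prob extension returns parallel arrays of probabilities and labels (as shown above), mapping each label to its score is a one-liner:

probs = doc._.hate_speech_prob
# pair each label with its probability:
# {'Not hate Speech': 0.013, 'Hate speech': 0.987}
scores = dict(zip(probs["labels"], probs["prob"].tolist()))
print(scores)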

Token Classification

We can also wrap a model fine-tuned for token classification:

import spacy
import spacy_wrap
nlp = spacy.blank("en")

config = {"model": {"name": "vblagoje/bert-english-uncased-finetuned-pos"}}

nlp.add_pipe("token_classification_transformer", config=config)

text = "My name is Wolfgang and I live in Berlin"

doc = nlp(text)
doc._.tok_clf_predictions
# ['PRON', 'NOUN', 'AUX', 'PROPN', 'CCONJ', 'PRON', 'VERB', 'ADP', 'PROPN']
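
The predictions come back as one label per token, so you can zip them with the doc to inspect individual tokens (a small sketch based on the output above):

for token, label in zip(doc, doc._.tok_clf_predictions):
    print(token.text, label)
# My PRON
# name NOUN
# ...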

By default, spacy-wrap will automatically detect whether the labels follow the Universal POS tag scheme. If so, it will also assign the tags to token.pos_, just like regular spaCy pipelines:

doc[0].pos_
# 'PRON'
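
Because the tags land on the standard token.pos_ attribute, downstream spaCy-native code works unchanged, e.g. pulling out the proper nouns:

# plain spaCy attribute access; nothing spacy-wrap-specific is needed
proper_nouns = [token.text for token in doc if token.pos_ == "PROPN"]
print(proper_nouns)
# ['Wolfgang', 'Berlin']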

Named Entity Recognition

In this example, we use a model fine-tuned for named entity recognition. In this case, spacy-wrap will infer from the IOB tags that the model is intended for named entity recognition and assign the predicted entities to doc.ents:

import spacy
import spacy_wrap
nlp = spacy.blank("en")

# specify model from the hub
config = {"model": {"name": "dslim/bert-base-NER"}}

# add it to the pipe
nlp.add_pipe("token_classification_transformer", config=config)

doc = nlp("My name is Wolfgang and I live in Berlin.")

print(doc.ents)
# (Wolfgang, Berlin)
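
The entities in doc.ents are ordinary spaCy Span objects, so the usual attributes apply (the labels here follow from the model's B-PER/B-LOC tags):

for ent in doc.ents:
    print(ent.text, ent.label_)
# Wolfgang PER
# Berlin LOC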

📖 Documentation

  • 🔧 Installation: installation instructions for spacy-wrap.
  • 📰 News and changelog: new additions, changes, and version history.
  • 🎛 Documentation: the reference for spacy-wrap's API.

💬 Where to ask questions

  • 🚨 FAQ: FAQ
  • 🚨 Bug Reports: GitHub Issue Tracker
  • 🎁 Feature Requests & Ideas: GitHub Issue Tracker
  • 👩‍💻 Usage Questions: GitHub Discussions
  • 🗯 General Discussion: GitHub Discussions

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

spacy-wrap-1.2.1.tar.gz (19.7 kB)

Built Distribution

spacy_wrap-1.2.1-py2.py3-none-any.whl (24.3 kB)

File details

Details for the file spacy-wrap-1.2.1.tar.gz.

File metadata

  • Download URL: spacy-wrap-1.2.1.tar.gz
  • Upload date:
  • Size: 19.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for spacy-wrap-1.2.1.tar.gz:

  • SHA256: fb58f0e5e8af402a274685f4fd214c8b5a236e7ed8a2d23d1303ee691e1dc3f5
  • MD5: 3dc4f4c1e44ce73505ee4209b78ea4bd
  • BLAKE2b-256: 9a834c79d27c66971381810a9c2869a0059eca25cecb6643596e31d1973541f4

See more details on using hashes here.

File details

Details for the file spacy_wrap-1.2.1-py2.py3-none-any.whl.

File metadata

  • Download URL: spacy_wrap-1.2.1-py2.py3-none-any.whl
  • Upload date:
  • Size: 24.3 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for spacy_wrap-1.2.1-py2.py3-none-any.whl:

  • SHA256: 368a7c5dc8f49d99b672bd65e9a6931db94a2fedd4874f38400268479e359182
  • MD5: 86ce19f57d010ae8c71dda877ab94969
  • BLAKE2b-256: f6d053f1d98992a4d4cf453e60a4fdef2f64d2a923272303203c120b98a277b0

See more details on using hashes here.
