
Fully customizable language detection for spaCy pipelines

Project description

spacy_language_detection

spacy_language_detection is a fully customizable language detection component for spaCy pipelines. It is a fork of spacy-langdetect that fixes the seed problem (see this issue) and updates the package for spaCy 3.0.
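The seed problem can be sketched directly against langdetect (illustrative only; `detect_deterministically` is a made-up helper name, and pinning `DetectorFactory.seed` is what the `seed` argument is assumed to do under the hood):

```python
# Sketch of the seed problem this fork addresses: langdetect injects random
# noise into detection, so short or mixed-language text can come back with
# different languages across runs unless the global seed is pinned.
def detect_deterministically(text, seed=42):
    from langdetect import DetectorFactory, detect  # pip install langdetect
    DetectorFactory.seed = seed  # pin the RNG so repeated calls agree
    return detect(text)
```

With the seed pinned, repeated calls on the same text return the same language code on every run, which is what `LanguageDetector(seed=42)` is meant to guarantee inside the pipeline.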

Use spacy_language_detection to

  • Detect the language of a document,
  • Detect the language of the sentences of a document.

Installation

pip install spacy-language-detection

Basic Usage

Out of the box, it uses langdetect under the hood to detect languages on spaCy's Doc and Span objects.

Here is how to use it with spaCy 3.0; see here for an example with spaCy 2.0.

import spacy
from spacy.language import Language

from spacy_language_detection import LanguageDetector


def get_lang_detector(nlp, name):
    return LanguageDetector(seed=42)  # We use the seed 42


nlp_model = spacy.load("en_core_web_sm")
Language.factory("language_detector", func=get_lang_detector)
nlp_model.add_pipe('language_detector', last=True)

# Document level language detection
job_title = "Senior NLP Research Engineer"
doc = nlp_model(job_title)
language = doc._.language
print(language)  # a dict like {'language': ..., 'score': ...}

# Sentence level language detection
text = "This is English text. Er lebt mit seinen Eltern und seiner Schwester in Berlin. Yo me divierto todos los días en el parque. Je m'appelle Angélica Summer, j'ai 12 ans et je suis canadienne."
doc = nlp_model(text)
for sent in doc.sents:
    print(sent, sent._.language)

Using your own language detector

Suppose you are not happy with the accuracy of the out-of-the-box language detector, or you have your own language detector, which you want to use with a spaCy pipeline. How do you do it? That's where the language_detection_function argument comes in. The function takes in a spaCy Doc or Span object and can return any Python object which is stored in doc._.language and span._.language. For example, let's say you want to use googletrans as your language detection module:

import spacy
from spacy.language import Language
from spacy.tokens import Doc, Span

from spacy_language_detection import LanguageDetector
# install with: pip install googletrans
from googletrans import Translator


def custom_detection_function(spacy_object):
    # A custom detection function must take a spaCy Doc or a Span
    assert isinstance(spacy_object, (Doc, Span)), \
        "spacy_object must be a spacy Doc or Span object but it is a {}".format(type(spacy_object))
    detection = Translator().detect(spacy_object.text)
    return {'language': detection.lang, 'score': detection.confidence}


def get_lang_detector(nlp, name):
    return LanguageDetector(language_detection_function=custom_detection_function, seed=42)  # We use the seed 42


nlp_model = spacy.load("en_core_web_sm")
Language.factory("language_detector", func=get_lang_detector)
nlp_model.add_pipe('language_detector', last=True)

text = "This is English text. Er lebt mit seinen Eltern und seiner Schwester in Berlin. Yo me divierto todos los días en el parque. Je m'appelle Angélica Summer, j'ai 12 ans et je suis canadienne."

# Document level language detection
doc = nlp_model(text)
language = doc._.language
print(language)

# Sentence level language detection (reusing the same text)
doc = nlp_model(text)
for sent in doc.sents:
    print(sent, sent._.language)

Similarly, you can also use pycld2 and other language detectors with spaCy.
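For instance, a pycld2-based function could follow the same pattern (a sketch assuming `pip install pycld2`; `cld2_to_language` is an illustrative helper, and the `details` shape reflects pycld2's documented `(languageName, languageCode, percent, score)` tuples):

```python
# Sketch of a pycld2-backed detection function. pycld2.detect returns
# (isReliable, bytesFound, details), where details is a tuple of entries
# like ('ENGLISH', 'en', 95, 1736.0); we keep only the best guess.
def cld2_to_language(details):
    _name, code, percent, _score = details[0]
    return {'language': code, 'score': percent / 100.0}


def custom_detection_function(spacy_object):
    import pycld2  # imported lazily so cld2_to_language stays library-free
    _is_reliable, _bytes_found, details = pycld2.detect(spacy_object.text)
    return cld2_to_language(details)
```

The returned dict mirrors the googletrans example above, so the rest of the pipeline setup is unchanged.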
