
Project description

SpaCy Affixes


spaCy support for splitting affixes using Freeling-like affix rules and dictionaries.

Usage

This library was born to split clitics from verbs so that POS tagging works out of the box with spaCy models.

import spacy
from spacy_affixes import AffixesMatcher

nlp = spacy.load("es")
affixes_matcher = AffixesMatcher(nlp, split_on=["VERB"])
nlp.add_pipe(affixes_matcher, name="affixes", before="tagger")
for token in nlp("Hay que hacérselo todo, y rápidamente además."):
    print(
        token.text,
        token.lemma_,
        token.pos_,
        token.tag_,
        token._.has_affixes,
        token._.affixes_rule,
        token._.affixes_kind,
        token._.affixes_text,
        token._.affixes_length,
    )

The output will be:

Hay Hay AUX AUX__Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin False None None None 0
que que SCONJ SCONJ___ False None None None 0
hacér hacer VERB  True suffix_selo suffix hacer 2
se se PRON PRON__Person=3 False None None None 0
lo el PRON PRON__Case=Acc|Gender=Masc|Number=Sing|Person=3|PronType=Prs False None None None 0
todo todo PRON PRON__Gender=Masc|Number=Sing|PronType=Ind False prefix_todo None None 0
, , PUNCT PUNCT__PunctType=Comm False None None None 0
y y CONJ CCONJ___ False None None None 0
rápidamente rápidamente ADV ADV___ False suffix_mente None None 0
además además ADV ADV___ False prefix_a None None 0
. . PUNCT PUNCT__PunctType=Peri False None None None 0

However, words with suffixes can also be split if needed, as can virtually any word for which a rule matches, just by passing a list of Universal Dependencies POS tags to the split_on argument. Passing split_on="*" makes AffixesMatcher() try to split everything it finds.
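The effect of split_on can be sketched with a small, hypothetical predicate. Note that should_split is not part of the spacy-affixes API; it only illustrates the filtering rule described above.

```python
# Hypothetical sketch of the split_on filter; should_split is an
# illustrative name, not a function exposed by spacy-affixes.
def should_split(token_pos, split_on):
    """Split a token only if its UD POS is listed, or if split_on is "*"."""
    return split_on == "*" or token_pos in split_on

print(should_split("VERB", ["VERB"]))  # splits: VERB is listed
print(should_split("ADV", ["VERB"]))   # kept whole: ADV is not listed
print(should_split("ADV", "*"))        # splits: "*" matches everything
```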

Rules and Lexicon

Due to licensing issues, spacy-affixes comes with no rules nor lexicons by default. There are two ways of getting data into spacy-affixes:

  1. Create the rules and lexicon yourself with the entries you are interested in, and pass them in using AffixesMatcher(nlp, rules=<rules>, dictionary=<dictionary>). The format for these is as follows.

    • rules: Dictionary of rules for affix handling, keyed by the pattern to match. Each value is a list of dicts with the corresponding rule parameters:

      • pattern: Regular expression to match (e.g., r"ito$"). If a match is found, it is removed from the token

      • kind: AFFIXES_SUFFIX or AFFIXES_PREFIX

      • pos_re: EAGLE regular expression to match (e.g., r"V")

      • strip_accent: Boolean indicating whether accents should be stripped from the remainder of the token before looking it up in the lexicon

      • affix_add: List of strings to add to the rest of the token to find it in the lexicon. Each element in the list is tried separately, as in an OR condition. The character * means add nothing (ex. ["*", "io"])

      • affix_text: List of strings with the text to add after the rest of the token as individual tokens. For example, a rule for dígamelo might have ["me", "lo"] as its affix_text

    • lexicon: Dictionary keyed by word with values for lemma, EAGLE code, UD POS, and UD Tags.

  2. Convert the Freeling data. Take into account that if you use Freeling data you are effectively agreeing to its license, which might have implications for the release of your own code. If Freeling is installed, spacy-affixes will look for the environment variables FREELINGDIR or FREELINGSHARE to find the affixes rules and dictionary files and will process them. If you don’t have Freeling installed, you can always run the download command:

python -m spacy_affixes download <lang> <version>

Where lang is the two-character ISO 639-1 code for a supported language, and version is a tagged version in the Freeling GitHub repository.
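Putting the rule and lexicon formats together, the splitting procedure can be sketched end to end. The rule below is a hypothetical, minimal example in the documented format; RULES, LEXICON, and split_token are illustrative names, not the library's internals, and the real rules and lexicon ship with Freeling.

```python
import re
import unicodedata

# Hypothetical rule for the "selo" clitic cluster, in the documented format.
RULES = {
    "selo": [{
        "pattern": r"selo$",         # matched text is removed from the token
        "kind": "suffix",
        "pos_re": r"V",              # EAGLE tag must start with V (verb)
        "strip_accent": True,        # "hacér" -> "hacer" before the lookup
        "affix_add": ["*"],          # "*" means add nothing to the remainder
        "affix_text": ["se", "lo"],  # clitics emitted as individual tokens
    }],
}

# Minimal lexicon entry: word -> lemma, EAGLE code, UD POS.
LEXICON = {"hacer": {"lemma": "hacer", "eagle": "VMN0000", "pos": "VERB"}}

def strip_accents(text):
    """Remove combining accent marks (e.g., 'hacér' -> 'hacer')."""
    return "".join(c for c in unicodedata.normalize("NFD", text)
                   if unicodedata.category(c) != "Mn")

def split_token(word):
    """Return the split tokens for `word`, or None if no rule applies."""
    for rules in RULES.values():
        for rule in rules:
            match = re.search(rule["pattern"], word)
            if not match:
                continue
            rest = word[:match.start()]
            if rule["strip_accent"]:
                rest = strip_accents(rest)
            # Try each affix_add alternative, as in an OR condition.
            for add in rule["affix_add"]:
                candidate = rest if add == "*" else rest + add
                entry = LEXICON.get(candidate)
                if entry and re.match(rule["pos_re"], entry["eagle"]):
                    return [candidate] + rule["affix_text"]
    return None

print(split_token("hacérselo"))  # -> ['hacer', 'se', 'lo']
print(split_token("comer"))      # -> None (no rule matches)
```

This mirrors the steps the rule parameters describe: remove the matched pattern, optionally strip accents, try each affix_add alternative against the lexicon, check the EAGLE code against pos_re, and emit the affix_text entries as separate tokens.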

Notes

  • Some decisions might feel idiosyncratic since the purpose of this library at the beginning was to just split clitics in Spanish texts.

History

0.1.0 (2019-04-02)

  • First release on PyPI.
