
Paradigm learning and paradigm prediction

Project description

NB!

This is Språkbanken's unofficial version of the paradigmextract library. The main version can be found here.

---*--- ---*--- ---*---


The software in this repository accompanies a body of scientific work on paradigm learning and paradigm prediction, of which the following publication is the most recent. See its reference list for previous work.

[Forsberg, M. and Hulden, M. (2016). Learning Transducer Models for Morphological Analysis from Example Inflections. In Proceedings of StatFSM. Association for Computational Linguistics.](http://anthology.aclweb.org/W16-2405)

Quick reference

Paradigm learning: pextract.py

Description

Extract paradigmatic representations from input inflection tables. See Section 2 in Forsberg and Hulden (2016) for details.
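The core idea is to factor the shared string material of an inflection table out into variables, leaving only the inflectional patterns. As a rough illustration (not the library's actual LCS-based algorithm, which handles multiple discontinuous variables), the following sketch abstracts a table over its longest common prefix:

```python
def abstract_table(forms):
    """Abstract an inflection table into (variable, patterns) by factoring
    out the longest common prefix as a single variable. A simplification of
    the multi-variable LCS extraction described in the paper."""
    prefix = forms[0]
    for f in forms[1:]:
        while not f.startswith(prefix):
            prefix = prefix[:-1]
    return prefix, [f[len(prefix):] for f in forms]

# The Spanish forms hablar/hablo/habla share the stem variable "habl"
print(abstract_table(["hablar", "hablo", "habla"]))  # → ('habl', ['ar', 'o', 'a'])
```

Two tables that yield the same patterns after abstraction belong to the same paradigm, which is what makes the extracted representation generalize to unseen words.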

Example

$ python src/pextract.py < data/es_verb_train.txt > es_verb.p

Non-probabilistic morphological analyzers: morphanalyzer.py

Description

Create a foma-compatible morphological analyzer from a paradigm file. The analyzer is non-probabilistic.

Options:

  • -o recreate original data (all vars must be exactly instantiated as seen in training data)
  • -c constrain variables by generalizing (default pvalue = 0.05)
  • -u unconstrained (all variables are defined as ?+)
  • -p set the p-value for the generalization test (use together with -c)
  • -s keep different analyzers separate instead of merging with priority union (may be necessary for some analyzers)
  • -n name of binary foma file to compile to

Any combination of the above may be used. The analyzers are combined by priority union, e.g. -o -c -u would yield an analyzer [ Goriginal .P. Gconstrained .P. Gunconstrained ].
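Priority union means an earlier grammar takes precedence, and a word only falls through to the next one if the earlier grammars return no analyses. A toy sketch of that control flow in Python (the analyzers here are hypothetical stand-ins, not the foma transducers the tool actually compiles):

```python
def priority_union(*analyzers):
    """Combine analyzer functions so that an earlier one takes precedence:
    a word falls through to the next analyzer only if every previous one
    returned no analyses (mimicking foma's .P. operator)."""
    def combined(word):
        for analyze in analyzers:
            result = analyze(word)
            if result:
                return result
        return []
    return combined

# Hypothetical toy analyzers: exact training data first, then a fallback.
original = lambda w: ["seen in training"] if w == "hablo" else []
fallback = lambda w: ["guess"]
g = priority_union(original, fallback)
print(g("hablo"), g("cojo"))  # → ['seen in training'] ['guess']
```

This ordering is why -o -c -u is useful in practice: exact matches beat constrained generalizations, which beat the unconstrained catch-all.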

Example

$ python src/morphanalyzer.py -o -c es_verb.p > es_verb.foma

Probabilistic morphological analyzers: morphparser.py

Description

Create a probabilistic morphological analyzer from a paradigm file.

Reads one or more whitespace-separated words from STDIN and returns the most plausible analysis for the set in the format: SCORE NAME_OF_PARADIGM VARIABLES WORDFORM1:BASEFORM,MSD#WORDFORM2:BASEFORM,MSD...
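If you consume this output from another program, a small parser helps. The sketch below assumes the field layout given above, with one whitespace-free VARIABLES token and #-separated table entries; the sample line and its paradigm name are invented for illustration:

```python
def parse_analysis(line):
    """Parse one analysis line of the assumed format
    SCORE NAME_OF_PARADIGM VARIABLES WF1:BASE,MSD#WF2:BASE,MSD..."""
    score, name, variables, forms = line.split(" ", 3)
    table = []
    for entry in forms.split("#"):
        wordform, rest = entry.split(":")
        baseform, msd = rest.split(",", 1)
        table.append((wordform, baseform, msd))
    return float(score), name, variables, table

# Hypothetical output line for the coger/cojo example
line = "-12.5 p_comer 1=cog coger:coger,inf#cojo:coger,pres-1sg"
print(parse_analysis(line))
```

Check the layout against your actual morphparser.py output before relying on this; in particular, paradigms with several variables may print more than one variable token.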

Flags:

  • -k num print the k best analyses
  • -t print the entire table for the best analysis
  • -d print debug info
  • -n num use an nth order ngram model for selecting best paradigm (an n-gram model for variables in the paradigm is used)
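The intuition behind the -n option is that plausible variable instantiations look like attested stems. As an illustrative stand-in (not the model the tool implements), here is a tiny add-alpha smoothed character bigram scorer over training stems:

```python
import math
from collections import Counter

def ngram_logprob(word, counts, n=2, alpha=1.0):
    """Score a candidate variable string with an add-alpha smoothed
    character n-gram model (a toy stand-in for the -n option's model)."""
    padded = "#" * (n - 1) + word + "#"
    vocab = len({c for gram in counts for c in gram}) + 1
    logp = 0.0
    for i in range(len(padded) - n + 1):
        gram = padded[i:i + n]
        hist_total = sum(v for g, v in counts.items() if g[:-1] == gram[:-1])
        logp += math.log((counts[gram] + alpha) / (hist_total + alpha * vocab))
    return logp

# Train bigram counts on a few attested Spanish stems (illustrative data)
train = ["habl", "cant", "com", "cog"]
counts = Counter()
for w in train:
    p = "#" + w + "#"
    for i in range(len(p) - 1):
        counts[p[i:i + 2]] += 1

# A stem-like candidate scores higher than an implausible one
print(ngram_logprob("cog", counts) > ngram_logprob("xqz", counts))  # → True
```

The real tool uses such scores to rank which paradigm (and which variable instantiation) best explains the input word forms.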

Example

$ echo "coger cojo" | python morphparser.py ./../paradigms/spanish_verbs.p -k 1 -t

