
COMBO-NLP

A library for Morphosyntactic Tagging and Dependency Parsing based on Universal Dependencies.

Installation

pip install combo-nlp

LAMBO segmenter (optional)

A segmenter is only needed when passing raw text strings to COMBO. If you provide pre-tokenized input (list[str] or list[list[str]]), no segmenter is required.

When you initialize COMBO with a language name (e.g. COMBO("Polish")), it automatically loads a LAMBO segmenter. If LAMBO is not installed, an ImportError is raised. LAMBO is hosted on a custom PyPI index and must be installed separately:

pip install --index-url https://pypi.clarin-pl.eu/ lambo
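Since the segmenter is optional, you may want to verify that LAMBO is importable before constructing a segmenter-backed pipeline. This is a minimal sketch, not part of COMBO's API; it only checks module availability with the standard library:

```python
import importlib.util

def lambo_available() -> bool:
    """Return True if the lambo package is importable in this environment."""
    return importlib.util.find_spec("lambo") is not None
```

If this returns False, calls like `COMBO("Polish")` will raise an ImportError until LAMBO is installed from the custom index above.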

Usage

Full text input

from combo import COMBO

# Load by HuggingFace model ID:
nlp = COMBO.from_pretrained("clarin-pl/combo-nlp-xlm-roberta-base-polish-pdb-ud2.17")
result = nlp("Ala ma kota.")

# Or load by language name (with LAMBO segmenter):
nlp = COMBO("Polish")
result = nlp("Ala ma kota.")

# Or use the Language enum:
from combo import Language
nlp = COMBO(Language.POLISH)
result = nlp("Ala ma kota.")

# Multiple sentences:
result = nlp(["Ala ma kota.", "Pies je."])

# Access results:
for sentence in result:
    for token in sentence:
        print(token.form, token.upos, token.head, token.deprel, token.lemma)
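The token attributes accessed above map directly onto CoNLL-U columns (ID, FORM, LEMMA, UPOS, XPOS, FEATS, HEAD, DEPREL, DEPS, MISC). As an illustration, the sketch below formats such tokens as CoNLL-U lines; the `Token` dataclass is a stand-in for COMBO's token object, using only the attribute names shown in the loop above plus an `id` field:

```python
from dataclasses import dataclass

@dataclass
class Token:
    # Stand-in for a COMBO token; field names match the attributes
    # accessed in the result loop, plus an id for the first column.
    id: int
    form: str
    lemma: str
    upos: str
    head: int
    deprel: str

def to_conllu_line(t: Token) -> str:
    # CoNLL-U column order: ID FORM LEMMA UPOS XPOS FEATS HEAD DEPREL DEPS MISC
    # Columns not carried by this stand-in are left as "_".
    return f"{t.id}\t{t.form}\t{t.lemma}\t{t.upos}\t_\t_\t{t.head}\t{t.deprel}\t_\t_"

print(to_conllu_line(Token(1, "Ala", "Ala", "PROPN", 2, "nsubj")))
```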

Pre-tokenized input

from combo import COMBO

nlp = COMBO.from_pretrained("clarin-pl/combo-nlp-xlm-roberta-base-polish-pdb-ud2.17")

# Single sentence:
result = nlp(["Ala", "ma", "kota", "."], tokenized=True)

# Multiple sentences:
result = nlp([["Ala", "ma", "kota", "."], ["Pies", "je", "."]], tokenized=True)
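Note that a `list[str]` is ambiguous on its own: without `tokenized=True` it is a batch of raw sentences, with it a single pre-tokenized sentence. A hypothetical helper (not part of COMBO) can make the dispatch explicit by normalizing every accepted shape into a batch:

```python
def normalize(data, tokenized: bool = False):
    """Sketch of how COMBO's input shapes could map to a batch.

    str                       -> batch of one raw text
    list[str], tokenized      -> batch of one pre-tokenized sentence
    list[str], not tokenized  -> batch of raw sentences (unchanged)
    list[list[str]]           -> batch of pre-tokenized sentences (unchanged)
    """
    if isinstance(data, str):
        return [data]
    if tokenized and data and isinstance(data[0], str):
        return [data]
    return list(data)
```

For example, `normalize(["Ala", "ma", "kota", "."], tokenized=True)` wraps the single sentence into a one-element batch, while the same list without the flag is passed through as two-plus raw sentences.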
