
UITNLP: A Python NLP Library for Vietnamese

Installation

You can install this package from PyPI using pip:

$ pip install uit-tokenizer

Example

#!/usr/bin/python
# -*- coding: utf-8 -*-
from uit_tokenizer import load_word_segmenter

# Load the word segmenter with the 'base_sep_sfx' feature set.
word_segmenter = load_word_segmenter(feature_name='base_sep_sfx')
# Segment a raw (not pre-tokenized) Vietnamese sentence, processing up to 4 texts per batch.
word_segmenter.segment(texts=['Chào mừng bạn đến với Trường Đại học Công nghệ Thông tin, ĐHQG-HCM.'], pre_tokenized=False, batch_size=4)
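The segment call above also exposes a pre_tokenized flag. The following is a minimal sketch under the assumption that pre_tokenized=True lets you pass input that is already split into syllables (the list-of-syllables format shown here is an assumption, not documented behavior):

#!/usr/bin/python
# -*- coding: utf-8 -*-
from uit_tokenizer import load_word_segmenter

# Assumption: with pre_tokenized=True, each item in `texts` is a list of syllables.
word_segmenter = load_word_segmenter(feature_name='base_sep_sfx')
word_segmenter.segment(
    texts=[['Chào', 'mừng', 'bạn', 'đến', 'với', 'Trường', 'Đại', 'học', 'Công', 'nghệ', 'Thông', 'tin', ',', 'ĐHQG-HCM', '.']],
    pre_tokenized=True,
    batch_size=4
)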

Note

Currently, we have only wrapped the Vietnamese word segmentation method published in our paper below:

@InProceedings{10.1007/978-981-15-6168-9_33,
  author    = "Nguyen, Duc-Vu and Van Thin, Dang and Van Nguyen, Kiet and Nguyen, Ngan Luu-Thuy",
  editor    = "Nguyen, Le-Minh and Phan, Xuan-Hieu and Hasida, K{\^o}iti and Tojo, Satoshi",
  title     = "Vietnamese Word Segmentation with SVM: Ambiguity Reduction and Suffix Capture",
  booktitle = "Computational Linguistics",
  year      = "2020",
  publisher = "Springer Singapore",
  address   = "Singapore",
  pages     = "400--413",
  abstract  = "In this paper, we approach Vietnamese word segmentation as a binary classification by using the Support Vector Machine classifier. We inherit features from prior works such as n-gram of syllables, n-gram of syllable types, and checking conjunction of adjacent syllables in the dictionary. We propose two novel ways to feature extraction, one to reduce the overlap ambiguity and the other to increase the ability to predict unknown words containing suffixes. Different from UETsegmenter and RDRsegmenter, two state-of-the-art Vietnamese word segmentation methods, we do not employ the longest matching algorithm as an initial processing step or any post-processing technique. According to experimental results on benchmark Vietnamese datasets, our proposed method obtained a better F1-score than the prior state-of-the-art methods UETsegmenter, and RDRsegmenter.",
  isbn      = "978-981-15-6168-9"
}

