
Project description


g2pC: A Context-aware Grapheme-to-Phoneme for Chinese

There are several open-source libraries for Chinese grapheme-to-phoneme conversion, such as python-pinyin or xpinyin. However, none of them seems to disambiguate Chinese polyphonic words like "行" ("xíng" (go, walk) vs. "háng" (line)) or "了" ("le" (completed-action marker) vs. "liǎo" (finish, achieve)). Instead, they simply pick the most frequent pronunciation. Although that is a simple and economical strategy, machine learning can do better. g2pC uses a CRF (conditional random field) to determine the pronunciation of polyphonic words. In addition to the target word itself and its part-of-speech, tagged by pkuseg, its neighboring words are also used as features.
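
To make the idea concrete, below is a minimal sketch of how such a context-aware CRF could be trained with sklearn_crfsuite over pkuseg-style (word, part-of-speech) tokens. The feature names, the one-token context window, and the toy training data are assumptions for illustration, not g2pC's actual feature template:

import sklearn_crfsuite

def featurize(tokens, i):
    """Feature dict for the i-th (word, pos) token, including its neighbors."""
    word, pos = tokens[i]
    feats = {"word": word, "pos": pos}
    if i > 0:
        feats["-1:word"], feats["-1:pos"] = tokens[i - 1]
    if i < len(tokens) - 1:
        feats["+1:word"], feats["+1:pos"] = tokens[i + 1]
    return feats

# Toy training data: one segmented sentence and its per-token pinyin labels.
sents = [[("我", "r"), ("写", "v"), ("了", "u"), ("几", "m"), ("行", "q"), ("代码", "n")]]
labels = [["wo3", "xie3", "le5", "ji3", "hang2", "dai4 ma3"]]

X = [[featurize(s, i) for i in range(len(s))] for s in sents]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X, labels)
print(crf.predict(X))  # per-token pinyin predictions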

Requirements

  • python >= 3.6
  • pkuseg
  • sklearn_crfsuite

Installation

pip install g2pc

Main Features

  • Disambiguate polyphonic Chinese characters/words and return the most likely pinyin in context, using a CRF implemented with sklearn_crfsuite.
  • Associate the segmentation results of pkuseg with the open-source dictionary CC-CEDICT and return the following information for each word:
    • word
    • part-of-speech
    • pinyin
    • descriptive pinyin: pinyin with Chinese tone change rules applied
    • English meaning
    • traditional equivalent
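
CC-CEDICT itself is a plain-text dictionary in which each entry has the form "TRADITIONAL SIMPLIFIED [pin1 yin1] /gloss 1/gloss 2/". As a small illustration of how such entries map onto the candidate lists above (this is not g2pC's internal loader, just a sketch of the file format):

import re
from collections import defaultdict

def parse_cedict_line(line):
    """Parse one CC-CEDICT entry: 'TRAD SIMP [pin1 yin1] /gloss1/gloss2/'."""
    m = re.match(r"(\S+) (\S+) \[([^\]]+)\] (/.+/)", line)
    if m is None:
        return None
    trad, simp, pinyin, meaning = m.groups()
    return simp, (pinyin, meaning, trad)

# Map each simplified word to all of its candidate readings.
cedict = defaultdict(list)
for line in ["寫 写 [xie3] /to write/", "行 行 [xing2] /to walk/", "行 行 [hang2] /row/line/"]:
    key, value = parse_cedict_line(line)
    cedict[key].append(value)

print(cedict["行"])  # two candidate (pinyin, meaning, traditional) tuples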

Algorithm (illustrated with an example)

e.g., Input: 我写了几行代码。 (I wrote a few lines of code.)

  • STEP 1. Segment input string using pkuseg.

    • -> [('我', 'r'), ('写', 'v'), ('了', 'u'), ('几', 'm'), ('行', 'q'), ('代码', 'n'), ('。', 'w')]
  • STEP 2. Look up the CC-CEDICT. Each token is a tuple of (word, pos, pronunciation candidates, meaning candidates, traditional character candidates).

    • -> [('我', 'r', ['wo3'], ['/I/me/my/'], ['我']),
      ('写', 'v', ['xie3'], ['/to write/'], ['寫']),
      ('了', 'u', ['le5', 'liao3', 'liao4'], ['/(modal particle ..'], ['了', '了', '瞭']),
      ('几', 'm', ['ji3', 'ji1'], ['/how much/..'], ['幾', '几']),
      ('行', 'q', ['xing2', 'hang2'], ['/to walk/..'], ['行', '行']),
      ('代码', 'n', ['dai4 ma3'], ['/code/'], ['代碼']),
      ('。', 'w', ['。'], [''], ['。'])]
  • STEP 3. Disambiguate polyphonic words using the pre-trained CRF model.

    • -> [('我', 'r', 'wo3', '/I/me/my/', '我'),
      ('写', 'v', 'xie3', '/to write/', '寫'),
      ('了', 'u', 'le5', '/(modal particle ..', '了'),
      ('几', 'm', 'ji3', '/how much/..', '幾'),
      ('行', 'q', 'hang2', '/row/..', '行'),
      ('代码', 'n', 'dai4 ma3', '/code/', '代碼'),
      ('。', 'w', '。', '', '。')]
  • STEP 4. Apply tone change rules (a rough sketch of such rules follows this list).

    • -> [('我', 'r', 'wo3', 'wo2', '/I/me/my/', '我'),
      ('写', 'v', 'xie3', 'xie3', '/to write/', '寫'),
      ('了', 'u', 'le5', 'le5', '/(modal particle ..', '了'),
      ('几', 'm', 'ji3', 'ji3', '/how much/..', '幾'),
      ('行', 'q', 'hang2', 'hang2', '/row/..', '行'),
      ('代码', 'n', 'dai4 ma3', 'dai4 ma3', '/code/', '代碼'),
      ('。', 'w', '。', '。', '', '。')]
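
The tone changes in STEP 4 follow standard Mandarin sandhi: a third tone before another third tone is read as a second tone (wo3 xie3 -> wo2 xie3), and 一 is read as yi2 before a fourth tone and yi4 before tones 1, 2, and 3 (yi1 xin1 yi1 yi4 -> yi4 xin1 yi2 yi4, as in the Usage example below). Here is a rough standalone sketch of just these two rules over numbered-pinyin syllables; g2pC's actual rule set may cover more cases:

def apply_sandhi(syllables, hanzi):
    """Apply two common Mandarin tone-sandhi rules to numbered-pinyin syllables."""
    out = list(syllables)
    for i in range(len(out) - 1):
        next_tone = out[i + 1][-1]
        # Rule 1: a third tone before another third tone becomes a second tone.
        if out[i].endswith("3") and next_tone == "3":
            out[i] = out[i][:-1] + "2"
        # Rule 2: 一 (yi1) becomes yi2 before tone 4, yi4 before tones 1/2/3.
        if hanzi[i] == "一" and out[i] == "yi1":
            out[i] = "yi2" if next_tone == "4" else "yi4"
    return out

print(apply_sandhi(["wo3", "xie3"], "我写"))                    # ['wo2', 'xie3']
print(apply_sandhi(["yi1", "xin1", "yi1", "yi4"], "一心一意"))  # ['yi4', 'xin1', 'yi2', 'yi4']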

Usage

>>> from g2pc import G2pC
>>> g2p = G2pC()
>>> g2p("一心一意")
# This returns a list of tuples, each of which consists of
# word, pos, pinyin, (tone-changed) descriptive pinyin, English meaning, and equivalent traditional character.
[('一心一意', 
'i', 
'yi1 xin1 yi1 yi4', 
'yi4 xin1 yi2 yi4', 
"/concentrating one's thoughts and efforts/single-minded/bent on/intently/", 
'一心一意')]
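
Since each result is a plain 6-tuple in the order listed above, the fields can be unpacked directly. For example, based on the STEP 4 output shown earlier, joining the descriptive-pinyin field should give:

>>> result = g2p("我写了几行代码。")
>>> " ".join(descriptive for _, _, _, descriptive, _, _ in result)
'wo2 xie3 le5 ji3 hang2 dai4 ma3 。'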

Respectful comparison with other libraries

>>> text1 = "我写了几行代码。" # pay attention to the 行, which should be read as 'hang2', not 'xing2'
>>> text2 = "来不了" # pay attention to the 了, which should be read as 'liao3', not 'le'
# python-pinyin
$ pip install pypinyin
>>> from pypinyin import pinyin
>>> pinyin(text1)
[['wǒ'], ['xiě'], ['le'], ['jǐ'], ['xíng'], ['dài'], ['mǎ'], ['。']]
>>> pinyin(text2)
[['lái'], ['bù'], ['le']]
# xpinyin
$ pip install xpinyin
>>> from xpinyin import Pinyin
>>> p = Pinyin()
>>> p.get_pinyin(text1, tone_marks="numbers")  
'wo3-xie3-le5-ji1-xing2-dai4-ma3-。'
>>> p.get_pinyin(text2, tone_marks="numbers")   
'lai2-bu4-le5'
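
For reference, the same two sentences can be run through g2pC. Judging from the STEP 3 example and the comments above, the CRF should pick 'hang2' for 行 and 'liao3' for 了 (index 2 of each tuple is the context-aware pinyin):

# g2pC
>>> from g2pc import G2pC
>>> g2p = G2pC()
>>> [t[2] for t in g2p(text1)]  # 行 expected as 'hang2'
>>> [t[2] for t in g2p(text2)]  # 了 expected as 'liao3'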

Changelog

0.9.9.3 July 10, 2019

  • Refined the tone change rules.

0.9.9.2 July 10, 2019

  • Refined the cedict.pkl.

0.9.9.1 July 9, 2019

  • Fixed a bug where Chinese characters in names could not be found.

0.9.6 July 7, 2019

  • Fixed a bug where words not found in the dictionary failed to convert.
  • Rearranged the cedict.pkl.
  • Refined the CRF model.
  • Added tone change rules.

0.9.4 July 4, 2019

  • Initial launch

References

If you use our software for research, please cite:

@misc{g2pC2019,
  author = {Park, Kyubyong},
  title = {g2pC},
  year = {2019},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/Kyubyong/g2pC}}
}
