'BigramSplitter' is an add-on search product for Plone 3.x. It supports non-English languages, especially East and Southeast Asian languages.

Project description

Introduction

Specification: The text character normalization process uses Python's unicodedata module. Full-width numeric and alphabetic characters are converted to their half-width equivalents, and half-width Katakana is converted to its full-width equivalent, so all of these character variations are recognized as the same characters.
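For example, a minimal sketch of this kind of normalization (assuming Unicode NFKC normalization via unicodedata; the actual implementation in BigramSplitter may differ) could look like this:

    # -*- coding: utf-8 -*-
    import unicodedata

    def normalize_text(text):
        # NFKC normalization folds full-width digits and Latin letters to
        # their half-width (ASCII) forms, and half-width Katakana to
        # full-width Katakana, so these variants map to the same characters.
        return unicodedata.normalize('NFKC', text)

    # u'Ｐｌｏｎｅ３' -> u'Plone3', u'ｶﾀｶﾅ' -> u'カタカナ'
    print(normalize_text(u'Ｐｌｏｎｅ３ ｶﾀｶﾅ'))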

Language Specifications:

  • Chinese

      • No space between words
      • Text consists only of Kanji (Chinese) characters
      • Processed with a bigram (2-gram) model (see the sketch after this list)

  • Japanese

      • No space between words
      • A combination of Kanji (Chinese), Katakana, and Hiragana characters

  • Korean

      • There are spaces between words, but words may carry attached particles
      • A combination of the Korean alphabet (Hangul) and Kanji (Chinese) characters
      • Hangul and Kanji (Chinese) characters are distinguished, and each run is processed with a bigram (2-gram) model

  • Thai

      • No space between words
      • The language is very difficult to handle on a computer
      • Vowels and consonants are encoded as separate Unicode characters, which makes it difficult to recognize word boundaries
      • However, Thai text can potentially be handled with a bigram (2-gram) model

  • Other languages (including English)

      • Words are separated by spaces
      • Each word is indexed separately
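As a rough illustration of the bigram approach described above (a hypothetical sketch only, not the actual splitter code), text without word boundaries can be indexed as overlapping two-character grams, while space-separated text is indexed word by word:

    # -*- coding: utf-8 -*-
    # Hypothetical illustration of bigram (2-gram) splitting; the real
    # BigramSplitter implementation may differ in detail.
    def bigram_split(word):
        if len(word) <= 2:
            return [word]
        return [word[i:i + 2] for i in range(len(word) - 1)]

    # CJK text with no spaces is split into overlapping 2-grams...
    print(bigram_split(u'東京都庁'))      # [u'東京', u'京都', u'都庁']
    # ...while space-separated languages are simply split on whitespace.
    print(u'open source search'.split())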

Notes:

  • Source Code

    Since no documentation is available on how to develop a 'word splitter', we referred to the source code of other splitters. We still have a number of questions, so if you have any more information, please feel free to let us know.

  • Hotfix to Plone 3.0 source code

    Because the Plone 3.x catalog setting file, catalog.xml, has no mechanism for overwriting an existing index, we developed a hotfix and added an XML attribute (a sketch follows below). We believe the Plone 3 XML definition mechanism is simple and clear, which is why we took this approach. We appreciate any comments.
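For context, a Plone 3 catalog.xml index definition looks roughly like the sketch below; the overwrite attribute shown here is purely hypothetical and only stands in for whatever attribute the hotfix actually adds:

    <!-- Sketch of a GenericSetup catalog.xml index entry (Plone 3).
         The 'overwrite' attribute is hypothetical, standing in for the
         attribute added by the hotfix to replace an existing index. -->
    <object name="portal_catalog">
      <index name="SearchableText" meta_type="ZCTextIndex" overwrite="True">
        <indexed_attr value="SearchableText"/>
        <extra name="index_type" value="Okapi BM25 Rank"/>
        <extra name="lexicon_id" value="plone_lexicon"/>
      </index>
    </object>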

Installation

Use zc.buildout

  • Add Products.BigramSplitter to the list of eggs to install, e.g.:

    [buildout]
    ...
    eggs =
        ...
        Products.BigramSplitter
  • Tell the plone.recipe.zope2instance recipe to install a ZCML slug:

    [instance]
    recipe = plone.recipe.zope2instance
    ...
    zcml =
        Products.BigramSplitter
  • Re-run buildout, e.g. with:

    $ ./bin/buildout
  • Restart Zope

  • Install the product via Plone Site Setup – Add-on Products (Quick Installer)

Old Style

  • Untar the downloaded file, then copy it to the 'Products' directory of your Plone instance.

  • Restart Zope

  • Install the product via Plone Site Setup – Add-on Products (Quick Installer)

Required

  • Plone 3.0.x or higher

License

  • See docs/LICENSE.txt

Author

  • Manabu Terada (e-mail: terada@cmscom.jp)

  • Mikio Hokari

  • Naoki Nakanishi

  • Naotaka Hotta

  • Takashi Nagai

To Do

  • Add re-install mechanism

  • Support more languages

Changelog

1.0a2 (2010-01-29)

  • Fixed full-width space handling for AND search

1.0a1 (2009-12-05)

  • Initial release
