
TinySegmenter
----------

TinySegmenter, a super compact Japanese tokenizer, was originally created by
Taku Kudo in 2008 for JavaScript and released under the terms of a new BSD license.
For details, see [here](http://lilyx.net/pages/tinysegmenter_licence.txt).

TinySegmenter for Python 2.x was written by Masato Hagiwara.
For more information, see [here](http://lilyx.net/pages/tinysegmenterp.html).

This version of TinySegmenter was modified by Tatsuro Yasukawa for distribution, and supports both Python 2.x and Python 3.x.

See more about [tinysegmenter](https://github.com/SamuraiT/tinysegmenter).

Installation
------------

```
pip install tinysegmenter3
```

Usage
----------

```
from tinysegmenter import TinySegmenter
segmenter = TinySegmenter()
statement = '私はpython大好きStanding Engineerです.'
tokenized_statement = segmenter.tokenize(statement)
print(tokenized_statement)
# ['私', 'は', 'python', '大好き', 'Standing', ' Engineer', 'です', '.']
```
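
Since `tokenize()` returns a plain list of strings, it composes directly with the standard library. Below is a minimal sketch of counting token frequencies, assuming only the `tokenize()` method shown above; the sample text is illustrative, not part of the package.

```
from collections import Counter

from tinysegmenter import TinySegmenter

segmenter = TinySegmenter()

# Illustrative sample text (not from the package itself).
text = '猫が好きです。犬も好きです。'

# tokenize() returns a list of token strings.
tokens = segmenter.tokenize(text)

# Count the three most common tokens.
print(Counter(tokens).most_common(3))
```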

Release history
---------------

- 0.1.0 (this version)
- 0.0.3
- 0.0.2

