A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
Project description
CTH_sentence_split
Info
This program splits a sentence using dictionaries for Chinese (Traditional), Taiwanese, and Hakka, and returns the result as a list.
LICENSE
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
Install
Install with:

```shell
pip3 install -U CTH_sentence_split
```
Use
```python
import CTH_sentence_split as sp

sp.split("sentence you want to split")
```
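Since the package describes itself as dictionary-based, its behavior can be illustrated with a forward-maximum-matching sketch. This is a hypothetical, self-contained example of the general technique, not the library's actual implementation; the `dictionary` and `max_word_len` parameters are assumptions for illustration:

```python
# Hypothetical sketch of dictionary-based segmentation using forward
# maximum matching -- NOT the actual CTH_sentence_split implementation.
def split(sentence, dictionary, max_word_len=4):
    """Split `sentence` into a list, preferring the longest dictionary match.

    Characters not covered by the dictionary are emitted one at a time.
    """
    result = []
    i = 0
    while i < len(sentence):
        # Try the longest candidate first, shrinking until a match is found;
        # a single character always "matches" as a fallback.
        for length in range(min(max_word_len, len(sentence) - i), 0, -1):
            word = sentence[i:i + length]
            if length == 1 or word in dictionary:
                result.append(word)
                i += length
                break
    return result

# With a toy dictionary, "今天天氣真好" splits into ["今天", "天氣", "真好"].
words = split("今天天氣真好", {"今天", "天氣", "真好"})
```

A longest-match-first scan is a common baseline for languages written without word delimiters; real segmenters typically refine it with statistics or larger dictionaries.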
Download files
Download the file for your platform.
Hashes for CTH_sentence_split-0.0.14.tar.gz (source distribution)

Algorithm | Hash digest
---|---
SHA256 | f8e21158b52107346a5d83021bc1e8f117653ea604880d8b2e45580dde7c43b6
MD5 | 60c11b6e2c504cfb690d47c487fc2244
BLAKE2b-256 | 753d11142edaa37ca983e75bbd8305fe216980c46e87d2c920eed9535426aaab
Hashes for CTH_sentence_split-0.0.14-py3-none-any.whl (built distribution)

Algorithm | Hash digest
---|---
SHA256 | c528d96e37156933fd55948277a0d600fa429068ebf534a5a52f78765fb02cee
MD5 | 41cbab3f026b5586067aefc83ba88a37
BLAKE2b-256 | 2cc6e66cc6656e33a599cfa9ec62ca7b5765a7ecf32947884084ea1dbb65abf4