A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
Project description
CTH_sentence_split
Info
This program splits a sentence using Chinese (Traditional), Taiwanese, and Hakka dictionaries, and returns the result as a list.
LICENSE
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
Install
Install with `pip3 install -U CTH_sentence_split`
Use
import CTH_sentence_split as sp
sp.split("sentence you want to split")
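To illustrate what dictionary-based splitting generally does, here is a minimal, self-contained sketch of a greedy longest-match segmenter. This is an assumption for illustration only, not the library's actual algorithm, and the `split` function and toy dictionary below are hypothetical:

```python
def split(sentence, dictionary):
    """Split `sentence` into a list of longest dictionary matches.

    Hypothetical sketch of greedy longest-match segmentation;
    not the actual CTH_sentence_split implementation.
    """
    tokens = []
    i = 0
    while i < len(sentence):
        # Try the longest possible match first; fall back to one character.
        for length in range(len(sentence) - i, 0, -1):
            word = sentence[i:i + length]
            if length == 1 or word in dictionary:
                tokens.append(word)
                i += length
                break
    return tokens

# Toy example with a small Traditional Chinese dictionary.
words = {"今天", "天氣", "很好"}
print(split("今天天氣很好", words))  # → ['今天', '天氣', '很好']
```

The real library ships its own dictionaries, so callers only pass the sentence, as shown in the snippet above.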
Project details
Release history
Download files
Download the file for your platform.
Source Distribution
Built Distribution
Hashes for CTH_sentence_split-0.0.15.tar.gz

Algorithm | Hash digest
---|---
SHA256 | 9ed2464895061d91eb7afebc486a0e237844e24efa15f25cd7447e34dda4a44d
MD5 | 29e58a4c9993f3decc4ba88865805283
BLAKE2b-256 | 0aa4dad50281343e860bc751e9b0c4bbac10c2b43bb8770009d3cb9276ba3a25
Hashes for CTH_sentence_split-0.0.15-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 022012766542d3a001464c26bef9a6a68d0390b947e437a6e26367d4346f815d
MD5 | 9f6342917650ae5a293783a62b725e3f
BLAKE2b-256 | 9305a667a0063ff38967be4b1a64823d652c306c49ed71a2071134678375bdc8