A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
CTH_sentence_split
Info
This program splits a sentence using Chinese (Traditional), Taiwanese, and Hakka dictionaries and returns the resulting segments as a list.
LICENSE
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
Install
Install with pip3 install -U CTH_sentence_split
Use
import CTH_sentence_split as sp
sp.split("sentence you want to split")
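The package's internals aren't documented here, but dictionary-based splitters of this kind commonly use forward maximum matching: at each position, take the longest word found in the dictionary and fall back to a single character when nothing matches. A minimal sketch with a small hypothetical dictionary (the real package's word lists and algorithm may differ):

```python
# Toy forward-maximum-matching segmenter.
# DICT is a made-up sample; CTH_sentence_split ships its own dictionaries.
DICT = {"我們", "今天", "天氣", "很", "好"}
MAX_LEN = max(len(w) for w in DICT)

def split(sentence):
    """Greedily match the longest dictionary word at each position."""
    result, i = [], 0
    while i < len(sentence):
        # Try the longest candidate first, shrinking down to one character.
        for length in range(min(MAX_LEN, len(sentence) - i), 0, -1):
            word = sentence[i:i + length]
            if word in DICT or length == 1:  # unknown single chars pass through
                result.append(word)
                i += length
                break
    return result

print(split("我們今天天氣很好"))  # → ['我們', '今天', '天氣', '很', '好']
```

Because matching is greedy from the left, longer dictionary entries always win over their single-character pieces, which is why "今天" comes out as one segment rather than "今" and "天".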
Hashes for CTH_sentence_split-0.0.13.tar.gz (source distribution)

Algorithm | Hash digest
---|---
SHA256 | 726569ccbebdd43bcff5b3bd1e7da6f082605b1f3cc27c8da7468895b5d65334
MD5 | 5d01690ee4e58bdc45a5b5e14502cf8e
BLAKE2b-256 | c1968639c79f3db0b8ab1e92ecd615fa802c16b5a8e30bb5b438fc1f2ef83ea1
Hashes for CTH_sentence_split-0.0.13-py3-none-any.whl (built distribution)

Algorithm | Hash digest
---|---
SHA256 | c657d24b63495541e2a30e9b7b888aa6ca4bc61a5a76209021f954db6f28b1b5
MD5 | 7c5a4dc3934a02995b6b6427ec331c24
BLAKE2b-256 | 0b9bc94e2085533aefd292611743f6b32cc4ef69d79c370a69525194bcded3c9