A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
CTH_sentence_split
Info
This program splits a sentence using Chinese (Traditional), Taiwanese, and Hakka dictionaries, and returns the result as a list.
LICENSE
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
Install
Install with pip3 install -U CTH_sentence_split
Use
import CTH_sentence_split as sp
sp.split("sentence you want to split")
Download files
Download the file for your platform.
Source Distribution
Built Distribution
Hashes for CTH_sentence_split-0.0.11.tar.gz

Algorithm | Hash digest
---|---
SHA256 | e592d0ccc8367d2d0ad60f8797968270270455dc7e277a1672336773abb2cffd
MD5 | 8d9a9403969eba6920b22c756380af1a
BLAKE2b-256 | 05f54c2f31c986ccd3675580d134bb5d8f8c2f03513a9b383b9aad528a65fa25
Hashes for CTH_sentence_split-0.0.11-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 43d2d297faa5de2c0322751e5686a59f9be47805a7ee433f3450d48a88efe845
MD5 | ba3ce0684307291ea186c24576abc7a4
BLAKE2b-256 | db45e45ad7b9da5291caef30b0d8b71837ebba82d4a9f98682019bfbb2b65e08