A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
CTH_sentence_split
Info
This program splits a sentence using dictionaries for Chinese (Traditional), Taiwanese, and Hakka, and returns the resulting tokens as a list.
LICENSE
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
Install
Install with `pip3 install -U CTH_sentence_split`
Use
import CTH_sentence_split as sp
sp.split("sentence you want to split")
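Dictionary-based splitters of this kind commonly use greedy longest-match segmentation: at each position, take the longest dictionary entry that matches, or fall back to a single character. The sketch below illustrates that idea with a tiny hypothetical dictionary; it is not the library's actual implementation, and the entries in `DICT` are made up for the example.

```python
# Illustrative greedy longest-match segmentation (NOT the library's internals).
DICT = {"你好", "世界", "台灣"}  # hypothetical dictionary entries
MAX_LEN = max(len(w) for w in DICT)

def split(sentence):
    """Return a list of tokens via greedy longest-match against DICT."""
    tokens = []
    i = 0
    while i < len(sentence):
        # Try the longest candidate first; a single character always matches.
        for length in range(min(MAX_LEN, len(sentence) - i), 0, -1):
            piece = sentence[i:i + length]
            if length == 1 or piece in DICT:
                tokens.append(piece)
                i += length
                break
    return tokens
```

For example, `split("你好世界")` yields `["你好", "世界"]`, while a character absent from the dictionary comes back as its own token.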
Download files
Source Distribution
Built Distribution
Hashes for CTH_sentence_split-0.0.22.tar.gz
Algorithm | Hash digest
---|---
SHA256 | 4b6fe49fa925142965ecd14d89c144f050c1e0546313efbd6a99f75e67334de0
MD5 | 1b2aed974d2f0f31d8914012e3bfc2ae
BLAKE2b-256 | 55d9e5b9654666f7420850418d5e43ec0d907f59a6bf51eaef34dbebe7aa1468
Hashes for CTH_sentence_split-0.0.22-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | f460738ee9d1ca9225130c64991dee69c099306f3d647dff5d9096e12a985726
MD5 | e3ed76a5fc279b751d2b5d813fd70f60
BLAKE2b-256 | 451996cc9c6a0229a3d4c1058abef6e93173c8a1a96bcb8a66c83be774549eb1