A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
Project description
CTH_sentence_split
Info
This program splits a sentence using Chinese (Traditional), Taiwanese, and Hakka dictionaries. It returns a list of the resulting segments.
LICENSE
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
Install
Install with pip3 install CTH_sentence_split
Use
import CTH_sentence_split as sp
sp.split("sentence you want to split")
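Dictionary-based segmenters for Chinese-family languages commonly use greedy forward longest-match. The sketch below is a minimal, self-contained illustration of that technique, not the package's actual implementation; the function name, the tiny inline dictionary, and the `max_word_len` parameter are all assumptions for the example.

```python
def split(sentence, dictionary, max_word_len=4):
    """Greedy forward longest-match segmentation against a word list.

    At each position, try the longest candidate substring first;
    if no dictionary word matches, fall back to a single character.
    """
    words = set(dictionary)
    result = []
    i = 0
    while i < len(sentence):
        for length in range(min(max_word_len, len(sentence) - i), 0, -1):
            candidate = sentence[i:i + length]
            if length == 1 or candidate in words:
                result.append(candidate)
                i += length
                break
    return result

# Example with a made-up two-entry dictionary:
# split("我愛台灣", ["台灣", "我"]) → ["我", "愛", "台灣"]
```

Real segmenters layer heuristics (frequency weighting, backward matching) on top of this, but the longest-match loop is the core idea.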
Download files
Download the file for your platform.
Source Distribution
Built Distribution
Hashes for CTH_sentence_split-0.0.10.tar.gz

Algorithm | Hash digest
---|---
SHA256 | a161b41319073a9a4229b1e7dee1274381e488d55f196873a14aef38c778d0bb
MD5 | 0faf216da6a2e9c90f58279279884fe1
BLAKE2b-256 | dd59efa643634df899102f8ac0d43c77fd9e319e7ccb7100a3558c6b70d1d558
Hashes for CTH_sentence_split-0.0.10-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | c37bcf1b870e329a81cab5bd8aadff5494b5c2d0f985c3c7619e39fe60d20874
MD5 | d2aa849e9c1783cea6020489b8493160
BLAKE2b-256 | 62bf132aabcdf60d4e7f48dc0c48f4b9d1f771ad6bd0e801dc55f5d25139f12d