A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
CTH_sentence_split
Info
This program splits a sentence using Chinese (Traditional), Taiwanese, and Hakka dictionaries. It returns a list of the resulting segments.
LICENSE
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
Install
Install with pip3 install CTH_sentence_split
Use
import CTH_sentence_split as sp
sp.split("sentence you want to split")
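The project page does not document the algorithm, but dictionary-based segmentation of this kind is commonly implemented as forward maximum matching: at each position, take the longest dictionary entry that matches, falling back to a single character. The sketch below illustrates that idea with a hypothetical `split` function and a toy dictionary; the package's actual algorithm and dictionaries may differ.

```python
def split(sentence, dictionary):
    """Segment `sentence` by greedy longest-match lookup (a sketch,
    not the package's actual implementation)."""
    max_len = max((len(w) for w in dictionary), default=1)
    result = []
    i = 0
    while i < len(sentence):
        # Try the longest possible dictionary entry first.
        for size in range(min(max_len, len(sentence) - i), 0, -1):
            candidate = sentence[i:i + size]
            # A single character is always accepted as a fallback.
            if size == 1 or candidate in dictionary:
                result.append(candidate)
                i += size
                break
    return result

print(split("今天天氣真好", {"今天", "天氣", "真好"}))
# → ['今天', '天氣', '真好']
```

Greedy longest-match is simple and fast, but it cannot resolve genuine ambiguities; dictionary coverage determines the quality of the segmentation.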
Project details
Download files
Source Distribution
CTH_sentence_split-0.0.4b0.tar.gz (17.1 kB)
Built Distribution
CTH_sentence_split-0.0.4b0-py3-none-any.whl
Hashes for CTH_sentence_split-0.0.4b0.tar.gz

Algorithm | Hash digest
---|---
SHA256 | a02bf44a408eea46a0f06ff9cfd6d77b37f299d83cbc5de53ebd5ce358c709a3
MD5 | 9f334404ad7f9a8bb806e33ce9c9c6aa
BLAKE2b-256 | 62d412ad2c49e39561bc2c502eb4d59bdefbb635ef04fe26042472bf29dae902
Hashes for CTH_sentence_split-0.0.4b0-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 8db167a651cf275f064e4c68598ef3e432e3e7e4e7c21715a3976960cdad5e51
MD5 | be90eb24b61c16032924f93678ae8d7d
BLAKE2b-256 | b8fb8bef61fc28ccbc31c388728508f72969509b9567c909e55c46120c7db610