A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
Project description
CTH_sentence_split
Info
This program splits a sentence using Chinese (Traditional), Taiwanese, and Hakka dictionaries and returns the segments as a list.
LICENSE
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
Install
Install with `pip3 install -U CTH_sentence_split`
Use
```python
import CTH_sentence_split as sp
sp.split("sentence you want to split")
```
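The package documents only that splitting is dictionary-based, so as an illustration of the general idea, here is a minimal sketch of forward maximum matching, a common dictionary-based segmentation technique. The `split` function and `demo_dict` below are hypothetical and not part of the CTH_sentence_split API.

```python
# Sketch of dictionary-based segmentation via forward maximum matching.
# The dictionary here is a hypothetical stand-in; CTH_sentence_split
# ships its own Chinese (Traditional), Taiwanese, and Hakka dictionaries.

def split(sentence, dictionary):
    """Greedily match the longest dictionary entry at each position."""
    max_len = max(len(word) for word in dictionary)
    segments = []
    i = 0
    while i < len(sentence):
        # Try the longest candidate first; fall back to a single character.
        for length in range(min(max_len, len(sentence) - i), 0, -1):
            word = sentence[i:i + length]
            if length == 1 or word in dictionary:
                segments.append(word)
                i += length
                break
    return segments

demo_dict = {"你好", "世界"}  # hypothetical dictionary entries
print(split("你好世界", demo_dict))  # → ['你好', '世界']
```

At each position the matcher prefers the longest dictionary entry, so characters that form a known word stay together, and unknown characters fall out as single-character segments.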
Project details
Source Distribution
Built Distribution
Hashes for CTH_sentence_split-0.0.16.tar.gz

Algorithm | Hash digest
---|---
SHA256 | cf5e87a2d51cc31c74adaa859b8b14587d5f42650787d52a05ece086e03605cd
MD5 | 7f283a6d59d291bd15124e7688ce07f0
BLAKE2b-256 | 7b72d825743890db6d9c259fb83afecf1430c7421b090fe5baa6266e76495d27
Hashes for CTH_sentence_split-0.0.16-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 5a4aa5a24ecf437414dc3b9f5adbcec0a3b29e95ce458351ec4c4e24774c7001
MD5 | ca0bbe07e27184b4ad9f11bf60fb2fb8
BLAKE2b-256 | 0529ac232b98261436ce297c9b4298064a35968bacc210bd93a4f2c57732a773