A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
CTH_sentence_split
Info
This program splits a sentence using Chinese (Traditional), Taiwanese, and Hakka dictionaries, and returns the resulting tokens as a list.
LICENSE
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
Install
Install with pip3 install -U CTH_sentence_split
Use
import CTH_sentence_split as sp
sp.split("sentence you want to split")
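The package's internals aren't documented here, but dictionary-based splitters typically use forward longest-match segmentation. The sketch below illustrates that idea with a hypothetical toy dictionary; the real package ships its own Chinese (Traditional), Taiwanese, and Hakka data, and its actual algorithm may differ.

```python
# Minimal sketch of dictionary-based splitting via forward longest match.
# `demo_dictionary` is hypothetical, purely for illustration.

def split(sentence, dictionary):
    """Split `sentence` into a list of tokens, preferring the longest
    dictionary entry at each position; characters not covered by any
    entry become single-character tokens."""
    longest = max((len(word) for word in dictionary), default=1)
    tokens, i = [], 0
    while i < len(sentence):
        # Try the longest possible match first, shrinking down to 1 char.
        for size in range(min(longest, len(sentence) - i), 0, -1):
            candidate = sentence[i:i + size]
            if size == 1 or candidate in dictionary:
                tokens.append(candidate)
                i += size
                break
    return tokens

demo_dictionary = {"我們", "喜歡", "學習", "語言"}
print(split("我們喜歡學習語言學", demo_dictionary))
# ['我們', '喜歡', '學習', '語言', '學']
```

Like `sp.split`, this returns a plain Python list of token strings.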
Download files
Hashes for CTH_sentence_split-0.0.21.tar.gz

Algorithm | Hash digest
---|---
SHA256 | 57cff651f4bdb07ad35fe52c184c5c033d08f92d0baaf5ab022f2acabb00e929
MD5 | d70f25315f558cf4c9c707c7ce8451be
BLAKE2b-256 | 9abb9ab829a3a01d18c661c7940ebd0e7ac38d926ed7c0f17cd7ff9407791113
Hashes for CTH_sentence_split-0.0.21-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | a15f1a88f1bd3c0d02d6d72444d19bfeada8647867f63979de222acce5a0f747
MD5 | 0dbec8cbecef40f8878169340c5bc05f
BLAKE2b-256 | b2193e94b66c5985a1bea17ff3e47b2f7620bc327b8367d1fd6a971afd1a000b