A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
Project description
CTH_sentence_split
Info
This program splits a sentence using Chinese (Traditional), Taiwanese, and Hakka dictionaries and returns the segments as a list.
LICENSE
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
Install
Install with pip3 install -U CTH_sentence_split
Use
import CTH_sentence_split as sp
sp.split("sentence you want to split")
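The package does not document its segmentation algorithm, but a common approach for dictionary-based splitting is greedy longest match. Below is a minimal sketch of that idea using a toy dictionary; the `split` function and `toy_dict` here are illustrative stand-ins, not the library's internals:

```python
# Hedged sketch: greedy longest-match segmentation against a toy dictionary.
# CTH_sentence_split's actual algorithm and dictionaries may differ.

def split(sentence, dictionary):
    """Split a sentence into a list of segments by longest dictionary match."""
    tokens = []
    i = 0
    max_len = max(map(len, dictionary))  # longest entry bounds the lookahead
    while i < len(sentence):
        # Try the longest candidate first, shrinking down to one character.
        for length in range(min(max_len, len(sentence) - i), 0, -1):
            candidate = sentence[i:i + length]
            if length == 1 or candidate in dictionary:
                tokens.append(candidate)  # unmatched characters fall back to length 1
                i += length
                break
    return tokens

toy_dict = {"今天", "天氣", "真好"}
print(split("今天天氣真好", toy_dict))  # ['今天', '天氣', '真好']
```

A real implementation would load the three languages' dictionaries and may resolve ambiguities differently (e.g. by word frequency), but the list-of-segments return shape matches what `sp.split` is documented to produce.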
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
Hashes for CTH_sentence_split-0.0.24.tar.gz

Algorithm | Hash digest
---|---
SHA256 | fb54c115292880dbf0400ab667ba9811130325889b9bf1a737142da5318c3add
MD5 | 56846176e1d08ca77e92d1c188076fe1
BLAKE2b-256 | b98cb55da20f545b3ebbaf88c4418fae379554c33edf796cca7704e09e476c5b
Hashes for CTH_sentence_split-0.0.24-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | cdd2ba2c3a8450e47a7ba7ab785ed7e0f1e5bb98782fe37a55c6c7af93d89cad
MD5 | 8bc9683f4cb0f2a39d924ef9aeb3e121
BLAKE2b-256 | c63da35075ea9e8ce1b582c54b164fe0d79e1952bb907eaca43b04fc8a095939