
A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.

Project description




This program splits a sentence using Chinese (Traditional), Taiwanese, and Hakka dictionaries and returns the segments as a list.
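Dictionary-based splitters of this kind commonly use greedy longest-match segmentation. The sketch below illustrates that general idea only; the toy dictionary and the matching strategy are assumptions for illustration, not the package's actual implementation or data:

```python
def greedy_split(sentence, dictionary):
    """Split `sentence` into a list of dictionary words, preferring the
    longest match at each position and falling back to single characters
    for text not found in the dictionary."""
    max_len = max(map(len, dictionary), default=1)
    segments = []
    i = 0
    while i < len(sentence):
        # Try the longest possible dictionary match starting at position i.
        for length in range(min(max_len, len(sentence) - i), 0, -1):
            word = sentence[i:i + length]
            if length == 1 or word in dictionary:
                segments.append(word)
                i += length
                break
    return segments

# Hypothetical toy dictionary, not the package's shipped data.
words = {"台灣", "客家", "語言"}
print(greedy_split("台灣客家語言好", words))  # → ['台灣', '客家', '語言', '好']
```

Unknown characters (like 好 above) come back as single-character segments, so the output always covers the whole input.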


This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see


Install with pip3 install -U CTH_sentence_split


import CTH_sentence_split as sp

# Returns the segments as a list.
segments = sp.split("sentence you want to split")

Project details

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

CTH_sentence_split-0.0.24.tar.gz (5.4 kB)

Uploaded: source

Built Distribution

CTH_sentence_split-0.0.24-py3-none-any.whl (18.6 kB)

Uploaded: py3
