A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
This program splits a sentence using Chinese (Traditional), Taiwanese, and Hakka dictionaries, and returns the result as a list.
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
pip3 install -U CTH_sentence_split
import CTH_sentence_split as sp
sp.split("sentence you want to split")
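The package's internals are not shown here, but dictionary-based splitting of this kind is commonly implemented as forward maximum matching: at each position, take the longest dictionary entry that matches, falling back to a single character. The sketch below illustrates that idea with a toy dictionary; the `split` helper and the word set are illustrative assumptions, not the package's actual API or data.

```python
def split(sentence, dictionary):
    """Forward maximum matching: greedily take the longest
    dictionary word at each position, else a single character."""
    tokens = []
    i = 0
    while i < len(sentence):
        # Try candidate substrings from longest to shortest.
        for j in range(len(sentence), i, -1):
            if sentence[i:j] in dictionary or j == i + 1:
                tokens.append(sentence[i:j])
                i = j
                break
    return tokens

words = {"你好", "台灣"}
print(split("你好台灣", words))  # → ['你好', '台灣']
```

Characters not covered by the dictionary come out as single-character tokens, which is the usual fallback for unknown words in this scheme.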
|Filename, size|File type|Python version|
|CTH_sentence_split-0.0.24.tar.gz (5.4 kB)|Source|None|
|CTH_sentence_split-0.0.24-py3-none-any.whl (18.6 kB)|Wheel|py3|