A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
Project description
CTH_sentence_split
Info
This program splits a sentence using Chinese (Traditional), Taiwanese, and Hakka dictionaries, and returns the result as a list.
LICENSE
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
Install
Install with `pip3 install -U CTH_sentence_split`
Use
```python
import CTH_sentence_split as sp
sp.split("sentence you want to split")
```
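The page does not document how the segmentation itself works. Dictionary-based splitters commonly use greedy forward longest-match segmentation; the sketch below illustrates that general technique only and is not the library's actual implementation (the `dictionary` and `max_len` parameters here are illustrative assumptions):

```python
def split(sentence, dictionary, max_len=4):
    """Greedy forward longest-match segmentation (illustrative sketch,
    not the real CTH_sentence_split algorithm)."""
    tokens = []
    i = 0
    while i < len(sentence):
        # Try the longest candidate first, shrinking until a dictionary
        # entry matches; fall back to a single character.
        for length in range(min(max_len, len(sentence) - i), 0, -1):
            piece = sentence[i:i + length]
            if length == 1 or piece in dictionary:
                tokens.append(piece)
                i += length
                break
    return tokens
```

With a toy dictionary `{"你好", "世界"}`, this sketch would split `"你好世界"` into `["你好", "世界"]`.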
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
Hashes for CTH_sentence_split-0.0.19.tar.gz

Algorithm | Hash digest
---|---
SHA256 | 31f9a4fbd517e2d3aa1b4c34de6d7b12eee47f9959fe1f4a7022fe74e3818c67
MD5 | 27946b459396e2d521a34cf7bf0e36c1
BLAKE2b-256 | c1f6b74bce1664d5d071b9c2d87fa964f8e061e59ba57d815349d6b946e1d674
Hashes for CTH_sentence_split-0.0.19-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | e3ea9b76badfc5bc9d3ea9c8b83162c20973e56a2028b6ae74a6344a4613f6f4
MD5 | aeab44591081fe132e473b95ef937857
BLAKE2b-256 | 9c362e26df21019f120ca870d0d983c6267d319b1db6449def027a2c0a21e40a