A sentence-splitting tool for Chinese (Traditional), Taiwanese, and Hakka.
CTH_sentence_split
Info
This program splits a sentence using dictionaries for Chinese (Traditional), Taiwanese, and Hakka, and returns the result as a list of words.
LICENSE
This program is licensed under the GNU AGPLv3 or later.
You should have received a copy of the GNU Affero General Public License v3.0 along with this program.
If not, see https://www.gnu.org/licenses/agpl-3.0-standalone.html.
Install
Install with pip3 install -U CTH_sentence_split
Use
import CTH_sentence_split as sp
words = sp.split("sentence you want to split")  # returns a list of words
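The package's internal algorithm is not documented here, but dictionary-based splitters of this kind commonly use forward maximum matching: at each position, take the longest dictionary entry that matches, falling back to a single character. A minimal self-contained sketch (the `split` function and sample dictionary below are illustrative, not the library's actual code):

```python
def split(sentence, dictionary):
    """Hypothetical sketch: greedy forward maximum matching
    against a word dictionary, falling back to single characters."""
    max_len = max(map(len, dictionary), default=1)
    tokens = []
    i = 0
    while i < len(sentence):
        # Try the longest possible match first, shrinking down to 1 char.
        for length in range(min(max_len, len(sentence) - i), 0, -1):
            candidate = sentence[i:i + length]
            if length == 1 or candidate in dictionary:
                tokens.append(candidate)
                i += length
                break
    return tokens

words = {"今天", "天氣", "真好"}
print(split("今天天氣真好", words))  # → ['今天', '天氣', '真好']
```

Characters not covered by any dictionary entry come back as single-character tokens, which matches the "returns a list" behavior described above.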
Download files
Hashes for CTH_sentence_split-0.0.20.tar.gz

Algorithm | Hash digest
---|---
SHA256 | b10e91821be422c2fc958c30da55401de2d55a7b14f7dcff6196df167b73e668
MD5 | 9b361425c41a533a3a84c68a5699b50d
BLAKE2b-256 | da3b1d2cadf2e786eb1d5f776362dc18e542f54ce65437a0803db83148d9758a
Hashes for CTH_sentence_split-0.0.20-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 15050d1f8fcd5bef80efc3439ec1655ba0a20f90e776dd83a8322ff6d2bbf03e
MD5 | 81dba431e40767f54dc6b75ffc4f4944
BLAKE2b-256 | f2957161065f4c7dcf24631bbee4faf62ced72987a4c186b4514259f077aed64