Yoctol Natural Language Tokenizer
Project description
# tokenizer-hub
Yoctol's 乂卍oO badass tokenizer Oo卍乂
Tokenizers share the same interface as Jieba:
```python
from tokenizer_hub import XXX_tokenizer

tokenizer = XXX_tokenizer()
tokenizer.lcut('我来到北京清华大学')
# > ['我', '来到', '北京', '清华大学']
```
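For reference, `XXX_tokenizer` is a placeholder for whichever concrete tokenizer class you import from `tokenizer_hub`. The sketch below runs the same sentence through Jieba itself (assuming `jieba` is installed), which returns the same kind of token list:

```python
# Minimal comparison sketch: Jieba's lcut returns a plain list of tokens,
# which is the interface the tokenizer_hub classes mirror.
import jieba

tokens = jieba.lcut('我来到北京清华大学')
print(tokens)  # ['我', '来到', '北京', '清华大学']
```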
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
tokenizer-hub-0.0.1.tar.gz (7.1 kB)
Built Distribution
tokenizer_hub-0.0.1-py3-none-any.whl (13.9 kB)
File details
Details for the file tokenizer-hub-0.0.1.tar.gz.
File metadata
- Download URL: tokenizer-hub-0.0.1.tar.gz
- Upload date:
- Size: 7.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.0.0 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/3.5.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 3c82fb39074b93324ef95bd6b642976558bf85aa0ef802da3f3897eecbf56e79 |
| MD5 | 06a4e8e879abd72b9fff5614c397ed85 |
| BLAKE2b-256 | 45c244cf09f4f98f949cd472d694df079d3c092999536b2fbb9d95efe06fec68 |
File details
Details for the file tokenizer_hub-0.0.1-py3-none-any.whl.
File metadata
- Download URL: tokenizer_hub-0.0.1-py3-none-any.whl
- Upload date:
- Size: 13.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.0.0 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/3.5.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | d22b00d0890736d21b983b201ba4b25532c982fdb64bf610281ede7802872812 |
| MD5 | 60c7e39ed4ded7ad29f08410fa32d87f |
| BLAKE2b-256 | ac7b09978044a1f55d4ba74068473c57399d8d3ca11d08698dcd988d835f77ca |
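To check a manually downloaded archive against the digests listed above, a minimal `hashlib` sketch can be used; the local filename below is an assumption, so adjust the path to wherever the file was saved:

```python
import hashlib

# SHA256 digest listed above for the sdist, tokenizer-hub-0.0.1.tar.gz.
expected = "3c82fb39074b93324ef95bd6b642976558bf85aa0ef802da3f3897eecbf56e79"

# Assumes the archive sits in the current directory under its original name.
with open("tokenizer-hub-0.0.1.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else f"hash mismatch: {digest}")
```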