Yoctol Natural Language Tokenizer
# tokenizer-hub
yoctol 乂卍oO badass tokenizer Oo卍乂
Tokenizers share the same interface as Jieba:

```python
from tokenizer_hub import XXX_tokenizer

tokenizer = XXX_tokenizer()
tokenizer.lcut('我来到北京清华大学')
> ['我', '来到', '北京', '清华大学']
```
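To illustrate the interface contract, here is a minimal sketch of a drop-in tokenizer exposing the same Jieba-style `lcut` method. The class name and splitting logic are hypothetical and not part of tokenizer-hub; real tokenizers segment Chinese text rather than splitting on whitespace:

```python
class WhitespaceTokenizer:
    """Hypothetical tokenizer with a Jieba-style lcut interface.

    tokenizer-hub tokenizers implement the same method: given a
    sentence, lcut returns a list of token strings.
    """

    def lcut(self, sentence: str) -> list:
        # Trivial whitespace split for illustration only; a real
        # tokenizer performs Chinese word segmentation.
        return sentence.split()


tokenizer = WhitespaceTokenizer()
print(tokenizer.lcut('hello world'))  # ['hello', 'world']
```

Any object implementing `lcut` this way can be swapped in wherever a Jieba-like tokenizer is expected.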
## Download files
### Source Distribution

tokenizer-hub-0.0.1.tar.gz (7.1 kB)
### Built Distribution

Hashes for tokenizer_hub-0.0.1-py3-none-any.whl:
| Algorithm | Hash digest |
|---|---|
| SHA256 | d22b00d0890736d21b983b201ba4b25532c982fdb64bf610281ede7802872812 |
| MD5 | 60c7e39ed4ded7ad29f08410fa32d87f |
| BLAKE2b-256 | ac7b09978044a1f55d4ba74068473c57399d8d3ca11d08698dcd988d835f77ca |