Chinese Word Segmentation Utilities
Project description
My place is pretty big, you're all welcome to come over and hang out.
new-words-detection
"New word discovery": for now this only implements lexicon building with the left/right entropy and mutual information method described in Su Shen's (Su Jianlin's) article; other algorithms will be added later as time permits.
For details, see 《新词发现的信息熵方法与实现》 ("The Information-Entropy Method for New Word Discovery and Its Implementation").
See README.md for the full documentation.
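The entropy-plus-mutual-information idea above can be sketched in plain Python: a character n-gram is a good word candidate when its parts stick together (high pointwise mutual information across every binary split) and it appears in varied contexts (high entropy of its left and right neighbor characters). This is a minimal illustration of the technique, assuming nothing about this package's actual API; the function name `discover_words` and the threshold defaults are made up for the example.

```python
import math
from collections import Counter, defaultdict

def ngram_counts(text, max_len):
    """Count every character n-gram of length 1..max_len."""
    counts = Counter()
    for n in range(1, max_len + 1):
        for i in range(len(text) - n + 1):
            counts[text[i:i + n]] += 1
    return counts

def entropy(neighbors):
    """Shannon entropy (in nats) of a Counter of neighboring characters."""
    total = sum(neighbors.values())
    return -sum((c / total) * math.log(c / total) for c in neighbors.values())

def discover_words(text, max_len=4, min_count=3, min_pmi=1.0, min_entropy=1.0):
    """Score candidate n-grams by internal cohesion (PMI) and
    boundary freedom (min of left/right neighbor entropy)."""
    counts = ngram_counts(text, max_len)
    total = len(text)
    # Collect the left and right neighbor distribution of each candidate.
    left, right = defaultdict(Counter), defaultdict(Counter)
    for n in range(2, max_len + 1):
        for i in range(len(text) - n + 1):
            w = text[i:i + n]
            if i > 0:
                left[w][text[i - 1]] += 1
            if i + n < total:
                right[w][text[i + n]] += 1
    words = {}
    for w, c in counts.items():
        if len(w) < 2 or c < min_count:
            continue
        # Cohesion: PMI of the worst binary split of w; a low value means
        # the two halves co-occur about as often as chance would predict.
        pmi = min(
            math.log((c / total) /
                     ((counts[w[:k]] / total) * (counts[w[k:]] / total)))
            for k in range(1, len(w))
        )
        # Freedom: a true word should see varied characters on both sides;
        # if either side is fixed, w is probably a fragment of a longer unit.
        if not left[w] or not right[w]:
            continue
        ent = min(entropy(left[w]), entropy(right[w]))
        if pmi >= min_pmi and ent >= min_entropy:
            words[w] = (c, pmi, ent)
    return words
```

For example, in the toy corpus `"abcXabcYabcZabcWabc"` the candidate `abc` recurs with varied neighbors and survives both filters, while the fragments `ab` and `bc` are rejected because one of their boundaries is fixed (zero entropy).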
Download files
Source Distribution: new-words-detection-1.0.2.tar.gz
Built Distribution: new_words_detection-1.0.2-py3-none-any.whl
Hashes for new-words-detection-1.0.2.tar.gz
Algorithm | Hash digest
---|---
SHA256 | cf991dc5ca8314f627afc55e78df1ff380d90360e4d40f7ce0e9cd45627fa19b
MD5 | 83d73f89b7a3face86ccd0611dabd08f
BLAKE2b-256 | c60e687cb8bdec759b4714fc325729cfc94727b47e47a43d4b9ab06883300793
Hashes for new_words_detection-1.0.2-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 4238bbdff1ae509fc3137bd1d29422d1c89d8f56a3ddb48763405c5180ebd8e8
MD5 | 4d803cb9e3c7ba51a48e8e8ddc5142e7
BLAKE2b-256 | 8ff04de37fd49626dc763ec607eeb46e2bdac3df54f6c9f7fa383c3145e6bb07