Chinese Word Segmentation Utilities
Project description
My place is pretty big — you're all welcome to come over and hang out.
new-words-detection
New word discovery ("新词发现"): at present, only the left/right entropy and mutual information method for vocabulary building is implemented, based on the article by Su Jianlin (苏剑林, "苏神"); other algorithms will be added as time permits.
For details, see 《新词发现的信息熵方法与实现》 (the information-entropy approach to new word discovery and its implementation).
See README.md for the full documentation.
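The package's actual API is not shown on this page, so the following is only a minimal sketch of the two scores described above: pointwise mutual information (internal cohesion of a candidate word) and left/right neighbor entropy (freedom of the word's boundaries). All function names here are hypothetical illustrations, not the library's interface.

```python
import math
from collections import Counter

def ngram_counts(text, n):
    """Count all character n-grams of length n in the text."""
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def pmi(word, uni, bi):
    """Pointwise mutual information between the two characters of a bigram.

    High PMI means the characters co-occur far more often than chance,
    suggesting the pair is internally cohesive.
    """
    p_xy = bi[word] / sum(bi.values())
    p_x = uni[word[0]] / sum(uni.values())
    p_y = uni[word[1]] / sum(uni.values())
    return math.log(p_xy / (p_x * p_y))

def entropy(counter):
    """Shannon entropy of a frequency distribution (0.0 if empty)."""
    total = sum(counter.values())
    if total == 0:
        return 0.0
    return -sum(c / total * math.log(c / total) for c in counter.values())

def boundary_entropy(text, word):
    """Left and right neighbor entropy of a candidate word.

    High entropy on both sides means the word appears in many different
    contexts, i.e. its boundaries are free -- a sign of a real word.
    """
    left, right = Counter(), Counter()
    start = text.find(word)
    while start != -1:
        if start > 0:
            left[text[start - 1]] += 1
        end = start + len(word)
        if end < len(text):
            right[text[end]] += 1
        start = text.find(word, start + 1)
    return entropy(left), entropy(right)
```

A candidate string is then accepted as a "new word" when both its PMI and its left/right entropies exceed chosen thresholds; the thresholds themselves are corpus-dependent tuning parameters.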
Project details
Download files
Source Distribution
Built Distribution
Hashes for new-words-detection-1.0.3.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | ce6ebd53260cd0f9bf19709910f50d46c323a3c3ce1553de00f071991d3f6363 |
| MD5 | d43ad3ab2bb4c3b6f8ea682cce531896 |
| BLAKE2b-256 | 4a3dc617575ef221515adf9c64e61a7d5769a79eaaaf1e6c9b5ec1b369cf02da |
Hashes for new_words_detection-1.0.3-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4b3bb3a5fe65f5e1940bd3e4bec111c9149e807b050197f36aa84bbc751476b3 |
| MD5 | 6d8993e714695888b2c747aced23f23d |
| BLAKE2b-256 | b2c04300986a531ec3707f089e2377d29d22cfe73201f977466cec0ef8289892 |