NLP tools
Project description
HelloNLP: NLP with deep learning. Mainly focused on text classification, NER, chatbots, and pre-trained models.
GitHub: https://github.com/hellonlp
HelloNLP
ChineseWordSegmentation: an unsupervised word-segmentation tool based on information entropy.
New segmenter: supervised, via deep learning (also drawing on the multiple segmentation modes of jieba). # under development
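The unsupervised, entropy-based idea can be sketched roughly as follows: a string is a plausible word if the characters appearing on its left and right vary a lot, i.e. both boundary entropies are high. This is a minimal illustration only, not the library's actual implementation; `entropy_scores`, `boundary_entropy`, and the toy corpus are hypothetical names for this sketch.

```python
import math
from collections import Counter

def boundary_entropy(neighbors):
    """Shannon entropy of the characters adjacent to a candidate word.
    High entropy on both sides suggests the candidate stands alone as a word."""
    total = sum(neighbors.values())
    return -sum((c / total) * math.log2(c / total) for c in neighbors.values())

def entropy_scores(corpus, candidate):
    """Count the left/right neighbor characters of `candidate` across the
    corpus and return (left_entropy, right_entropy)."""
    left, right = Counter(), Counter()
    for text in corpus:
        start = text.find(candidate)
        while start != -1:
            if start > 0:
                left[text[start - 1]] += 1
            end = start + len(candidate)
            if end < len(text):
                right[text[end]] += 1
            start = text.find(candidate, start + 1)
    return (boundary_entropy(left) if left else 0.0,
            boundary_entropy(right) if right else 0.0)

corpus = ["HelloNLP会一直坚持开源和贡献",
          "HelloNLP专注于NLP技术",
          "HelloNLP目前只支持python"]
# "HelloNLP" is always sentence-initial here, so its left entropy is 0;
# its three distinct right neighbors give a right entropy of log2(3).
print(entropy_scores(corpus, "HelloNLP"))
```

A real segmenter would combine these boundary entropies with internal cohesion (e.g. pointwise mutual information of the candidate's characters) and a frequency threshold before accepting a candidate as a word.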
Example
Quick start
$ pip3 install hellonlp

>>> from hellonlp.ChineseWordSegmentation import segment_entropy
>>> words = segment_entropy.get_words([
...     "HelloNLP会一直坚持开源和贡献",
...     "HelloNLP专注于NLP技术",
...     "HelloNLP第一版终于发布了,太激动了",
...     "HelloNLP目前支持无监督的分词",
...     "HelloNLP之后还会支持深度学习的分词",
...     "HelloNLP目前只支持python",
... ])
>>> print(words[:10])
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
hellonlp-0.2.20.tar.gz (1.5 MB)
Built Distribution
hellonlp-0.2.20-py2-none-any.whl (13.2 kB)
Hashes for hellonlp-0.2.20-py2-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 786c03463e0528795c1fab095057604f5e2d657b6c0e32b4920cbd0ade08c9d8
MD5 | 96a8641230e922fa5d89615201ee8c26
BLAKE2b-256 | d2fd63a86e7ffdabd6cffe7305af713e07cd76b93d5dedbaa99bec88f91f80ee