Word Segmentation Tool
Python version (a Node version is also available)
Installation
pip install nlp_segmentation
Command line
Sogou segmentation
$ sougou_fenci 武汉市长江大桥
武汉市 n
长江 n
大桥 n
Baidu segmentation
$ baidu_fenci 武汉市长江大桥
武汉 地名
长江大桥 地名
Sogou segmentation code
import nlp_segmentation

resp = nlp_segmentation.sougou("武汉市长江大桥")
row_format = "{} {}"  # token followed by its part-of-speech tag
for result_item in resp.result:
    print(row_format.format(result_item[0], result_item[1]))
# Result (n = noun):
# 武汉市 n
# 长江 n
# 大桥 n
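If only the tokens are needed, they can be pulled out of `resp.result` with a list comprehension. A minimal sketch, assuming each result item is a (token, tag) pair as in the loop above:

import nlp_segmentation

resp = nlp_segmentation.sougou("武汉市长江大桥")
# Each result item holds the token at index 0 and the tag at index 1
tokens = [item[0] for item in resp.result]
print(" / ".join(tokens))  # 武汉市 / 长江 / 大桥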
Baidu segmentation code
import nlp_segmentation

resp = nlp_segmentation.baidu("武汉市长江大桥")
row_format = "{} {}"  # token followed by its tag
for result_item in resp.result:
    print(row_format.format(result_item[0], result_item[1]))
# Result (地名 = place name):
# 武汉市 地名
# 长江大桥 地名
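Since both backends share the same call signature and result shape, a sentence can be run through both for a quick comparison. A short sketch that relies only on the `.result` structure shown above:

import nlp_segmentation

text = "武汉市长江大桥"
# Feed the same sentence to both backends and print each
# segmentation on a single line for side-by-side comparison
for name, segment in [("sougou", nlp_segmentation.sougou),
                      ("baidu", nlp_segmentation.baidu)]:
    resp = segment(text)
    print(name, "->", " | ".join(item[0] for item in resp.result))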
File details
Details for the file nlp_segmentation-1.2.7.tar.gz.
File metadata
- Download URL: nlp_segmentation-1.2.7.tar.gz
- Upload date:
- Size: 2.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: Python-urllib/3.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | d61b19d9dfb628e176e8e76ceecc15a3c6b1b77f606bb0db216376558b228d00
MD5 | 191b8f49f4d065baf375dde5201ed426
BLAKE2b-256 | 5962cdca47dff14cab0e791e09781e281b027d6dc34718a13c857eb437169ce1
File details
Details for the file nlp_segmentation-1.2.7-py3-none-any.whl.
File metadata
- Download URL: nlp_segmentation-1.2.7-py3-none-any.whl
- Upload date:
- Size: 5.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: Python-urllib/3.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | ec82c9fb910fd44e402769bae99327c83c96a2e3d0ed9e8ad484f0e9861588cb
MD5 | d107998d99a84501ef98a75db1512a8c
BLAKE2b-256 | 3cbafb4df1faf3688d66988994947fe6cc3151c715b6a2fa99a10baeb169afb7