Word Segmentation Tool
Python version (a Node version is also available)
Installation
pip install nlp_fenci
Command line
Sogou segmentation
$ sougou_fenci 武汉市长江大桥
武汉市 n
长江 n
大桥 n
Baidu segmentation
$ baidu_fenci 武汉市长江大桥
武汉市长江大桥 地名
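Both commands print one "token tag" pair per line, which makes them easy to drive from another program. Below is a minimal sketch (not part of the package) that shells out to the sougou_fenci command and parses its output into pairs; the segment helper name and the two-column output format are assumptions based on the examples above.

import subprocess

def segment(text, command="sougou_fenci"):
    # Run the CLI and capture its stdout (assumes the command is on PATH).
    output = subprocess.run(
        [command, text], capture_output=True, text=True, check=True
    ).stdout
    # Each line is expected to look like "武汉市 n": token, whitespace, tag.
    pairs = []
    for line in output.splitlines():
        parts = line.split()
        if len(parts) == 2:
            pairs.append((parts[0], parts[1]))
    return pairs

print(segment("武汉市长江大桥"))
# Expected, given the example above: [('武汉市', 'n'), ('长江', 'n'), ('大桥', 'n')]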
sougou segmentation code
import sougou_fenci

row_format = "{}\t{}"  # tab-separated "token tag" layout
resp = sougou_fenci.sougou("武汉市长江大桥")
for result_item in resp.result:
    print(row_format.format(result_item[0], result_item[1]))
# Result:
# 武汉市 n
# 长江 n
# 大桥 n
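If resp.result is an iterable of (token, tag) pairs, as the loop above suggests, it can be consumed with ordinary Python. A small sketch filtering for tokens tagged "n" (noun), under that assumption:

import sougou_fenci

resp = sougou_fenci.sougou("武汉市长江大桥")
# Keep only tokens tagged "n", matching the tags in the example output.
nouns = [token for token, tag in resp.result if tag == "n"]
print(nouns)
# Expected, given the example above: ['武汉市', '长江', '大桥']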
baidu segmentation code
import sougou_fenci

row_format = "{}\t{}"
resp = sougou_fenci.baidu("武汉市长江大桥")
for result_item in resp.result:
    print(row_format.format(result_item[0], result_item[1]))
# Result:
# 武汉市长江大桥 地名
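Both backends share the same call shape, so comparing their granularity side by side is straightforward. A short sketch, assuming (as above) that each call returns a response whose .result holds (token, tag) pairs:

import sougou_fenci

sentence = "武汉市长江大桥"
for name, fenci in (("sougou", sougou_fenci.sougou), ("baidu", sougou_fenci.baidu)):
    resp = fenci(sentence)
    # Print just the tokens to compare how finely each backend splits.
    print(name, [item[0] for item in resp.result])
# Per the examples above, sougou splits the sentence into 武汉市 / 长江 / 大桥,
# while baidu keeps the whole bridge name as a single place name (地名).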