Word segmentation tool
Project description
A Python word segmentation tool (a Node version is also available).
Installation
pip install sougou_fenci
Command line
Sogou segmentation
$ sougou_fenci 武汉市长江大桥
武汉市 n
长江 n
大桥 n
Baidu segmentation
$ baidu_fenci 武汉市长江大桥
武汉市长江大桥 地名
sougou segmentation code
import sougou_fenci

row_format = "{0} {1}"  # token and POS tag, space-separated
resp = sougou_fenci.sougou("武汉市长江大桥")
for result_item in resp.result:
    print(row_format.format(result_item[0], result_item[1]))
# Output:
# 武汉市 n
# 长江 n
# 大桥 n
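The resp.result field iterates as (token, POS tag) pairs, so collecting just the tokens is a one-line comprehension; a minimal sketch, assuming the response shape shown above:

import sougou_fenci

resp = sougou_fenci.sougou("武汉市长江大桥")
# Keep only the token strings, dropping the POS tags.
words = [item[0] for item in resp.result]
print(words)  # expected: ['武汉市', '长江', '大桥']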
baidu segmentation code

import sougou_fenci

row_format = "{0} {1}"  # token and POS tag, space-separated
resp = sougou_fenci.baidu("武汉市长江大桥")
for result_item in resp.result:
    print(row_format.format(result_item[0], result_item[1]))
# Output:
# 武汉市长江大桥 地名
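Both helpers return the same response shape, which makes it easy to compare the two backends on a single input; a minimal sketch, assuming both remote services respond:

import sougou_fenci

text = "武汉市长江大桥"
for name, segment in (("sougou", sougou_fenci.sougou), ("baidu", sougou_fenci.baidu)):
    # Print each backend's (token, POS tag) pairs side by side.
    resp = segment(text)
    print(name, [(item[0], item[1]) for item in resp.result])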
Project details
Download files
Source Distribution
nlp_fenci-1.2.0.tar.gz (2.8 kB)
Built Distribution
nlp_fenci-1.2.0-py3-none-any.whl
Hashes for nlp_fenci-1.2.0-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 8cf5733d9c40227bd78f00b16e057a9d6f250c4ae55a2fbea40a82b570fff6e3
MD5 | 2e64aa829f9bb5ee9b0ac70467dc4ad1
BLAKE2b-256 | e61b28cb86387fa21d360c72952f0576a163abfb36812e3bd58c19c961c13b8f