Word Segmentation Tool (分词工具)
Project description
Word segmentation tool
Python version (a Node.js version is also available)
Installation
pip install nlp_fenci
Command line
Sogou segmentation
$ sougou_fenci 武汉市长江大桥
武汉市 n
长江 n
大桥 n
Baidu segmentation
$ baidu_fenci 武汉市长江大桥
武汉 地名
长江大桥 地名
Sogou segmentation in Python
import sougou_fenci

row_format = "{}\t{}"  # token and part-of-speech tag, tab-separated
resp = sougou_fenci.sougou("武汉市长江大桥")
for result_item in resp.result:
    print(row_format.format(result_item[0], result_item[1]))

# Output:
# 武汉市 n
# 长江 n
# 大桥 n
Baidu segmentation in Python
import sougou_fenci

row_format = "{}\t{}"  # token and tag, tab-separated
resp = sougou_fenci.baidu("武汉市长江大桥")
for result_item in resp.result:
    print(row_format.format(result_item[0], result_item[1]))

# Output:
# 武汉市 地名
# 长江大桥 地名
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
nlp_fenci-1.2.3.tar.gz (2.8 kB)
Built Distribution
nlp_fenci-1.2.3-py3-none-any.whl (3.7 kB)
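Either archive can also be fetched without installing it; a minimal sketch using pip's download subcommand (the version pin and destination directory are just illustrative):

$ pip download nlp_fenci==1.2.3 --no-deps -d ./dist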
File details
Details for the file nlp_fenci-1.2.3.tar.gz.
File metadata
- Download URL: nlp_fenci-1.2.3.tar.gz
- Upload date:
- Size: 2.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.13.0 setuptools/41.1.0 requests-toolbelt/0.9.1 tqdm/4.33.0 CPython/3.6.2
File hashes
Algorithm | Hash digest
---|---
SHA256 | 8df2bb8c1383a3239d9a5c670895ce797b3bd19266c00aed1428d260f9991343
MD5 | d98a25ef6b75bb622ca9d10f1e585926
BLAKE2b-256 | 7f0570ae6299ff2b7261823b098fe8fa085206b3ca1ec2bb201a9d2951358e05
File details
Details for the file nlp_fenci-1.2.3-py3-none-any.whl.
File metadata
- Download URL: nlp_fenci-1.2.3-py3-none-any.whl
- Upload date:
- Size: 3.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.13.0 setuptools/41.1.0 requests-toolbelt/0.9.1 tqdm/4.33.0 CPython/3.6.2
File hashes
Algorithm | Hash digest
---|---
SHA256 | b7760145a5586852a25361b14c2c143b289bf60fab303d0f6da2402038e10779
MD5 | ad5dc527a894b0e2c689b2aa8c9b35f7
BLAKE2b-256 | 037bb0b1710e0a4bd6d31b3b2e3a3d2c2ab468414c30259b149fc81365efb098
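To check that a downloaded archive matches the digests published above, hash it locally and compare; a minimal sketch for the source distribution, assuming GNU coreutils' sha256sum is available (on macOS, shasum -a 256 works the same way):

$ sha256sum nlp_fenci-1.2.3.tar.gz
8df2bb8c1383a3239d9a5c670895ce797b3bd19266c00aed1428d260f9991343  nlp_fenci-1.2.3.tar.gz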