
Project description

Word Segmentation Tool

Python version (a Node version is also available)

Installation

pip install nlp_segmentation
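
To confirm the install, standard pip tooling works; for example:

$ pip show nlp_segmentation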

Command-line usage

Sogou segmentation

$ sougou_fenci 武汉市长江大桥
武汉市 n
长江 n
大桥 n

Each output line is a token followed by its part-of-speech tag (n = noun).

Baidu segmentation

$ baidu_fenci 武汉市长江大桥
武汉        地名
长江大桥        地名

Baidu returns its tag names in Chinese; 地名 means "place name".

Sogou segmentation code

import nlp_segmentation

row_format = "{0:<10}{1}"  # left-align the word, then print its tag
resp = nlp_segmentation.sougou("武汉市长江大桥")
for result_item in resp.result:
    print(row_format.format(result_item[0], result_item[1]))

# Output:
# 武汉市         n
# 长江         n
# 大桥         n
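
Since resp.result iterates as (word, tag) pairs, it is easy to post-process. A minimal sketch, assuming the tuple layout shown above, that keeps only the tokens:

import nlp_segmentation

resp = nlp_segmentation.sougou("武汉市长江大桥")
tokens = [item[0] for item in resp.result]  # drop the tags, keep the words
print("/".join(tokens))  # 武汉市/长江/大桥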

Baidu segmentation code

import nlp_segmentation

row_format = "{0:<10}{1}"  # left-align the word, then print its tag
resp = nlp_segmentation.baidu("武汉市长江大桥")
for result_item in resp.result:
    print(row_format.format(result_item[0], result_item[1]))

# Output:
# 武汉市         地名
# 长江大桥         地名
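
Both functions expose the same resp.result interface, so switching back ends is a one-line change. A hypothetical convenience wrapper (the segment helper below is not part of the package):

import nlp_segmentation

def segment(text, engine="sougou"):
    """Dispatch to one of the two back ends and return the
    (word, tag) pairs as a plain list. Hypothetical helper,
    not part of nlp_segmentation itself."""
    func = nlp_segmentation.sougou if engine == "sougou" else nlp_segmentation.baidu
    return list(func(text).result)

print(segment("武汉市长江大桥", engine="baidu"))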

Download files

Download the file for your platform.

Source Distribution

nlp_segmentation-1.2.7.tar.gz (2.9 kB)

Built Distribution

nlp_segmentation-1.2.7-py3-none-any.whl (5.0 kB)
