
bytepiece-rs Python binding

Project description

rs-bytepiece

Install

pip install rs_bytepiece

Usage

from rs_bytepiece import Tokenizer

# Default model
tokenizer = Tokenizer()
# Or load a custom model from a path
tokenizer = Tokenizer("/path/to/model")

ids = tokenizer.encode("今天天气不错")  # encode text to a list of token ids
text = tokenizer.decode(ids)           # decode ids back to text
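
As a quick sanity check, the two calls above can be combined into a round trip. This is only an illustrative sketch using the methods shown in this README; since BytePiece tokenizes at the byte level, decoding the ids should reproduce the input string.

from rs_bytepiece import Tokenizer

tokenizer = Tokenizer()
text = "今天天气不错"
ids = tokenizer.encode(text)
# Decoding the ids should give back the original input
assert tokenizer.decode(ids) == text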

Performance

Performance is faster than the original implementation (compare the aho_rs column with aho_py and aho_cy below). I benchmarked on 《鲁迅全集》 (The Complete Works of Lu Xun), which has 625,890 characters, on my M2 machine with 16 GB of RAM. The length column is the input length in characters; all times are in milliseconds. A minimal timing sketch follows the table.

length   jieba      aho_py    aho_cy   aho_rs
100      17062.12   1404.37   564.31   112.94
1000     17104.38   1424.60   573.32   113.18
10000    17432.58   1429.00   574.93   110.03
100000   17228.17   1401.01   574.50   110.44
625890   17305.95   1419.79   567.78   108.54
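
For reference, a single timing in the style of the table can be reproduced with a sketch like the one below. The file path and chunk length are placeholders, not part of the project; only Tokenizer and encode from this package are used, the rest is standard library.

import time

from rs_bytepiece import Tokenizer

tokenizer = Tokenizer()

# Placeholder path: any large UTF-8 text file will do
with open("/path/to/corpus.txt", encoding="utf-8") as f:
    text = f.read()

chunk = text[:100000]  # one of the input lengths from the table

start = time.perf_counter()
ids = tokenizer.encode(chunk)
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"{len(chunk)} chars -> {len(ids)} ids in {elapsed_ms:.2f} ms")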

Download files

Download the file for your platform.

Source Distribution

rs_bytepiece-0.2.2.tar.gz (1.2 MB, source)

Built Distributions

rs_bytepiece-0.2.2-cp37-abi3-win_amd64.whl (3.8 MB, CPython 3.7+, Windows x86-64)

rs_bytepiece-0.2.2-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.5 MB, CPython 3.7+, manylinux: glibc 2.17+, x86-64)

rs_bytepiece-0.2.2-cp37-abi3-macosx_10_7_x86_64.whl (2.3 MB, CPython 3.7+, macOS 10.7+, x86-64)
