Yoctol Natural Language Tokenizer

Project description

# tokenizer-hub

yoctol 乂卍oO煞氣ㄟtokenizerOo卍乂

Tokenizers share the same interface as Jieba:

```python
from tokenizer_hub import XXX_tokenizer

tokenizer = XXX_tokenizer()
tokenizer.lcut('我来到北京清华大学')
# > ['我', '来到', '北京', '清华大学']
```
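Because every tokenizer exposes Jieba's interface, any of them can be swapped in behind the same two calls. A minimal sketch of that interface, using a hypothetical whitespace-based tokenizer (`WhitespaceTokenizer` is not part of tokenizer-hub, just an illustration of the contract):

```python
class WhitespaceTokenizer:
    """Hypothetical tokenizer exposing Jieba's lcut interface."""

    def lcut(self, sentence):
        # Jieba's lcut returns a list of token strings;
        # this stand-in simply splits on whitespace.
        return sentence.split()


tokenizer = WhitespaceTokenizer()
print(tokenizer.lcut('hello tokenizer hub'))  # ['hello', 'tokenizer', 'hub']
```

Any real tokenizer from the package should be usable the same way: construct it, then call `lcut` on a sentence to get back a list of tokens.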

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tokenizer-hub-0.0.0.tar.gz (7.0 kB)

Uploaded Source

Built Distributions

tokenizer_hub-0.0.0-py3.5.egg (33.9 kB)

Uploaded Source

tokenizer_hub-0.0.0-py3-none-any.whl (13.8 kB)

Uploaded Python 3

File details

Details for the file tokenizer-hub-0.0.0.tar.gz.

File metadata

  • Download URL: tokenizer-hub-0.0.0.tar.gz
  • Upload date:
  • Size: 7.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.0.0 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/3.5.2

File hashes

Hashes for tokenizer-hub-0.0.0.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `05aa40c27fa89009e636b20851adbe2228e0913b0c3db323247eb8e05931f53b` |
| MD5 | `655374ed5c8da97510b89fdad1071c41` |
| BLAKE2b-256 | `b2cb5ad4639661d82024ecfdb43dd6f9972aab7b38b9bb53ab0926b9b36a16e4` |

See more details on using hashes here.
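To check a downloaded archive against a published SHA256 digest, the standard library's `hashlib` is enough. A small sketch; the expected digest below is the tar.gz value from the table above, and the filename assumes the archive sits in the current directory:

```python
import hashlib


def sha256_of(path, chunk_size=8192):
    """Compute a file's SHA256 hex digest, streaming in chunks
    so large archives are not loaded into memory at once."""
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.hexdigest()


# Digest published for the source distribution above:
expected = '05aa40c27fa89009e636b20851adbe2228e0913b0c3db323247eb8e05931f53b'
# sha256_of('tokenizer-hub-0.0.0.tar.gz') == expected  -> True if intact
```

`pip` performs the same comparison automatically when a hash is pinned in a requirements file (e.g. `tokenizer-hub==0.0.0 --hash=sha256:05aa40...`).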

File details

Details for the file tokenizer_hub-0.0.0-py3.5.egg.

File metadata

  • Download URL: tokenizer_hub-0.0.0-py3.5.egg
  • Upload date:
  • Size: 33.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.0.0 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/3.5.2

File hashes

Hashes for tokenizer_hub-0.0.0-py3.5.egg

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `95a36d198daf0f6adbfb1d135360e048ce7d8070e215b672d00c658ef3793ef7` |
| MD5 | `029ece854fa9bcbead2bb847c05c9fd2` |
| BLAKE2b-256 | `39a7b1f394962b4426877b74134877e4893fcd94a6d9412400d18acd3caf4bd3` |


File details

Details for the file tokenizer_hub-0.0.0-py3-none-any.whl.

File metadata

  • Download URL: tokenizer_hub-0.0.0-py3-none-any.whl
  • Upload date:
  • Size: 13.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.0.0 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/3.5.2

File hashes

Hashes for tokenizer_hub-0.0.0-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `a96d01e2c4243c5e9716216d8b6765b522949d90ff5af00fa205197881352c32` |
| MD5 | `75f83be135c66bd8dc89ed7255c5427a` |
| BLAKE2b-256 | `e06dec703a2d0717cc4c14b556cea01dd0e24cac6dfb5f0f5386ff161973901e` |

