
Project description

Tokenize UK


A simple Python library to tokenize text into sentences and sentences into words. Small, fast and robust, with a Ukrainian flavour.

Features

  • Tokenize a given text into sentences

  • Tokenize a given sentence into words

  • Works well with accented characters (such as stress marks) and apostrophes

  • Also suitable for other languages (see the usage sketch below)
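A minimal usage sketch (this assumes the tokenize_sents and tokenize_words helpers exported by recent releases of the library; check the package documentation for the exact API):

    # Note: the function names below are an assumption about the public API
    from tokenize_uk import tokenize_sents, tokenize_words

    text = "Чорна хмара насувалась. Пішов дощ."

    # Split the text into sentences, then each sentence into word tokens
    for sentence in tokenize_sents(text):
        print(tokenize_words(sentence))

    # Expected output (roughly):
    # ['Чорна', 'хмара', 'насувалась', '.']
    # ['Пішов', 'дощ', '.']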

History

0.1.0 (2016-05-29)

  • First release on PyPI.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
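Most users will not need the archive directly; installing from PyPI with pip is the usual route (a minimal sketch, assuming a working pip installation):

    pip install tokenize_uk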

Source Distribution

tokenize_uk-0.2.0.tar.gz (22.9 kB)

File details

Details for the file tokenize_uk-0.2.0.tar.gz.

File metadata

  • Download URL: tokenize_uk-0.2.0.tar.gz
  • Upload date:
  • Size: 22.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for tokenize_uk-0.2.0.tar.gz
Algorithm Hash digest
SHA256 be9c043a00d43d2a6fd36fbd57c67b4a1f321fa778db107e9fa92a2eddb7c1fc
MD5 1c1c0f2b33fb272c433a11419dcf5d4b
BLAKE2b-256 ac2172abb0304b532e1b2d2473b50d8063ddd0943e3b3fe7e86b366bc4d02aa2

See the pip documentation for more details on using file hashes.
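To verify a downloaded archive against the SHA256 digest above, you can recompute the hash locally (a minimal sketch using Python's standard hashlib; the archive is assumed to be in the current directory):

    import hashlib

    EXPECTED_SHA256 = "be9c043a00d43d2a6fd36fbd57c67b4a1f321fa778db107e9fa92a2eddb7c1fc"

    # Read the downloaded archive and compute its SHA256 digest
    with open("tokenize_uk-0.2.0.tar.gz", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    print("OK" if digest == EXPECTED_SHA256 else "hash mismatch")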
