
Python3 module to tokenize English sentences.

Project description

tokenizesentences

Python3 module to tokenize English sentences. Based on D Greenberg's answer on Stack Overflow: https://stackoverflow.com/questions/4576077/python-split-text-on-sentences
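The linked answer describes a rule-based splitter: periods belonging to abbreviations, initials, and web domains are temporarily masked with a placeholder, real sentence terminators are marked, and the text is split on those marks. The sketch below only illustrates that idea in simplified form; the patterns and names are illustrative and are not the module's actual internals.

import re

# Illustrative patterns for periods that do not end a sentence;
# the real module's rule set may differ.
PREFIXES = r"(Mr|Mrs|Ms|Dr|Prof|St|Jr|Sr|Inc|Ltd|Co)[.]"
WEBSITES = r"[.](com|net|org|io|gov|edu)"
ACRONYMS = r"([A-Z][.][A-Z][.](?:[A-Z][.])?)"

def split_into_sentences(text):
    # Mask periods that should not terminate a sentence.
    text = re.sub(PREFIXES, r"\1<prd>", text)
    text = re.sub(WEBSITES, r"<prd>\1", text)
    text = re.sub(r"Ph\.D\.", "Ph<prd>D<prd>", text)
    text = re.sub(ACRONYMS, lambda m: m.group(1).replace(".", "<prd>"), text)
    # Mark real sentence boundaries, restore the masked periods, and split.
    text = re.sub(r"([.!?])\s+", r"\1<stop>", text)
    text = text.replace("<prd>", ".")
    return [s.strip() for s in text.split("<stop>") if s.strip()]

Covering every abbreviation this way takes a fairly long rule list, which is why the original answer maintains many more patterns than shown here.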

Installation

Install with pip

pip3 install -U tokenizesentences

Usage

In [1]: import tokenizesentences

In [2]: m = tokenizesentences.SplitIntoSentences()

In [3]: m.split_into_sentences(
    "Mr. John Johnson Jr. was born in the U.S.A but earned his Ph.D. in Israel before joining Nike Inc. as an engineer. He also worked at craigslist.org as a business analyst."
    )

Out[3]: 
[
    'Mr. John Johnson Jr. was born in the U.S.A but earned his Ph.D. in Israel before joining Nike Inc. as an engineer.',
    'He also worked at craigslist.org as a business analyst.'
]
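Outside an IPython session the same call works in a plain script. The file name below is only a placeholder for your own input.

import tokenizesentences

splitter = tokenizesentences.SplitIntoSentences()

# "article.txt" stands in for whatever text you want to split.
with open("article.txt", encoding="utf-8") as handle:
    text = handle.read()

for number, sentence in enumerate(splitter.split_into_sentences(text), start=1):
    print(number, sentence)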



Download files


Files for tokenizesentences, version 0.1
Filename: tokenizesentences-0.1.tar.gz (2.4 kB)
File type: Source
Python version: None
