Natural language tokenizer for documents in Python

Project description


Natural language tokenizer for English and Japanese documents in Python
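To illustrate what document tokenization involves, here is a minimal sketch using only the Python standard library. This is not the package's actual API (which is not shown on this page); it is a hypothetical `tokenize` function, and it only approximates the English case. Japanese generally requires morphological analysis (e.g. with MeCab), since Japanese text has no spaces between words.

```python
import re

def tokenize(text):
    """Split a document into word tokens (illustration only).

    Matches runs of word characters; re's Unicode-aware \\w also
    covers CJK characters, but it cannot segment Japanese words,
    which a real tokenizer would handle with a morphological
    analyzer.
    """
    return re.findall(r"\w+", text)

print(tokenize("Tokenizers split documents into words."))
```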

## License

The Unlicense

Project details

Release history


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

| Filename | Size | File type | Python version | Upload date |
| --- | --- | --- | --- | --- |
| nltokeniz-0.0.1-py3.5.egg | 4.8 kB | Egg | 3.5 | Jan 12, 2017 |
| nltokeniz-0.0.1-py3-none-any.whl | 4.1 kB | Wheel | py3 | Jan 12, 2017 |
| nltokeniz-0.0.1.tar.gz | 3.0 kB | Source | None | Jan 12, 2017 |
