A tokenizer focused on the Spanish language.
Project description
IAR Tokenizer
The IAR (Iván Arias Rodríguez) Tokenizer is a tokenizer developed mainly for Spanish. It splits a text into paragraphs, each paragraph into sentences, and each sentence into a list of tokens.
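The paragraph → sentence → token pipeline described above can be sketched with Python's standard `re` module. This is a generic illustration of the idea only, not the IAR Tokenizer's actual API (its real function names and behavior may differ):

```python
import re

def tokenize(text):
    """Illustrative three-level pipeline: text -> paragraphs -> sentences -> tokens.
    NOTE: a generic sketch, not the IAR Tokenizer's real API."""
    # Paragraphs: separated by one or more blank lines.
    paragraphs = [p for p in re.split(r"\n\s*\n", text) if p.strip()]
    result = []
    for p in paragraphs:
        # Sentences: split after terminal punctuation followed by whitespace.
        sentences = re.split(r"(?<=[.!?])\s+", p.strip())
        # Tokens: runs of word characters, or single punctuation marks
        # (keeps Spanish inverted marks like ¿ and ¡ as separate tokens).
        result.append([re.findall(r"\w+|[^\w\s]", s) for s in sentences if s])
    return result
```

For example, `tokenize("Hola mundo. ¿Qué tal?\n\nAdiós.")` yields two paragraphs, the first containing two tokenized sentences.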
More information to be added in the future...
Download files
Source Distributions
No source distribution files are available for this release.
Built Distribution
Hashes for iar_tokenizer-1.0.0-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 4c394ee333d4dd0c26bf2a6573a1975fe7bf8416a0c01715a19b1f8dbc378353
MD5 | e068e32b3baea20b7f89824a2d36479c
BLAKE2b-256 | 37c84b8e37d27cc2041a3920e1ca3526eec1d21ecbd9668d9017d7a5d889b5c1