A tokenizer focused on the Spanish language.
The IAR (Iván Arias Rodríguez) Tokenizer is a tokenizer developed mainly for Spanish. It divides a text into paragraphs, each paragraph into sentences, and each sentence into a list of tokens.
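As an illustration of this paragraph → sentence → token pipeline, here is a minimal sketch in plain Python. It is not the package's own API (which is not documented here), just a naive regex-based version of the same three-level split:

```python
import re

def tokenize(text):
    """Illustrative three-level split: paragraphs, then sentences,
    then tokens. Not the iar_tokenizer API, just a rough sketch."""
    # Paragraphs are separated by one or more blank lines.
    paragraphs = [p.strip() for p in re.split(r"\n\s*\n", text) if p.strip()]
    result = []
    for paragraph in paragraphs:
        # Naive sentence split after ., ! or ? followed by whitespace.
        sentences = re.split(r"(?<=[.!?])\s+", paragraph)
        # Tokens: runs of word characters, or single punctuation marks.
        result.append([re.findall(r"\w+|[^\w\s]", s) for s in sentences])
    return result

texto = "Hola, mundo. ¿Cómo estás?\n\nSegundo párrafo."
print(tokenize(texto))
```

A real Spanish tokenizer must also handle abbreviations, clitics, and inverted punctuation marks (¿ ¡), which this sketch treats as ordinary punctuation tokens.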
More information to be added in the future...
| Filename (size) | File type | Python version |
|---|---|---|
| iar_tokenizer-1.0.10-py3-none-any.whl (21.6 kB) | Wheel | py3 |
| iar_tokenizer-1.0.10.tar.gz (11.2 kB) | Source | None |