A tokenizer focused on the Spanish language.
IAR Tokenizer
The IAR (Iván Arias Rodríguez) Tokenizer is a tokenizer developed mainly for Spanish. It can divide a text into paragraphs, each paragraph into sentences, and each sentence into a list of tokens.
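The three-level split described above (text → paragraphs → sentences → tokens) can be illustrated with a minimal sketch. This is not the iar_tokenizer API, which is undocumented here, but a naive regex-based pipeline showing the same structure:

```python
import re

def segment(text):
    """Illustrative segmentation pipeline (not the iar_tokenizer API):
    split a text into paragraphs, paragraphs into sentences,
    and each sentence into a list of tokens."""
    paragraphs = [p for p in re.split(r"\n\s*\n", text.strip()) if p]
    result = []
    for p in paragraphs:
        # Naive sentence split: break after ., ! or ? followed by whitespace.
        sentences = re.split(r"(?<=[.!?])\s+", p)
        # Tokenize each sentence into words (Unicode-aware, so Spanish
        # accents and ñ are kept inside tokens) and punctuation marks.
        result.append([re.findall(r"\w+|[^\w\s]", s) for s in sentences])
    return result

texto = "Hola, mundo. ¿Cómo estás?\n\nSegundo párrafo."
print(segment(texto))
```

A real Spanish tokenizer must additionally handle abbreviations, ordinals, and inverted punctuation marks (¿ ¡), which this sketch treats as plain punctuation tokens.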
More information to be added in the future...
Download files
Download the file for your platform.
Filename | Size | File type | Python version
---|---|---|---
iar_tokenizer-1.0.10-py3-none-any.whl | 21.6 kB | Wheel | py3
iar_tokenizer-1.0.10.tar.gz | 11.2 kB | Source | None
Hashes for iar_tokenizer-1.0.10-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | bc28adc449f9afcfabf6e439e87064e737129c19bec3dc6dbc4989d894afae6d
MD5 | 20fa152d03875c1488c7a8404707ee9e
BLAKE2-256 | 4cf5680c56f689d67d6471469cca6d4be7a10673966204dd47ccf3901b136e48
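To check a downloaded file against the digests above, you can hash it locally with Python's standard `hashlib`; a minimal sketch (the local filename in the commented call is a placeholder for wherever you saved the wheel):

```python
import hashlib

def file_digests(path):
    """Compute the SHA256, MD5, and BLAKE2-256 digests of a file,
    matching the three algorithms listed on the download page.
    PyPI's "BLAKE2-256" is BLAKE2b with a 32-byte (256-bit) digest."""
    hashers = {
        "SHA256": hashlib.sha256(),
        "MD5": hashlib.md5(),
        "BLAKE2-256": hashlib.blake2b(digest_size=32),
    }
    with open(path, "rb") as f:
        # Read in chunks so large files do not need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            for h in hashers.values():
                h.update(chunk)
    return {name: h.hexdigest() for name, h in hashers.items()}

# Placeholder path; compare the printed digests against the table above.
# print(file_digests("iar_tokenizer-1.0.10-py3-none-any.whl"))
```

If any digest differs from the published value, the download is corrupt or has been tampered with and should not be installed.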