A tokenizer focused on Spanish language.
Project description
IAR Tokenizer
The IAR (Iván Arias Rodríguez) Tokenizer is a tokenizer developed mainly for Spanish. It divides a text into paragraphs, each paragraph into sentences, and each sentence into a list of tokens.
More information to be added in the future...
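The paragraph → sentence → token pipeline described above can be sketched with a few regular expressions. Note this is an illustrative sketch only, not the IAR Tokenizer's actual API (which is undocumented here); the function name and splitting rules are assumptions, and a naive sentence splitter like this misses cases the real tokenizer presumably handles (abbreviations, ellipses, etc.).

```python
import re

def tokenize(text):
    # Hypothetical sketch of the pipeline described above;
    # NOT the IAR Tokenizer's real API.
    # 1) Split into paragraphs on blank lines.
    paragraphs = [p for p in re.split(r"\n\s*\n", text.strip()) if p]
    result = []
    for paragraph in paragraphs:
        # 2) Naively split into sentences after ., !, ? (and the
        #    Spanish inverted marks) when followed by whitespace.
        sentences = re.split(r"(?<=[.!?])\s+", paragraph)
        # 3) Split each sentence into word and punctuation tokens.
        result.append(
            [re.findall(r"\w+|[^\w\s]", s) for s in sentences]
        )
    return result

tokens = tokenize("Hola mundo. ¿Qué tal?\n\nAdiós.")
# Each paragraph is a list of sentences; each sentence, a list of tokens.
```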
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution

iar_tokenizer-1.0.6.tar.gz (11.2 kB)

Built Distribution

iar_tokenizer-1.0.6-py3-none-any.whl
Hashes for iar_tokenizer-1.0.6-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 66985deb46570d23e53e8a3936113c7158ddc948e99745ea8791711c1fc4e725
MD5 | eacbb8fa2758f4cd61e6f507afe82e3f
BLAKE2b-256 | 5a564802326209df979a705046eeefa3025f6664cfa371a1fcecba0518d7a894