
A tokenizer focused on the Spanish language.

Project description

IAR Tokenizer

The IAR (Iván Arias Rodríguez) Tokenizer is a tokenizer developed mainly for Spanish. It is able to divide a text into paragraphs, these into sentences, and each sentence into a list of tokens.

The creation of this software was supported by the Spanish Ministry of Education, Culture and Sport via a doctoral grant to Iván Arias Rodríguez (FPU16/04039). It has also been funded by research projects TIN2014-52010-R (RedR+Human) and TIN2017-88092-R (CetrO+Spec).

You can install this software with pip:

pip install iar-tokenizer
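
The page does not document the package's public API, so the snippet below is only a plain-Python sketch of the three-level split described above (text into paragraphs, paragraphs into sentences, sentences into token lists); it is not the library's own algorithm or API. The module name iar_tokenizer (taken from the wheel name) and the helper naive_split are assumptions for illustration; after installing, inspect the real entry points with help(iar_tokenizer).

# Plain-Python sketch of the text -> paragraphs -> sentences -> tokens split
# described above. This is NOT iar_tokenizer's algorithm or API; it only
# illustrates the intended output shape. The top-level module name
# iar_tokenizer is assumed from the wheel name; discover the real entry
# points with help(iar_tokenizer) after installation.
import re

def naive_split(text):
    """Hypothetical helper: returns a list of paragraphs, each a list of
    sentences, each a list of tokens."""
    paragraphs = [p.strip() for p in re.split(r"\n\s*\n", text) if p.strip()]
    return [
        [re.findall(r"\w+|[^\w\s]", s)
         for s in re.split(r"(?<=[.!?])\s+", p) if s.strip()]
        for p in paragraphs
    ]

sample = "Hola, mundo. Esto es una prueba.\n\n¿Segundo párrafo?"
for paragraph in naive_split(sample):
    print(paragraph)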

If you change or adapt a function, change its name (for example, add your initials after the name).

License: CC BY-NC-SA 4.0. This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (http://creativecommons.org/licenses/by-nc-sa/4.0/).

This code is provided as is, without warranty of any kind. In no event shall the authors or copyright holder be liable for any claim, damages, or other liability.





Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide on installing packages.

Source Distribution

iar-tokenizer-1.0.12.tar.gz (14.1 kB, source)

Built Distribution

iar_tokenizer-1.0.12-py3-none-any.whl (13.4 kB, Python 3)

File details

Details for the file iar-tokenizer-1.0.12.tar.gz.

File metadata

  • Download URL: iar-tokenizer-1.0.12.tar.gz
  • Upload date:
  • Size: 14.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.8.1 pkginfo/1.5.0.1 requests/2.22.0 requests-toolbelt/0.9.1 tqdm/4.61.1 CPython/3.7.2

File hashes

Hashes for iar-tokenizer-1.0.12.tar.gz

  • SHA256: a6be88ae33fc3217a5819063208827dd64a7300af747c98382dc03576ff06616
  • MD5: cf64bb8cc619b9695e6a2581a20f67bf
  • BLAKE2b-256: fb21b7e5656a61ef0659b25e52d9a5efb2ab38500ba2845daf877a989fd5e3ae

See more details on using hashes here.
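
As a quick local check, the SHA256 digest above can be recomputed with Python's standard hashlib module. This is a minimal sketch; it assumes the sdist was downloaded into the current working directory under its original name.

# Recompute the SHA256 of the downloaded sdist and compare it against the
# digest listed above. Assumes iar-tokenizer-1.0.12.tar.gz is in the
# current working directory.
import hashlib

EXPECTED_SHA256 = "a6be88ae33fc3217a5819063208827dd64a7300af747c98382dc03576ff06616"

with open("iar-tokenizer-1.0.12.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else "MISMATCH: " + digest)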

File details

Details for the file iar_tokenizer-1.0.12-py3-none-any.whl.

File metadata

  • Download URL: iar_tokenizer-1.0.12-py3-none-any.whl
  • Upload date:
  • Size: 13.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.8.1 pkginfo/1.5.0.1 requests/2.22.0 requests-toolbelt/0.9.1 tqdm/4.61.1 CPython/3.7.2

File hashes

Hashes for iar_tokenizer-1.0.12-py3-none-any.whl

  • SHA256: 839e56d5bf1c1cd64e4856997738ff84d0f0fd3b67d2a3538db6743cea7175a4
  • MD5: 9bbb2658318e5fa560b2b24ff22a26db
  • BLAKE2b-256: 15b667f4fe48c064e844e7f6792927903f8b50f2bd64da2a492d6e31417d38b5

See more details on using hashes here.
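
For reproducible installs, the SHA256 digests listed on this page can be pinned in a requirements file and enforced with pip's hash-checking mode. This is a sketch; both digests are copied from the tables above, and the file name requirements.txt is just a conventional choice.

# requirements.txt
iar-tokenizer==1.0.12 \
    --hash=sha256:a6be88ae33fc3217a5819063208827dd64a7300af747c98382dc03576ff06616 \
    --hash=sha256:839e56d5bf1c1cd64e4856997738ff84d0f0fd3b67d2a3538db6743cea7175a4

pip install --require-hashes -r requirements.txt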
