ailia Tokenizer
Project description
ailia Tokenizer Python API
!! CAUTION !! “ailia” IS NOT OPEN SOURCE SOFTWARE (OSS). As long as the user complies with the conditions stated in the License Document, the Software may be used free of charge; however, the Software is fundamentally paid software.
About ailia Tokenizer
ailia Tokenizer is an NLP tokenizer that can be used from Unity or C++. It provides an API for converting text into tokens (sequences of symbols that AI models can handle) and for converting tokens back into text.
Traditionally, tokenization has been performed with the Transformers library on PyTorch. However, because Transformers only runs in Python, applications on Android or iOS have not been able to tokenize text directly.
ailia Tokenizer solves this problem by performing NLP tokenization natively, without relying on PyTorch or Transformers. This makes it possible to perform tokenization on Android and iOS as well.
Because ailia Tokenizer bundles MeCab and SentencePiece, complex tokenization such as that required for BERT Japanese or Sentence Transformers can be performed on the device.
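The Python API is designed to feel like a Transformers-style tokenizer. Below is a minimal sketch of the text-to-token round trip; the class name (WhisperTokenizer), from_pretrained(), encode(), and decode() are illustrative assumptions, so consult the API specification for the exact classes and signatures.

# Minimal sketch: text -> tokens -> text round trip.
# NOTE: the class and method names here are assumptions modeled on the
# Transformers-style convention; see the API specification for the
# actual ailia Tokenizer interface.
import ailia_tokenizer

tokenizer = ailia_tokenizer.WhisperTokenizer.from_pretrained()  # hypothetical class

tokens = tokenizer.encode("Hello, world!")  # text -> token IDs
print(tokens)

text = tokenizer.decode(tokens)             # token IDs -> text
print(text)

Because the tokenization runs inside the native ailia library rather than Python-only code, the same round trip is also available from the Unity and C++ APIs on Android and iOS.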
Install from pip
You can install the free evaluation package of ailia Tokenizer with the following command.
pip3 install ailia_tokenizer
Install from package
You can also install ailia Tokenizer from the downloaded package with the following commands.
python3 bootstrap.py
pip3 install ./
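After installing by either method, a quick import check confirms that the package loads correctly (this only verifies that the module imports).

python3 -c "import ailia_tokenizer; print('ailia_tokenizer loaded')"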
API specification
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file ailia_tokenizer-1.4.1.0.tar.gz.
File metadata
- Download URL: ailia_tokenizer-1.4.1.0.tar.gz
- Upload date:
- Size: 17.3 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 6632ea90e0d7471248793b788b774ea7a971e63a509e6210cd7c6f38735ff76c
MD5 | 5748e19236e98e0b5345c9673526e925
BLAKE2b-256 | 1a6f03fae078d7b9feb8edd5dc00d01522dd0f0a3ad8217a7505323226cfff65
File details
Details for the file ailia_tokenizer-1.4.1.0-py3-none-any.whl.
File metadata
- Download URL: ailia_tokenizer-1.4.1.0-py3-none-any.whl
- Upload date:
- Size: 17.5 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 629374675c11a80c7cc21411ad40283fa4c0b0a08aa03d41c3946b9bf30c9ef8
MD5 | e41604eaa7b3dfb39e5ce774f902f6ba
BLAKE2b-256 | 0cb4380fea407859c5f28f5e54ad0f679544f007ab90fe4b2fa7b96929f00dbe