
ailia Tokenizer

Project description

ailia Tokenizer Python API

!! CAUTION !! "ailia" IS NOT OPEN SOURCE SOFTWARE (OSS). As long as the user complies with the conditions stated in the License Document, the Software may be used free of charge; however, it is fundamentally paid software.

About ailia Tokenizer

ailia Tokenizer is an NLP tokenizer that can be used from Python, Unity, or C++. It provides an API for converting text into tokens (sequences of symbols that AI models can process) and for converting tokens back into text.

Traditionally, tokenization has been performed with the Transformers library. However, because Transformers runs only in Python, applications on Android and iOS have not been able to perform tokenization directly.

ailia Tokenizer solves this problem by performing NLP tokenization directly, without relying on Transformers. This makes tokenization possible on Android and iOS as well.

Because ailia Tokenizer includes MeCab and SentencePiece, complex tokenizations, such as those used by Japanese BERT or Sentence Transformers, can be performed on the device.
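
As an illustration of the text-to-token and token-to-text conversion described above, the sketch below assumes a Transformers-style interface with from_pretrained, encode, and decode methods. The class name, method names, and file path are illustrative assumptions only; consult the API specification linked below for the actual interface.

# Illustrative sketch only: the class and method names are assumptions
# modeled on a Transformers-style API, not the confirmed ailia Tokenizer
# interface. See the API specification for the actual names.
import ailia_tokenizer

# Hypothetical: load a WordPiece tokenizer for Japanese BERT from a local vocabulary file.
tokenizer = ailia_tokenizer.BertJapaneseWordPieceTokenizer.from_pretrained("vocab.txt")

text = "こんにちは、世界"
tokens = tokenizer.encode(text)      # text -> token IDs the model can consume
print(tokens)

decoded = tokenizer.decode(tokens)   # token IDs -> text
print(decoded)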

Install from pip

You can install the ailia Tokenizer free evaluation package with the following command.

pip3 install ailia_tokenizer
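
After installation, you can confirm that the package is importable with the following command.

python3 -c "import ailia_tokenizer"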

Install from package

You can install ailia Tokenizer from the downloaded package with the following commands.

python3 bootstrap.py
pip3 install ./

API specification

https://github.com/axinc-ai/ailia-sdk



Download files

Download the file for your platform.

Source Distribution

ailia_tokenizer-1.3.1.0.tar.gz (17.3 MB)

Uploaded Source

Built Distribution

ailia_tokenizer-1.3.1.0-py3-none-any.whl (17.5 MB)

Uploaded Python 3

File details

Details for the file ailia_tokenizer-1.3.1.0.tar.gz.

File metadata

  • Download URL: ailia_tokenizer-1.3.1.0.tar.gz
  • Upload date:
  • Size: 17.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.6

File hashes

Hashes for ailia_tokenizer-1.3.1.0.tar.gz

  • SHA256: ad0f1154f534668500b525021041656f65b067572086681abce13d1c6d7bacff
  • MD5: 44064416f71bbf71a4a4432ba52a39b1
  • BLAKE2b-256: 38b042bfc4114bbce71c9d48a67bab5f1fa8cd14bd59b7036e82cae69d8627e0

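To check a downloaded archive against the SHA256 digest above, the standard-library hashlib module can be used; the local file path below is an assumption.

import hashlib

# Compare the digest of the downloaded source distribution (path assumed)
# with the SHA256 value published above.
with open("ailia_tokenizer-1.3.1.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = "ad0f1154f534668500b525021041656f65b067572086681abce13d1c6d7bacff"
print("OK" if digest == expected else "MISMATCH")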

File details

Details for the file ailia_tokenizer-1.3.1.0-py3-none-any.whl.

File metadata

  • Download URL: ailia_tokenizer-1.3.1.0-py3-none-any.whl
  • Size: 17.5 MB
  • Tags: Python 3

File hashes

Hashes for ailia_tokenizer-1.3.1.0-py3-none-any.whl

  • SHA256: 2233d2d1d7b2248f47b37c80bc4af4a8203f5311bfd98e0b9bb244125badddb4
  • MD5: 512581cb860850fb5015e11a95749b4d
  • BLAKE2b-256: 5d1a8d189c0cb206e9e54e2a0cd31048393cf28cd6594c458f15707f4275e021

