
Grapheme Pair Encoding Tokenizer for Sinhala Language

This project has been archived by its maintainers. No new releases are expected.

Project description

Installation

pip install gpe-tokenizer

Basic Usage

from gpe_tokenizer import SinhalaGPETokenizer

Model Compatibility

For BERT

tokenizer = SinhalaGPETokenizer(model='bert')

For LLaMA

tokenizer = SinhalaGPETokenizer(model='llama')

For GPT

tokenizer = SinhalaGPETokenizer(model='gpt')

Tokenize

tokenizer.tokenize(text)
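
Putting the calls above together, a minimal end-to-end sketch; the sample Sinhala sentence and the assumption that tokenize() returns a list of string tokens are illustrative, not confirmed by the package:

from gpe_tokenizer import SinhalaGPETokenizer

# Build a tokenizer for one of the supported model families:
# 'bert', 'llama', or 'gpt'
tokenizer = SinhalaGPETokenizer(model='bert')

# Sample Sinhala input (an assumption for illustration)
text = "මම පොත කියවමි."

# tokenize() is the documented entry point; the exact return type
# (e.g. a list of string tokens) is an assumption
tokens = tokenizer.tokenize(text)
print(tokens)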

Tokenizer Training Details

Corpus size: 10 million sentences

Vocabulary size: 32,000

Training time: 13 hours 29 minutes



Download files

Download the file for your platform.

Source Distribution

gpe_tokenizer-0.1.3.tar.gz (581.3 kB)


Built Distribution


gpe_tokenizer-0.1.3-py3-none-any.whl (585.4 kB)


File details

Details for the file gpe_tokenizer-0.1.3.tar.gz.

File metadata

  • Download URL: gpe_tokenizer-0.1.3.tar.gz
  • Size: 581.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for gpe_tokenizer-0.1.3.tar.gz
SHA256: 73f9fdeb585f63c18b3daa6e1cc23c48f77459a65b6f1820abb49a7f42370e29
MD5: 0455e9c4ad55edadd7bd280ab5d831ef
BLAKE2b-256: 43e0361d9ec43a6fbae2220bcf3237ef4f5457ffe0705afece0823b601d35e4d

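As a sketch of how the digests above can be used, the following checks a downloaded gpe_tokenizer-0.1.3.tar.gz against the published SHA256 value using only the Python standard library; the local file path is an assumption:

import hashlib

# Published SHA256 digest for gpe_tokenizer-0.1.3.tar.gz (copied from above)
EXPECTED_SHA256 = "73f9fdeb585f63c18b3daa6e1cc23c48f77459a65b6f1820abb49a7f42370e29"

def sha256_of(path):
    # Stream the archive in chunks so it is never loaded into memory at once
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Assumed local path to the downloaded source distribution
actual = sha256_of("gpe_tokenizer-0.1.3.tar.gz")
print("OK" if actual == EXPECTED_SHA256 else "MISMATCH")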

Provenance

The following attestation bundles were made for gpe_tokenizer-0.1.3.tar.gz:

Publisher: python-publish.yml on Schizo00/GPE_Tokenizer

Attestations: values reflect the state of the release when it was signed and may no longer be current.

File details

Details for the file gpe_tokenizer-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: gpe_tokenizer-0.1.3-py3-none-any.whl
  • Size: 585.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for gpe_tokenizer-0.1.3-py3-none-any.whl
SHA256: 724d5064bc988b059529fe234a8b21bb4aa1ff28bb97cb8f3ae425792d49b618
MD5: 157a9ccde2318f0ed5e0ca267a797582
BLAKE2b-256: 2461d6719266acd2a3015dd5436eae701bcaef2b6908254273aeb2fc850488c3


Provenance

The following attestation bundles were made for gpe_tokenizer-0.1.3-py3-none-any.whl:

Publisher: python-publish.yml on Schizo00/GPE_Tokenizer

Attestations: values reflect the state of the release when it was signed and may no longer be current.
