The tokenizer vocabulary is created with subword-nmt.
Project description
Genz Tokenize
Install via pip (from PyPI):
pip install genz-tokenize
Usage
from genz_tokenize import Tokenize

# Load the tokenizer from the subword-nmt vocabulary and BPE merge codes
tokenize = Tokenize('vocab.txt', 'bpe.codes')
# Encode a batch of sentences, with sequences limited to 10 tokens
print(tokenize(['sinh_viên công_nghệ', 'hello'], maxlen=10))
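Because the vocabulary and merge codes come from subword-nmt, they can be generated from a raw corpus before being passed to Tokenize. The sketch below uses subword-nmt's Python helpers (learn_bpe and BPE); the corpus name train.txt, the 10000 merge operations, and the "token frequency" vocabulary format are illustrative assumptions, not something genz-tokenize documents.

import codecs
from collections import Counter

from subword_nmt.learn_bpe import learn_bpe
from subword_nmt.apply_bpe import BPE

# 1. Learn BPE merge operations from a raw training corpus (train.txt is a placeholder)
with codecs.open('train.txt', encoding='utf-8') as corpus, \
        codecs.open('bpe.codes', 'w', encoding='utf-8') as codes_out:
    learn_bpe(corpus, codes_out, num_symbols=10000)

# 2. Segment the corpus with the learned codes and count the subword vocabulary
with codecs.open('bpe.codes', encoding='utf-8') as codes_in:
    bpe = BPE(codes_in)
counts = Counter()
with codecs.open('train.txt', encoding='utf-8') as corpus:
    for line in corpus:
        counts.update(bpe.process_line(line).split())

# 3. Write the vocabulary as "token frequency" pairs (the format subword-nmt's
#    get-vocab command produces; assumed to be what Tokenize expects)
with codecs.open('vocab.txt', 'w', encoding='utf-8') as vocab_out:
    for token, freq in counts.most_common():
        vocab_out.write('{} {}\n'.format(token, freq))

The resulting bpe.codes and vocab.txt are the two files passed to the Tokenize constructor above.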
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
genz-tokenize-1.0.1.tar.gz (3.9 kB)
Built Distribution
genz_tokenize-1.0.1-py3-none-any.whl (4.2 kB)
File details
Details for the file genz-tokenize-1.0.1.tar.gz.
File metadata
- Download URL: genz-tokenize-1.0.1.tar.gz
- Upload date:
- Size: 3.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.7.1 importlib_metadata/4.10.0 pkginfo/1.8.2 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.7.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | c76f72c44d0b91293c4a5462218b1c49ce2a490a917cf709c19d8ea16c434a38
MD5 | 4431cff3d81dd32fc639ee794ffc6009
BLAKE2b-256 | 98e100892fa81e92b42325179f232a1b1eca41be1cede045d9edfc7ebe224c5d
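The digests above can be used to check a downloaded archive before installing it. Below is a minimal sketch using Python's standard hashlib module; it assumes the sdist has been downloaded to the current directory under its published name.

import hashlib

# Expected SHA256 digest of genz-tokenize-1.0.1.tar.gz (from the table above)
EXPECTED = 'c76f72c44d0b91293c4a5462218b1c49ce2a490a917cf709c19d8ea16c434a38'

with open('genz-tokenize-1.0.1.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print('OK' if digest == EXPECTED else 'Hash mismatch!')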
File details
Details for the file genz_tokenize-1.0.1-py3-none-any.whl.
File metadata
- Download URL: genz_tokenize-1.0.1-py3-none-any.whl
- Upload date:
- Size: 4.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.7.1 importlib_metadata/4.10.0 pkginfo/1.8.2 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.7.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | b1a1d105577c24ab283d0cc43923df8ffd910dabff78908a54bdc05614f62f8f
MD5 | a110872798e422f70326f9c0632bf125
BLAKE2b-256 | 41e16147fd5284aa23217ba6b1ac3dbf7de9925d8f8ff231f28bc6eac0a38a5d