Text tokenizers.
Project description
totokenizers
A model-agnostic library to encode text into tokens and count them using different tokenizers.
install
pip install totokenizers
usage
from totokenizers.factories import TotoModelInfo, Totokenizer

model = "openai/gpt-3.5-turbo-0613"
desired_max_tokens = 250
tokenizer = Totokenizer.from_model(model)
model_info = TotoModelInfo.from_model(model)

# Example chat thread and function schemas (OpenAI-style dicts); replace with your own.
thread = [{"role": "user", "content": "Hello!"}]
functions = []

thread_length = tokenizer.count_chatml_tokens(thread, functions)
if thread_length + desired_max_tokens > model_info.max_tokens:
    # YourException is a placeholder for your application's own error type.
    raise YourException(thread_length, desired_max_tokens, model_info.max_tokens)
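If the thread is too long, one option is to drop the oldest messages until the prompt plus the desired completion fits the model's window. A minimal sketch reusing count_chatml_tokens from above; the keep-the-first-message trimming policy is an illustration, not part of totokenizers:

def trim_thread(tokenizer, thread, functions, budget):
    # Keep the first (e.g. system) message; drop the next-oldest until we fit.
    while len(thread) > 1 and tokenizer.count_chatml_tokens(thread, functions) > budget:
        thread = [thread[0]] + thread[2:]
    return thread

thread = trim_thread(tokenizer, thread, functions,
                     model_info.max_tokens - desired_max_tokens)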
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
totokenizers-1.5.1.tar.gz (756.6 kB)
Built Distribution
totokenizers-1.5.1-py3-none-any.whl (766.8 kB)
File details
Details for the file totokenizers-1.5.1.tar.gz.
File metadata
- Download URL: totokenizers-1.5.1.tar.gz
- Upload date:
- Size: 756.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.10
File hashes
Algorithm | Hash digest
---|---
SHA256 | 47cfc3dd87b8c44cdde0f79d638f66494e9794275d09016f64f286bf1eb58b29
MD5 | 2b64af45e9238d255717a74b1e453e85
BLAKE2b-256 | aa0bb17a36a2db9d2fe0e7f51c93ca57bd10287bf285d785e3925d2621e6f1ac
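The digests above can be used to check a download's integrity. A minimal sketch using Python's standard hashlib; the local filename is assumed to match the artifact listed above:

import hashlib

def sha256_of(path):
    # Hash the file in chunks so large downloads don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "47cfc3dd87b8c44cdde0f79d638f66494e9794275d09016f64f286bf1eb58b29"
assert sha256_of("totokenizers-1.5.1.tar.gz") == expected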
File details
Details for the file totokenizers-1.5.1-py3-none-any.whl.
File metadata
- Download URL: totokenizers-1.5.1-py3-none-any.whl
- Upload date:
- Size: 766.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.10
File hashes
Algorithm | Hash digest
---|---
SHA256 | 83ce10f5a538204bf4e5732d2873b8fccc1cb59c2c2fd8126e4b0d64e32c85cc
MD5 | f25daef88eaed26c223f5de66abbb245
BLAKE2b-256 | 055781921e7c571a2f132c2ac9a9cf017057223d1ac7df0d4061d44f2858d625
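These digests also work with pip's hash-checking mode. A minimal sketch of a pinned requirements.txt entry; listing both the wheel and sdist SHA256 digests lets pip install whichever artifact it resolves:

# requirements.txt
totokenizers==1.5.1 \
    --hash=sha256:83ce10f5a538204bf4e5732d2873b8fccc1cb59c2c2fd8126e4b0d64e32c85cc \
    --hash=sha256:47cfc3dd87b8c44cdde0f79d638f66494e9794275d09016f64f286bf1eb58b29

Install with hash checking enforced:

pip install --require-hashes -r requirements.txt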