gstop
Generation Stopping Criteria for transformers Language Models
Installation
pip install gstop
Usage
from gstop import GenerationStopper, STOP_TOKENS_REGISTRY
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mistralai/Mistral-7B-v0.1"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Build a stopping criterion from the bundled Mistral stop-token registry
stopper = GenerationStopper(STOP_TOKENS_REGISTRY["mistral"])

input_ids = tokenizer("Hello, world!", return_tensors="pt").input_ids

# Pass the criterion to generate(); decoding stops at the registered tokens
out = model.generate(input_ids, stopping_criteria=stopper.criteria)

# format() strips the stop tokens from the decoded text
print(stopper.format(tokenizer.decode(out[0])))
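Conceptually, a stopping criterion of this kind just checks, after each decoding step, whether the generated token ids end with one of the registered stop sequences. The sketch below illustrates that core check in plain Python; it is a simplified illustration of the idea, not gstop's actual implementation, and the token ids are made up for the example.

def ends_with_stop_sequence(generated_ids, stop_sequences):
    """Return True if generated_ids ends with any registered stop sequence."""
    for seq in stop_sequences:
        if len(seq) <= len(generated_ids) and generated_ids[-len(seq):] == seq:
            return True
    return False

# Hypothetical ids: suppose the stop token "</s>" tokenizes to [2]
ends_with_stop_sequence([5, 9, 2], [[2]])            # True: generation stops
ends_with_stop_sequence([5, 9, 7], [[2], [13, 9]])   # False: keep generating

A real criterion receives the running batch of token ids from generate() and returns True to halt decoding; gstop packages this logic, plus per-model stop-token registries, behind the GenerationStopper interface shown above.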
Download files
Source Distribution: gstop-0.1.1.tar.gz (2.4 kB)
Built Distribution: gstop-0.1.1-py3-none-any.whl (2.5 kB)