
Diverse and extensible generation decoding strategies for 🤗 transformers.


Decoders for 🤗 transformers

This package provides a convenient interface for extensible and customizable generation strategies (a.k.a. decoders) in 🤗 transformers.

It also provides extra implementations out of the box, like the Stochastic Beam Search decoder.

Installation

pip install decoders

Usage

Simple use of the new interface:

from decoders import inject_supervitamined_decoders
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained('t5-small')
inject_supervitamined_decoders(model)  # enables passing custom decoders to generate()
model.generate(...)

Decoders

Stochastic Beam Search

This decoder is a stochastic version of the beam search decoder. It is an HF implementation of the Stochastic Beam Search paper (Kool et al., 2019).

It can be used as follows:

from decoders import StochasticBeamSearchDecoder, inject_supervitamined_decoders
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained('t5-small')
model = T5ForConditionalGeneration.from_pretrained('t5-small')
inject_supervitamined_decoders(model)

input_ids = tokenizer("translate English to German: Hello!", return_tensors="pt").input_ids  # example input
decoder = StochasticBeamSearchDecoder()
outputs = model.generate(input_ids, generation_strategy=decoder,
                         num_beams=4, num_return_sequences=4,  # sampling without replacement: return all beams
                         length_penalty=0.0,  # disable the length penalty so probabilities stay correct
                         early_stopping=True,  # without a length penalty, worse sequences can be discarded early
                         return_dict_in_generate=True, output_scores=True,
                         # return_dict_in_generate and output_scores are required for SBS for now,
                         # as the scores keep the previously generated Gumbel noise, which the
                         # logits processor uses
                         )

Note that when sampling without replacement, you must set num_beams and num_return_sequences to the same value: the number of samples without replacement (SWOR) that you want to obtain.

Of course, the samples for the same input are not independent. If you want R independent groups of n SWOR samples each, replicate your batched input tensor R times, and then set num_beams and num_return_sequences to n, as in the sketch below.
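For example, continuing from the snippet above, one way to draw R groups of n SWOR samples for a single input (the input string and the values of R and n are purely illustrative):

R, n = 3, 4  # 3 independent groups of 4 SWOR samples each
batch = tokenizer(["translate English to German: Hello!"], return_tensors="pt")
input_ids = batch.input_ids.repeat_interleave(R, dim=0)  # repeat the input R times along the batch dim

outputs = model.generate(input_ids, generation_strategy=decoder,
                         num_beams=n, num_return_sequences=n,
                         length_penalty=0.0, early_stopping=True,
                         return_dict_in_generate=True, output_scores=True)
# outputs.sequences has R * n rows: rows i*n to (i+1)*n - 1 form the i-th SWOR group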

See here for a full example.

Included goodies

BinaryCodeTransformer

The BinaryCodeTransformer is a custom transformer model that acts as a probabilistic binary sequence generator. Given a discrete probability distribution over all possible binary sequences of a given length, it generates a sequence of that length according to that distribution. It is useful for testing HF-compatible sampling-without-replacement decoders, like the Stochastic Beam Search decoder.

The code maps each of the 2^n possible binary sequences of length n to its decimal representation, a non-negative integer, and uses that integer as the index of the corresponding probability in the input distribution. Since we are interested in autoregressive generation, the model computes the conditional probability of each next bit by summing over the possible continuations of the sequence, as sketched below.
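A minimal sketch of that conditioning step (not the package's actual code; it assumes the mapping is most-significant-bit first, so all sequences sharing a prefix occupy one contiguous index block):

import torch

def next_bit_probs(dist: torch.Tensor, prefix: list[int], n: int) -> torch.Tensor:
    """Return [P(next bit = 0 | prefix), P(next bit = 1 | prefix)].

    dist has shape (2**n,): entry k is the probability of the length-n
    binary sequence whose MSB-first decimal value is k.
    """
    val = 0
    for b in prefix:                 # decimal value of the prefix
        val = (val << 1) | b
    block = 2 ** (n - len(prefix))   # number of sequences sharing this prefix
    lo = val * block
    p0 = dist[lo : lo + block // 2].sum()          # continuations whose next bit is 0
    p1 = dist[lo + block // 2 : lo + block].sum()  # continuations whose next bit is 1
    return torch.stack([p0, p1]) / (p0 + p1)

dist = torch.full((8,), 1 / 8)       # uniform over all length-3 sequences
print(next_bit_probs(dist, [1], 3))  # tensor([0.5000, 0.5000])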

FakeTransformer

The FakeTransformer operates as a very simple probabilistic finite-state automaton (PFSA). See here for a full explanation.
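As a generic illustration of the idea (a toy state graph, not the FakeTransformer's actual states or probabilities, which are described in the linked explanation), a PFSA walks a state graph by sampling each transition:

import random

# toy state graph: state -> (successor states, transition probabilities)
TRANSITIONS = {
    "s0": (["s1", "s2"], [0.7, 0.3]),
    "s1": (["s2", "end"], [0.5, 0.5]),
    "s2": (["end"], [1.0]),
}

def sample_path(start="s0", max_steps=10):
    path, state = [start], start
    for _ in range(max_steps):
        if state == "end":
            break
        successors, probs = TRANSITIONS[state]
        state = random.choices(successors, probs)[0]  # sample the next transition
        path.append(state)
    return path

print(sample_path())  # e.g. ['s0', 's1', 'end']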

