Diverse and extensible generation decoding strategies for 🤗 transformers.

Project description

Decoders for 🤗 transformers

This package provides a convenient interface for extensible and customizable generation strategies (a.k.a. decoders) in 🤗 transformers.

It also provides extra implementations out of the box, such as the Stochastic Beam Search decoder.

Installation

pip install decoders

Usage

Simple use of the new interface:

from decoders import inject_supervitamined_decoders
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained('t5-small')
inject_supervitamined_decoders(model)
model.generate(...)
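
For instance, a complete round trip might look like this (the tokenizer handling is standard 🤗 transformers usage, not part of this package):

from decoders import inject_supervitamined_decoders
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained('t5-small')
model = T5ForConditionalGeneration.from_pretrained('t5-small')
inject_supervitamined_decoders(model)  # injected strategies become available through generate()

inputs = tokenizer('translate English to German: The house is wonderful.', return_tensors='pt')
outputs = model.generate(inputs.input_ids, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))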

Decoders

Stochastic Beam Search

This decoder is a stochastic variant of the Beam Search decoder. It is an HF implementation of the paper "Stochastic Beams and Where to Find Them" (Kool et al., 2019).

It can be used as follows:

from decoders import StochasticBeamSearchDecoder, inject_supervitamined_decoders
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained('t5-small')
inject_supervitamined_decoders(model)

decoder = StochasticBeamSearchDecoder()
outputs = model.generate(input_ids, generation_strategy=decoder,
                         num_beams=4, num_return_sequences=4,  # sampling without replacement: return all beams
                         length_penalty=0.0,  # disable the length penalty so probabilities stay correct
                         early_stopping=True,  # without a length penalty, worse sequences can be discarded early
                         return_dict_in_generate=True, output_scores=True,
                         # return_dict_in_generate and output_scores are required for SBS for now:
                         # the scores carry the Gumbel noise generated so far, which the logits
                         # processor reuses at each step
                         )
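
Continuing the snippet, the returned dict-like output exposes the sampled beams as outputs.sequences; a short sketch of decoding them (assumes a tokenizer loaded as in the usage example above):

for seq in outputs.sequences:  # one row per sampled sequence
    print(tokenizer.decode(seq, skip_special_tokens=True))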

Note that when sampling without replacement, you must set num_beams and num_return_sequences to the same value: the number of SWOR (sampling-without-replacement) samples you want to obtain.

Of course, the samples for the same input are not independent. If you want R different groups of SWOR samples of size n, replicate your batched input tensor R times, and then set num_beams and num_return_sequences to n.
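
A minimal sketch of that replication, reusing the decoder set up above (R, n and the token ids are illustrative):

import torch

R, n = 3, 4  # R independent groups of n SWOR samples each
input_ids = torch.tensor([[37, 423, 19, 1]])        # one tokenized input (illustrative ids)
replicated = input_ids.repeat_interleave(R, dim=0)  # each row repeated R times
outputs = model.generate(replicated, generation_strategy=decoder,
                         num_beams=n, num_return_sequences=n,
                         length_penalty=0.0, early_stopping=True,
                         return_dict_in_generate=True, output_scores=True)
# outputs.sequences now holds R * n rows: R groups of n SWOR samples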

See here for a full example.

Included goodies

BinaryCodeTransformer

The BinaryCodeTransformer is a custom transformer model that acts as a probabilistic binary sequence generator. Given a discrete probability distribution over all possible binary sequences of a given length, it generates a sequence of that length according to that distribution. It is useful for testing HF-compatible sampling-without-replacement decoders, such as the Stochastic Beam Search decoder.

The code maps each of the 2^n possible binary sequences of length n to its decimal (non-negative integer) representation. It then uses that number as the index of the corresponding probability in the input distribution. Since we are interested in autoregressive generation, the model computes the conditional probability of each next bit by summing the probabilities of all possible continuations of the current prefix.
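
A sketch of that computation (illustrative code, not the package's actual implementation; the MSB-first index mapping is an assumption):

import torch

def conditional_next_bit_probs(prefix, dist, n):
    # dist[k] = probability of the length-n binary sequence whose decimal value is k (MSB first)
    t = len(prefix)
    probs = torch.zeros(2)
    for bit in (0, 1):
        extended = prefix + [bit]
        # all sequences starting with `extended` occupy one contiguous index block
        base = int(''.join(map(str, extended)), 2) << (n - t - 1)
        width = 1 << (n - t - 1)
        probs[bit] = dist[base:base + width].sum()
    return probs / probs.sum()  # divide by P(prefix) to get the conditional

dist = torch.tensor([0.10, 0.00, 0.20, 0.10, 0.05, 0.15, 0.30, 0.10])  # over all 2^3 sequences
print(conditional_next_bit_probs([1, 0], dist, n=3))  # P(next bit | '10') -> tensor([0.2500, 0.7500])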

FakeTransformer

The FakeTransformer operates as a very simple Probabilistic Finite State Automaton. See here for a full explanation.
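
For intuition, a probabilistic finite state automaton is a state machine whose transitions each emit a token with some probability; a toy sketch (states and probabilities are purely illustrative, not the FakeTransformer's actual automaton):

import random

# state -> list of (next_state, emitted_token, probability); probabilities per state sum to 1
PFSA = {
    'S': [('A', 'a', 0.7), ('B', 'b', 0.3)],
    'A': [('END', '</s>', 1.0)],
    'B': [('A', 'a', 0.5), ('END', '</s>', 0.5)],
}

def sample_sequence(state='S'):
    tokens = []
    while state != 'END':
        r, acc = random.random(), 0.0
        for next_state, token, p in PFSA[state]:
            acc += p
            if r < acc:  # pick this transition
                tokens.append(token)
                state = next_state
                break
    return tokens

print(sample_sequence())  # e.g. ['a', '</s>'] or ['b', 'a', '</s>']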

Download files

Download the file for your platform.

Source Distribution

decoders-0.0.10.tar.gz (86.1 kB)

Uploaded Source

Built Distribution

decoders-0.0.10-py3-none-any.whl (108.2 kB)

Uploaded Python 3

File details

Details for the file decoders-0.0.10.tar.gz.

File metadata

  • Download URL: decoders-0.0.10.tar.gz
  • Upload date:
  • Size: 86.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for decoders-0.0.10.tar.gz

  • SHA256: 6110f15439410faa4039a0d37f457f0a26e99e3b40aec18ba0b70793c68c478c
  • MD5: 634858be19ba8e39c060d991bc444c1c
  • BLAKE2b-256: 49b557fe3abe6f8e36df4a1edea2c413e375d7634b4f3d9955dfff208feccf24



File details

Details for the file decoders-0.0.10-py3-none-any.whl.

File metadata

  • Download URL: decoders-0.0.10-py3-none-any.whl
  • Upload date:
  • Size: 108.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for decoders-0.0.10-py3-none-any.whl

  • SHA256: f8a2c62d4ed44ef4e97e50a6c00673dc0bb70b8f649de0873bcf77e7e4b64eb1
  • MD5: 3c1f58c8dfc12700f4d2d453768ab5f5
  • BLAKE2b-256: 177aa30bc77bcb4034d907812b8f76d0a9439efe0afd8fd9643c7b7467f9039f


