Diverse and extensible generation decoding libraries for transformers.

Decoders for 🤗 transformers

This package provides a convenient interface for extensible and customizable generation strategies (a.k.a. decoders) in 🤗 transformers.

It also provides extra implementations out of the box, such as the Stochastic Beam Search decoder.

Installation

pip install decoders

Usage

Simple use of the new interface:

from decoders import inject_supervitamined_decoders
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained('t5-small')
inject_supervitamined_decoders(model)  # patch the model so generate() accepts custom generation strategies
model.generate(...)

Decoders

Stochastic Beam Search

This decoder is a stochastic version of the Beam Search decoder. It is an HF-compatible implementation of the paper Stochastic Beam Search (Kool et al., 2019).

It can be used as follows:

from decoders import StochasticBeamSearchDecoder, inject_supervitamined_decoders
from transformers import T5ForConditionalGeneration, T5Tokenizer

model = T5ForConditionalGeneration.from_pretrained('t5-small')
inject_supervitamined_decoders(model)

tokenizer = T5Tokenizer.from_pretrained('t5-small')
input_ids = tokenizer("translate English to German: How are you?", return_tensors="pt").input_ids  # example input

decoder = StochasticBeamSearchDecoder()
outputs = model.generate(input_ids, generation_strategy=decoder,
                         num_beams=4, num_return_sequences=4,  # sampling without replacement: return all beams
                         length_penalty=0.0,  # disable the length penalty so that probabilities are correct
                         return_dict_in_generate=True, output_scores=True, early_stopping=True,
                         # early stopping is safe here: without a length penalty, worse sequences can be discarded
                         # return_dict_in_generate and output_scores are required for SBS for now, since the
                         # scores carry the previously generated Gumbel noise used by the logits processor
                         )

Note that when sampling without replacement, you must set num_beams and num_return_sequences to the same value: the number of SWOR (sampling-without-replacement) samples you want to obtain.

Of course, the samples for the same input are not independent. If you want R independent groups of SWOR samples of size n, you should replicate your batched input tensor R times, and then set num_beams and num_return_sequences to n, as sketched below.
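For illustration, a minimal sketch of this replication trick, reusing input_ids, model, and decoder from the snippet above; the values of R and n are our illustrative choices, not library defaults:

R, n = 3, 4  # R independent groups, each of n SWOR samples

# (batch, seq_len) -> (batch * R, seq_len): each input row is repeated R times,
# and each copy runs its own independent stochastic beam search.
expanded_ids = input_ids.repeat_interleave(R, dim=0)

outputs = model.generate(expanded_ids, generation_strategy=decoder,
                         num_beams=n, num_return_sequences=n,
                         length_penalty=0.0, early_stopping=True,
                         return_dict_in_generate=True, output_scores=True)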

See here for a full example.

Included goodies

BinaryCodeTransformer

The BinaryCodeTransformer is a custom transformer model that acts as a probabilistic binary sequence generator. Given a discrete probability distribution over all possible binary sequences of a given length, it generates sequences of that length according to that distribution. It is useful for testing HF-compatible sampling-without-replacement decoders, such as the Stochastic Beam Search decoder.

The code maps each of the 2^n possible binary sequences of length n to its decimal integer representation, and uses that number as the index of the corresponding probability in the input distribution. Since we are interested in autoregressive generation, the model computes each conditional next-bit probability by summing the probabilities of all possible continuations of the current prefix.
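To make that conditioning concrete, here is a small standalone sketch of the computation (ours, not the package's actual code; all names are illustrative):

import torch

n = 3                                   # sequence length
probs = torch.full((2**n,), 1 / 2**n)   # uniform distribution over all 2^n sequences

def next_bit_probs(prefix: str) -> torch.Tensor:
    # P(next bit = b | prefix) is proportional to the total probability mass
    # of the full sequences that start with prefix followed by b.
    mass = torch.zeros(2)
    for i in range(2**n):
        seq = format(i, f'0{n}b')       # decimal index -> binary string, e.g. 5 -> '101'
        if seq.startswith(prefix):
            mass[int(seq[len(prefix)])] += probs[i]
    return mass / mass.sum()

print(next_bit_probs('10'))             # tensor([0.5000, 0.5000]) under the uniform distribution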

FakeTransformer

The FakeTransformer operates as a very simple Probabilistic Finite State Automaton (PFSA): at each step, the next token is drawn from a fixed distribution determined by the current state. See here for a full explanation.
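For intuition, a generic PFSA sketch; the states and probabilities below are made up for illustration and are not the FakeTransformer's actual automaton:

import torch

# Illustrative transition table: state -> (next symbols, their probabilities).
transitions = {
    '<s>': (['a', 'b'],    torch.tensor([0.7, 0.3])),
    'a':   (['b', '</s>'], torch.tensor([0.5, 0.5])),
    'b':   (['a', '</s>'], torch.tensor([0.2, 0.8])),
}

def sample_sequence(state='<s>'):
    out = []
    while True:
        symbols, p = transitions[state]
        state = symbols[torch.multinomial(p, 1).item()]  # draw the next symbol
        if state == '</s>':
            return out
        out.append(state)

print(sample_sequence())  # e.g. ['a', 'b']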

