TensorFlow Keras utilities for reducing boilerplate code.

Project description

params-flow provides an alternative style for defining your Keras model or layer configuration in order to reduce the boilerplate code related to passing and (de)serializing your model/layer configuration arguments.

params-flow encourages this:

import params_flow as pf

class MyDenseLayer(pf.Layer):      # using params_flow Layer/Model instead of Keras ones
  class Params(pf.Layer.Params):   # extend one or more base Params configurations
    num_outputs = None             # declare all configuration arguments
    activation = "gelu"            #   provide or override super() defaults
                                   # do not define an __init__()

  def build(self, in_shape):
    self.kernel = self.add_variable("kernel",
                                    [int(in_shape[-1]),
                                     self.params.num_outputs])     # access config arguments

which would be sufficient to pass the right configuration arguments to the super layer/model, as well as take care of (de)serialization, so you can concentrate on the build() or call() implementations, instead of writing boilerplate code like this:

from tensorflow.keras.layers import Layer

class MyDenseLayer(Layer):
  def __init__(self,
               num_outputs,            # put all of the layer configuration in the constructor
               activation = "gelu",    #     provide defaults
               **kwargs):              # allow base layer configuration to be passed to super
    self.num_outputs = num_outputs
    self.activation = activation
    super().__init__(**kwargs)

  def build(self, in_shape):
    self.kernel = self.add_variable("kernel",
                                    [int(in_shape[-1]),
                                     self.num_outputs])      # access config arguments

  def get_config(self):                # serialize layer configuration, __init__() is the deserializer
    config = {
      'num_outputs': self.num_outputs,
      'activation': self.activation
    }
    base_config = super().get_config()
    return dict(list(base_config.items()) + list(config.items()))
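To see what this boilerplate buys you, here is a minimal, framework-free sketch of the (de)serialization round trip Keras performs — a stub Base class stands in for tf.keras.layers.Layer, and all names are illustrative:

```python
class Base:
    """Stub standing in for tf.keras.layers.Layer (illustration only)."""
    def __init__(self, name=None):
        self.name = name

    def get_config(self):
        return {"name": self.name}

    @classmethod
    def from_config(cls, config):
        # Keras deserializes by calling __init__() with the saved config
        return cls(**config)


class MyDenseLayer(Base):
    def __init__(self, num_outputs, activation="gelu", **kwargs):
        self.num_outputs = num_outputs
        self.activation = activation
        super().__init__(**kwargs)

    def get_config(self):
        config = {"num_outputs": self.num_outputs,
                  "activation": self.activation}
        base_config = super().get_config()
        return dict(list(base_config.items()) + list(config.items()))


layer = MyDenseLayer(num_outputs=16, name="dense_1")
clone = MyDenseLayer.from_config(layer.get_config())   # round trip
```

Every configuration argument appears three times: in `__init__()`, as an attribute assignment, and in `get_config()` — which is exactly the repetition params-flow eliminates.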

NEWS

  • 04.Apr.2020 - refactored to use WithParams mixin from kpe/py-params. Make sure to use _construct() instead of __init__() in your Layer and Model subclasses. Breaking Change - _construct() signature has changed, please update your Layer and Model subclasses from:

    def _construct(self, params: Params):
        ...

    to:

    def _construct(self, **kwargs):
        super()._construct(**kwargs)
        params = self.params
        ...
  • 11.Sep.2019 - LookAhead optimizer wrapper implementation for efficient non-eager graph mode execution (TPU) added.

  • 05.Sep.2019 - LookAhead optimizer implementation as Keras callback added.

  • 04.Sep.2019 - RAdam optimizer implementation added.
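The _construct() change noted above can be sketched with a simplified stub mixin (illustrative only — not the actual WithParams implementation from kpe/py-params, and the stub does not merge inherited Params classes):

```python
from types import SimpleNamespace


class WithParamsStub:
    """Illustrative stand-in for the WithParams mixin (simplified)."""
    class Params:
        pass

    def __init__(self, **kwargs):
        # collect the declared Params defaults, then apply keyword overrides
        defaults = {k: v for k, v in vars(type(self).Params).items()
                    if not k.startswith("_")}
        defaults.update(kwargs)
        self.params = SimpleNamespace(**defaults)
        self._construct(**kwargs)

    def _construct(self, **kwargs):
        pass  # subclasses extend this instead of defining __init__()


class MyLayer(WithParamsStub):
    class Params:
        num_outputs = None
        activation = "gelu"

    def _construct(self, **kwargs):
        super()._construct(**kwargs)
        params = self.params               # merged configuration
        self.units = params.num_outputs
```

This is why subclasses must no longer define `__init__()`: the mixin owns it, builds `self.params` from the declared defaults and keyword overrides, and then hands control to `_construct()`.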

LICENSE

MIT. See License File.

Install

params-flow is on the Python Package Index (PyPI):

pip install params-flow

Usage

params-flow provides Layer and Model base classes that help reduce common boilerplate code in your custom Keras layers and models.

When subclassing a Keras Model or Layer, each configuration parameter has to be provided as an argument in __init__(). Keras relies on both __init__() and get_config() to make a model/layer serializable.

While idiomatic Python, this style of defining your Keras models/layers results in a lot of boilerplate code. params-flow provides an alternative by encapsulating all those __init__() configuration arguments in a dedicated Params instance (Params is kind of a “type-safe” python dict - see kpe/py-params). The model/layer specific configuration needs to be declared as a nested Model.Params/Layer.Params subclass, and your model/layer has to subclass params_flow.Model/params_flow.Layer instead of the Keras ones:

class BertEmbeddingsLayer(Layer):
  class Params(PositionEmbeddingLayer.Params):
    vocab_size              = None
    token_type_vocab_size   = 2
    hidden_size             = 768
    use_position_embeddings = True

class TransformerEncoderLayer(Layer):
  class Params(TransformerSelfAttentionLayer.Params,
               ProjectionLayer.Params):
    intermediate_size       = 3072
    intermediate_activation = "gelu"

This allows you to declare the model’s configuration by simply extending the Params of the underlying layers:

class BertModel(Model):
  class Params(BertEmbeddingsLayer.Params,
               TransformerEncoderLayer.Params):
    pass

N.B. The two code excerpts above are taken from kpe/bert-for-tf2, so check there for the details of a non-trivial params-flow based implementation (of BERT).
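For intuition, the Params inheritance shown above can be mimicked in plain Python — a simplified sketch, not the actual kpe/py-params implementation, with illustrative class names. Collecting class attributes along the method resolution order yields the merged default configuration:

```python
class Params:
    @classmethod
    def defaults(cls):
        # Walk the MRO from base to subclass so subclasses override supers
        merged = {}
        for klass in reversed(cls.__mro__):
            merged.update({k: v for k, v in vars(klass).items()
                           if not k.startswith("_")
                           and not callable(v)
                           and not isinstance(v, (classmethod, staticmethod))})
        return merged


class EmbeddingsParams(Params):
    vocab_size = None
    hidden_size = 768


class EncoderParams(Params):
    intermediate_size = 3072
    intermediate_activation = "gelu"


class ModelParams(EmbeddingsParams, EncoderParams):
    pass  # inherits the union of both configurations
```

Because Python’s multiple inheritance merges the class attributes, the composite ModelParams ends up with the union of all configuration defaults — which is how BertModel.Params above collects every parameter of its underlying layers without repeating them.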

Resources

  • kpe/py-params - A “type-safe” dict class for python.

  • kpe/bert-for-tf2 - BERT implementation using the TensorFlow 2 Keras API with the help of params-flow for reducing some of the common Keras boilerplate code needed when passing parameters to custom layers.

Download files

Download the file for your platform.

Source Distribution

params-flow-0.8.2.tar.gz (22.5 kB)

File details

Details for the file params-flow-0.8.2.tar.gz.

File metadata

  • Download URL: params-flow-0.8.2.tar.gz
  • Upload date:
  • Size: 22.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.7.1

File hashes

Hashes for params-flow-0.8.2.tar.gz

  • SHA256: f6799a53433af84013f1fc7a5da3864ca341c5c7c7765506aa21314cdb53ddf0
  • MD5: 6be1c75d140491fb9b17644ae22867f0
  • BLAKE2b-256: a995ff49f5ebd501f142a6f0aaf42bcfd1c192dc54909d1d9eb84ab031d46056
