
A framework for flexibly developing beyond backpropagation.

Project description

Zenkai

Zenkai is a framework built on PyTorch for deep learning researchers

  • to explore a wider variety of machine architectures
  • to explore learning algorithms that do not rely on gradient descent

It is fundamentally based on the concept of target propagation. In target propagation, targets are propagated to each layer of the network by inverting, or approximating an inversion of, the layer above. Thus, each layer has its own target. While Zenkai allows for more than just target propagation, it is built around the concept of each layer having its own target.
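As a rough illustration (not Zenkai's actual API), the core idea of target propagation for a two-layer network can be sketched in NumPy, using the pseudo-inverse as the approximate inversion. All names here (W1, W2, t_h) are hypothetical:

```python
import numpy as np

# Hypothetical two-layer network: y = W2 @ relu(W1 @ x).
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))

x = rng.normal(size=3)
h = np.maximum(W1 @ x, 0.0)   # hidden activation
y = W2 @ h                    # network output

t = np.array([1.0, -1.0])     # target for the output layer

# Approximate inversion of the top layer via the pseudo-inverse:
# the hidden layer's target t_h is the hidden activity that would
# (approximately) have produced the output target.
t_h = np.linalg.pinv(W2) @ t

# Now each layer has its own target (t for layer 2, t_h for layer 1)
# and can be trained locally, without backpropagating gradients.
```

This is only a sketch of the idea; Zenkai generalizes it so that each layer can compute its target with any step_x operation, not just a pseudo-inverse.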

Installation

pip install zenkai

Brief Overview

Zenkai consists of several packages that make it easier to define and train deep learning machines beyond what is straightforward with PyTorch.

  • zenkai: The core package. It contains all modules necessary for defining a learning machine.
  • zenkai.tansaku: Package for adding more exploration to learning. Contains a framework for defining and creating population-based optimizers.
  • zenkai.ensemble: Package used to create ensemble models within the Zenkai framework.
  • zenkai.targetprop: Package used to create systems that use target propagation.
  • zenkai.feedback: Package for performing various kinds of feedback alignment.
  • zenkai.scikit: Package wrapping scikit-learn to create Zenkai LearningMachines from scikit-learn modules.
  • zenkai.utils: A variety of utility functions used by the rest of the library, for example for getting and setting parameters or gradients.
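To give a sense of what parameter get/set utilities look like, here is a minimal sketch using plain PyTorch rather than zenkai.utils itself (the Zenkai helpers may differ in naming and behavior):

```python
import torch
import torch.nn as nn
from torch.nn.utils import parameters_to_vector, vector_to_parameters

# A rough illustration, using plain PyTorch, of the kind of
# parameter get/set helpers zenkai.utils provides.
model = nn.Linear(3, 2)

# Flatten all parameters (weight: 2x3, bias: 2) into a single vector.
vec = parameters_to_vector(model.parameters())

# Perturb the vector and write it back into the module's parameters.
# Population-based optimizers often operate on flat vectors like this.
vector_to_parameters(vec + 0.01, model.parameters())
```

Utilities like these are what make it practical to write optimizers that treat a module's parameters as a single flat vector.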

Further documentation is available at https://zenkai.readthedocs.io

Usage

Zenkai's primary feature is the LearningMachine, which aims to make defining learning machines flexible. The design is similar to (Lua) Torch, in that there is a forward method, a parameter update method similar to accGradParameters(), and a target propagation method similar to updateGradInput(). The primary usage is to implement these methods.

Here is a (non-working) example:

class MyLearner(zenkai.LearningMachine):
    """A LearningMachine couples the learning mechanics for the machine with its internal mechanics."""

    def __init__(
        self, module: nn.Module, step_theta: zenkai.StepTheta,
        step_x: zenkai.StepX, loss: zenkai.Loss
    ):
        super().__init__()
        self.module = module
        # step_theta is used to update the parameters of the
        # module
        self._step_theta = step_theta
        # step_x is used to update the inputs to the module
        self._step_x = step_x
        self.loss = loss

    def step(
        self, x: IO, t: IO, state: State, **kwargs
    ):
        # Update the parameters of the machine.
        # x (IO): the input to the machine
        # t (IO): the target for the machine
        return self._step_theta(x, t)

    def step_x(
        self, x: IO, t: IO, state: State, **kwargs
    ) -> IO:
        # Update the target for the incoming machine.
        # step_x is analogous to updateGradInput in Torch except
        # it calculates a "new target" for the incoming layer.
        return self._step_x(x, t)

    def forward_nn(self, x: zenkai.IO, state: State) -> zenkai.IO:
        return self.module(x.f)


my_learner = MyLearner(...)

# set the "learning mode" so that backward() also performs a step
zenkai.set_lmode(my_learner, zenkai.LMode.WithStep)

for x, t in dataloader:
    loss(my_learner(x), t).backward()

Learning machines can be stacked by making use of step_x in the training process.

Note: Since Zenkai is set up to make use of Torch's backpropagation machinery, step and step_x are invoked when backward() is called on the loss.

class MyMultilayerLearner(LearningMachine):
    """A LearningMachine couples the learning mechanics for the machine with its internal mechanics."""

    def __init__(
        self, layer1: LearningMachine, layer2: LearningMachine
    ):
        super().__init__()
        self.layer1 = layer1
        self.layer2 = layer2

        # use these hooks to indicate a dependency on another method
        self.add_step(StepXDep(self, 't1'))
        self.add_step_x(ForwardDep(self, 'y1'))

    def step(
        self, x: IO, t: IO, state: State, **kwargs
    ):
        # Update the parameters of each layer.
        # x (IO): the input to the machine
        # t (IO): the target for the machine

        self.layer2.step(state._y1, t, state.sub('layer2'))
        self.layer1.step(x, state._t1, state.sub('layer1'))

    def step_x(
        self, x: IO, t: IO, state: State, **kwargs
    ) -> IO:
        # Update the target for the incoming machine by
        # calculating a "new target" for each layer in turn.
        t1 = state._t1 = self.layer2.step_x(state._y1, t, state.sub('layer2'))
        return self.layer1.step_x(x, t1, state.sub('layer1'))

    def forward_nn(self, x: zenkai.IO, state: State) -> zenkai.IO:

        # store each layer's output in the state for use by step and step_x
        x = state._y1 = self.layer1(x, state.sub('layer1'))
        x = state._y2 = self.layer2(x, state.sub('layer2'))
        return x

my_learner = MyMultilayerLearner(...)
zenkai.set_lmode(my_learner, zenkai.LMode.WithStep)

for x, t in dataloader:
    assessment = loss(my_learner(x), t)
    assessment.backward()

Build documentation

cd docs
make clean
sphinx-autogen source/api.rst
make html

Tutorials

Tutorials are being made available at https://github.com/short-greg/zenkai_tutorials.

Contributing

To contribute to the project

  1. Fork the project
  2. Create your feature branch
  3. Commit your changes
  4. Push to the branch
  5. Open a pull request

License

This project is licensed under the MIT License - see the LICENSE.md file for details.

Citing this Software

If you use this software in your research, we request that you cite it. We have provided a CITATION.cff file in the root of the repository.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

zenkai-0.0.9.tar.gz (63.7 kB)

Uploaded Source

Built Distribution

zenkai-0.0.9-py3-none-any.whl (80.4 kB)

Uploaded Python 3

File details

Details for the file zenkai-0.0.9.tar.gz.

File metadata

  • Download URL: zenkai-0.0.9.tar.gz
  • Size: 63.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.8.12

File hashes

Hashes for zenkai-0.0.9.tar.gz
  • SHA256: 23f780e7d25d258fd5a78262c2bebd5404e8bc810a8c474e84686a208fcdf030
  • MD5: 717c524b55596417c1ffea7611180524
  • BLAKE2b-256: ccd643b3dfcb0463bd07f50bb432ff1c2a80acc89199e5c6991b7d965974b567


File details

Details for the file zenkai-0.0.9-py3-none-any.whl.

File metadata

  • Download URL: zenkai-0.0.9-py3-none-any.whl
  • Size: 80.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.8.12

File hashes

Hashes for zenkai-0.0.9-py3-none-any.whl
  • SHA256: a3aff59de9b786947f0d9f2d4c1a7f2e640f16ba50aa8e22540c14430d500e9c
  • MD5: 612e651a9d2393e6ad748e20c2118184
  • BLAKE2b-256: bab4214edb9f68e40c758ea976174de67b49695966020773d555af3d5bb8110f

