Bacon-Net
Bacon-Net is a neural network architecture for building fully explainable neural networks for arithmetic and gradient logic expression approximation. A Bacon-Net network can be used to discover an arithmetic or logical expression that approximates a given dataset, and the resulting network is precisely explainable.
This repository contains a family of 2-variable Bacon-Net implementations. Multiple Bacon-Net networks can be used together to expand the search space, and Bacon-Net networks can be stacked into a Bacon-Stack that handles an arbitrary number of variables.
The tables below present famous formulas from different fields that were re-discovered by Bacon-Net from synthetic training data. All networks in this repository are implemented in Python using Keras.
Bacon-Poly2
Bacon-Poly2 can be used to discover 1-variable or 2-variable linear or quadratic polynomials. The following table lists samples of Bacon-Poly2 re-discovering well-known geometric and physics formulas.
NOTE: Coefficients and constant terms will vary a little in different runs.
Formula | Expression
---|---
Area of a circle | z = 3.1416x^2
Area of an ellipse | z = 3.1416xy
Newton's equation of motion (displacement) | z = 4.905x^2 + 5.0x (initial speed = 5, acceleration = 9.81)
An arbitrary degree-2, 2-variable polynomial expression | z = 1.0x^2 + 4.0xy + 4.0y^2 + 5.0
Einstein's mass-energy equivalence | z = 0.0899x (with c normalized to 0.299792458 m/s)
Simple integral | z = 0.5x^2 + <C> (C is determined by the training data)
(The Bacon-Poly2 explanation for each formula is rendered as an image in the repository.)
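In other words, Bacon-Poly2 searches for the coefficients and constant of the general 2-variable quadratic form (a restatement of the above in the table's notation, not package output):
z = w1*x^2 + w2*xy + w3*y^2 + w4*x + w5*y + c
Linear and 1-variable polynomials are the special cases where some of the learned coefficients come out (approximately) zero.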
Bacon-LSP3
Bacon-LSP3 evaluates four possible gradient logic relationships between two variables in space I = [0, 1]: full conjunction, full disjunction, product t-norm (medium hyperconjunction) and neutrality.
Bacon-LSP3 can be used to reason about the logic behind simple decisions, such as "a face image needs to show 2 eye features AND a mouth feature".
Relationship | Expression
---|---
Full conjunction | min(A, B)
Product t-norm | A * B
Neutrality | (A + B) / 2
Full disjunction | max(A, B)
(The plot and Bacon-LSP3 explanation for each relationship are rendered as images in the repository.)
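For reference, the four relationships can be computed directly. A minimal NumPy sketch, independent of the package, with illustrative sample values:
import numpy as np
A = np.array([0.2, 0.8, 0.5])
B = np.array([0.9, 0.3, 0.5])
full_conjunction = np.minimum(A, B)   # min(A, B)
product_tnorm    = A * B              # A * B (medium hyperconjunction)
neutrality       = (A + B) / 2        # arithmetic mean of A and B
full_disjunction = np.maximum(A, B)   # max(A, B)
print(full_conjunction, product_tnorm, neutrality, full_disjunction, sep="\n")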
Bacon-Stack
Bacon-Stack expands on Bacon-Net to handle an arbitrary number of variables. See the Bacon-Stack Architecture section below for more details on the Bacon-Stack design.
Installation
Run the following command to install:
pip install bacon-net
Sample Usage
import math  # needed for the math.pi target expression below
from nets.poly2 import poly2
# create a network from the bacon-net network family
net = poly2()
# optionally, use dataCreator (the package's synthetic-data helper; import it from the
# package) to generate three 1-dimensional arrays: input a, input b, and output y
# set the singleVariable flag if only a single variable (a) is used
a, b, y = dataCreator.create(1000, 1, lambda a, b: math.pi * a * a, singleVariable=True)
# train the network
net.fit(a, b, y)
# explain the network
m = net.explain(singleVariable=True)
# make prediction (pass two parameters if two variables are used)
p = net.predict(2.4)
# make predictions on array (pass two arrays if two variables are used)
p = net.predict([1.0, 2.3, 4.3])
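For two-variable expressions, usage follows the same pattern. A minimal sketch, assuming the same API as above, with the ellipse-area formula from the Bacon-Poly2 table as the target:
# generate two-variable training data for z = pi * a * b (area of an ellipse)
a, b, y = dataCreator.create(1000, 1, lambda a, b: math.pi * a * b)
net2 = poly2()
# train, explain, and predict with both variables supplied
net2.fit(a, b, y)
m = net2.explain()
p = net2.predict(2.0, 3.0)
p = net2.predict([1.0, 2.3, 4.3], [0.5, 1.5, 2.5])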
Developing Bacon-Net
To install Bacon-Net, along with the tools you need to develop and run tests, run the following command:
pip install -e .[dev]
To run all test cases, run the following command from the project's root folder:
pytest
Please see here for instructions on creating a new Bacon-Net network.
Bacon-Net Architecture
The idea behind Bacon-Net is simple: construct a network that can do linear interpolation among a group of selected terms, such as min(x, y) and sin(x^2), as shown in the following diagram (a minimal code sketch follows the layer list):
- Input layer: contains the two input variables. For gradient logic expressions, inputs fall in the space I = [0, 1].
- Expansion layer: defines the search space. Each node in this layer represents a candidate expression for the final approximation. Ideally, there should be minimal overlap among the candidate function curves.
- Interpolation layer: creates a linear interpolation of the candidate terms from the expansion layer by adjusting the weights associated with each candidate.
- Aggregation layer: calculates the interpolation result, which is compared against the training data.
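A minimal Keras sketch of this idea (illustrative only, not the package's actual implementation; the candidate terms, layer names, and training settings are chosen for demonstration). It expands two inputs into quadratic candidate terms, learns a linear combination of them, and reads the learned coefficients back out as the explanation:
import numpy as np
from tensorflow.keras import Model, layers
# Input layer: the two variables
x = layers.Input(shape=(1,), name="x")
y = layers.Input(shape=(1,), name="y")
# Expansion layer: candidate terms that define the search space
x2 = layers.Lambda(lambda t: t * t, name="x_squared")(x)
xy = layers.Multiply(name="x_times_y")([x, y])
y2 = layers.Lambda(lambda t: t * t, name="y_squared")(y)
terms = layers.Concatenate(name="expansion")([x2, xy, y2, x, y])
# Interpolation + aggregation: one Dense unit whose kernel holds the coefficients
# of the discovered expression and whose bias holds the constant term
z = layers.Dense(1, name="interpolation")(terms)
model = Model(inputs=[x, y], outputs=z)
model.compile(optimizer="adam", loss="mse")
# synthetic data for z = 1.0x^2 + 4.0xy + 4.0y^2 + 5.0 (one of the table examples)
a = np.random.uniform(-1.0, 1.0, (1000, 1))
b = np.random.uniform(-1.0, 1.0, (1000, 1))
t = a ** 2 + 4 * a * b + 4 * b ** 2 + 5
model.fit([a, b], t, epochs=500, verbose=0)
# "explain" the network by reading back the learned weights; with enough training
# they approach 1, 4, 4, 0, 0 with a bias of about 5
w, c = model.get_layer("interpolation").get_weights()
print(dict(zip(["x^2", "xy", "y^2", "x", "y"], w.ravel().round(2))), "constant:", c.round(2))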
It's also possible to feed the inputs to a family of Bacon-Net networks to search multiple expression spaces in parallel. In this case, a 1-active Selection layer is added on top to select the appropriate Bacon-Net, as shown in the following diagram:
Bacon-Stack Architecture
A Bacon-Stack is recursively defined: a Bacon-Stack that handles n variables (denoted as B(n)) is constructed by feeding variable x(i) and the result of a B(n-1) into a B(2) network, which is a Bacon-Net, as shown in the following diagram:
The following diagram illustrates how a 5-variable Bacon-Stack is implemented using Keras custom layers and multi-input features.
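The recursion itself can be sketched as follows. This is a structural illustration only; make_bacon2 is a hypothetical stand-in for a 2-variable Bacon-Net (the real implementation wires up Keras layers rather than strings):
# hypothetical stand-in for a 2-variable Bacon-Net; it only records the structure
def make_bacon2(a, b):
    return f"B2({a}, {b})"

def bacon_stack(variables):
    # base case: B(2) is a plain 2-variable Bacon-Net
    if len(variables) == 2:
        return make_bacon2(variables[0], variables[1])
    # recursive case: B(n) feeds x(i) and the output of B(n-1) into a B(2) network
    return make_bacon2(variables[0], bacon_stack(variables[1:]))

print(bacon_stack(["x1", "x2", "x3", "x4", "x5"]))
# -> B2(x1, B2(x2, B2(x3, B2(x4, x5))))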
Bacon-Stack doesn't assume the target expression to be commutative in its variables. To explore permutations of variable orders, a Permutation layer is added at the bottom of the Bacon-Stack, as shown in the following diagram:
NOTE: Before the permutation layer was implemented, Bacon-Stack was constrained to expressions that can be rewritten as binary trees with ordered parameters. For example, the network would have difficulty understanding (a+b)*c+d*e, but would be fine with (a+b)*c+d+e.
Why the name "BACON"?
When I was in high school, I encountered a BASIC program that used a brute-force method to discover an arithmetical expression approximating a given dataset. I remember the program was called "BACON". However, my searches for such references on the Internet have been unfruitful, so my memory may have failed me. Regardless, I've been wanting to recreate "BACON" all these years, and I finally got around to it during my week off.
As I research explainable AI, I see an opportunity to combine "BACON" with AI so that we can build precisely explainable AI networks, with the added benefit of a parallelizable, GPU-accelerated BACON implemented with modern technologies.
Upcoming Bacon-Net networks
- Bacon-Poly3: for degree-3 polynomial expressions
- Bacon-Trig2: for degree-2 trigonometric functions
- Bacon-LSP6: explainable gradient logic network for decision making
- Bacon-CNN: explainability layer on top of a CNN network
- Bacon-H1: a combination of selected Bacon-Net networks
- Bacon-Cal1: a simple calculus solver
Contact author
- Twitter: @HaishiBai2010
- LinkedIn: Haishi Bai