Bacon-Net is a neural network architecture for building fully explainable neural networks that approximate arithmetic and gradient logic expressions. A Bacon-Net network can be used to discover an arithmetical or logical expression that approximates a given dataset, and the resulting network is precisely explainable.
This repository contains a family of 2-variable Bacon-Net implementations. Multiple Bacon-Net networks can be used together to expand the search space, and Bacon-Net networks can be stacked into a Bacon-Stack that handles an arbitrary number of variables.
The following table presents a list of famous formulas from different fields that were re-discovered by Bacon-Net from synthetic training data. All networks in this repository are implemented in Python using Keras.
Bacon-Poly2 can be used to discover 1-variable or 2-variable quadratic or linear polynomials. The following table lists samples of Bacon-Poly2 re-discovering well-known geometric and physics formulas.
NOTE: Coefficients and constant terms vary slightly between runs.
Bacon-LSP3 evaluates four possible gradient logic relationships between two variables in the space I = [0, 1]: full conjunction, full disjunction, product t-norm (medium hyperconjunction), and neutrality.
Bacon-LSP3 can be used to reason about the logic behind simple decisions, such as “a face image needs to show 2 eye features AND a mouth feature”.
Relationship | Formula |
---|---|
Full conjunction | min(A, B) |
Product t-norm | A * B |
Neutrality | (A + B) / 2 |
Full disjunction | max(A, B) |
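For reference, these four relationships can be computed directly; a minimal NumPy sketch (illustrative only, not part of Bacon-LSP3 itself):

```python
import numpy as np

# two gradient logic inputs, each in [0, 1]
A, B = np.array([0.9, 0.2]), np.array([0.4, 0.7])

print(np.minimum(A, B))  # full conjunction: min(A, B)
print(A * B)             # product t-norm (medium hyperconjunction): A * B
print((A + B) / 2)       # neutrality: (A + B) / 2
print(np.maximum(A, B))  # full disjunction: max(A, B)
```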
Bacon-Stack expands on Bacon-Net and allows an arbitrary number of variables. See Bacon-Stack Architecture for more details on the Bacon-Stack design.
Run the following command to install:
```
pip install bacon-net
```
```python
import math

from bacon.net import dataCreator
from bacon.nets.poly2 import poly2

# create a network from the Bacon-Net network family
net = poly2()

# optionally, use dataCreator to generate training data
# set params to 1 if only a single variable (a) is used
x, y = dataCreator.create(1000, 1, lambda x: math.pi * x[0] * x[0], params=1)

# train the network
net.fit(x[0], x[1], y)

# explain the network
m = net.explain(singleVariable=True)
print("f(x) = " + m.string(4))

# make a prediction (pass two parameters if two variables are used)
p = net.predict(2.4)

# make predictions on an array (pass two arrays if two variables are used)
p = net.predict([1.0, 2.3, 4.3])
```
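For two variables, a sketch along the same lines that rediscovers the rectangle-area formula f(a, b) = a * b. The exact two-variable signatures below are assumptions extrapolated from the comments above, not confirmed API:

```python
# hypothetical two-variable usage; assumes params=2 generates two inputs
x, y = dataCreator.create(1000, 1, lambda x: x[0] * x[1], params=2)

net = poly2()
net.fit(x[0], x[1], y)

m = net.explain()  # assumed default for the two-variable case
print("f(a, b) = " + m.string(4))
```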
To install Bacon-Net, along with the tools you need to develop and run tests, run the following command:
```
pip install -e .[dev]
```
To run all test cases, run the following command from the project's root folder:
```
pytest
```
Please see here for instructions on creating a new Bacon-Net network.
The idea behind Bacon-Net is simple: construct a network that performs linear interpolation over a group of selected terms, such as min(x, y) and sin(x^2), as shown in the following diagram (a code sketch follows the list below):
- Input layer contains the two variables. For gradient logic expressions, the inputs are expected to fall within [0, 1].
- Expansion layer defines the search space. Each node in this layer represents a candidate expression for the final approximation. Ideally, the candidates' function curves overlap as little as possible.
- Interpolation layer creates a linear interpolation of candidate terms from the expansion layer by adjusting the weights associated with each candidate.
- Aggregation layer calculates the interpolation result, which is compared against the training data.
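To make the layering concrete, here is a minimal Keras sketch of the idea. This is an illustrative toy, not the repository's actual implementation; the candidate terms and training setup are assumptions:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

class Expansion(keras.layers.Layer):
    """Expansion layer: maps (x, y) to a fixed set of candidate terms."""
    def call(self, inputs):
        x, y = inputs[:, 0:1], inputs[:, 1:2]
        # candidate terms define the search space (hypothetical choices)
        return tf.concat([x, y, x * y, x * x, y * y, tf.ones_like(x)], axis=1)

inputs = keras.Input(shape=(2,))
terms = Expansion()(inputs)
# interpolation + aggregation: a trainable weighted sum of the candidate terms
output = keras.layers.Dense(1, use_bias=False)(terms)
model = keras.Model(inputs, output)
model.compile(optimizer="adam", loss="mse")

# example: learn f(x, y) = 3x^2 + 2xy + 1
data = np.random.uniform(-1, 1, (1000, 2))
labels = 3 * data[:, 0] ** 2 + 2 * data[:, 0] * data[:, 1] + 1
model.fit(data, labels, epochs=200, verbose=0)

# the learned weights are the explanation: one coefficient per candidate term
print(model.layers[-1].get_weights()[0].flatten())
```

Reading the final weights directly is what makes the network precisely explainable: each weight is the coefficient of a known candidate term.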
It's also possible to feed the inputs to a family of Bacon-Net networks to search multiple expression spaces in parallel. A 1-active Selection layer is added on top to select the appropriate Bacon-Net in this case, as shown in the following diagram:
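One plausible way to realize 1-active selection (an assumption, not necessarily how this repository implements it) is a trainable softmax gate over the sub-networks' outputs:

```python
import tensorflow as tf
from tensorflow import keras

class Selection(keras.layers.Layer):
    """Blends sub-network outputs with a trainable gate that training
    can push toward a one-hot ('1-active') selection."""
    def build(self, input_shape):
        # one logit per candidate sub-network
        self.logits = self.add_weight(shape=(input_shape[-1],),
                                      initializer="zeros", trainable=True)
    def call(self, stacked_outputs):
        gate = tf.nn.softmax(self.logits)
        return tf.reduce_sum(stacked_outputs * gate, axis=-1, keepdims=True)
```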
A Bacon-Stack is recursively defined: a Bacon-Stack that handles n variables (denoted as B(n)) is constructed by feeding variable x(n) and the result of a B(n-1) into a B(2) network, which is a Bacon-Net, as shown in the following diagram:
The following diagram illustrates how a 5-variable Bacon-Stack is implemented using Keras custom layers and multi-input features.
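Conceptually, the recursion folds variables through a chain of 2-variable networks; a rough sketch, where the helper names are hypothetical rather than the repository's API:

```python
def bacon_stack(variables, b2):
    """B(n): fold n variables through a chain of 2-variable Bacon-Nets.

    b2 is a 2-variable Bacon-Net, i.e. B(2)."""
    result = variables[0]       # B(1) is just the variable itself
    for x in variables[1:]:
        result = b2(x, result)  # B(i) = B(2)(x_i, B(i-1))
    return result
```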
Bacon-Stack doesn't assume variables to be commutative. To explore permutations of variable order, a Permutation layer is added at the bottom of the Bacon-Stack, as shown in the following diagram:
NOTE: Without the permutation layer, Bacon-Stack is constrained to expressions that can be rewritten as binary trees with ordered parameters. For example, the network has difficulty understanding `(a+b)*c+d*e`, but is fine with `(a+b)*c+d+e`.
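To illustrate why permutations help, a tiny sketch that searches over variable orderings; the `score` function here is hypothetical:

```python
from itertools import permutations

def best_order(variables, score):
    """Return the variable ordering with the lowest fitting loss under
    `score`, a hypothetical function that trains a Bacon-Stack on one
    ordering and returns its loss."""
    return min(permutations(variables), key=score)
```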
When I was in high school, I encountered a BASIC program that used a brute-force method to discover an arithmetical expression approximating a given dataset. I remember the program was called “BACON”. However, my attempts to find references to it on the Internet have been unfruitful, so my memory may have failed me. Regardless, I’ve been wanting to recreate “BACON” all these years, and I finally got around to it during my week off.
As I research explainable AI, I see an opportunity to combine “BACON” with AI so that we can build precisely explainable AI networks, with the added benefit of a parallelizable, GPU-accelerated BACON implemented with modern technologies.
- Bacon-Poly3: for degree-3 polynomial expressions
- Bacon-Trig2: for degree-2 trigonometric functions
- Bacon-LSP6: explainable gradient logic network for decision making
- Bacon-CNN: explainability layer on top of a CNN network
- Bacon-H1: a combination of selected Bacon-Net networks
- Bacon-Cal1: a simple calculus solver
- Twitter: @HaishiBai2010
- LinkedIn: Haishi Bai