A simple numpy based neural network library inspired by Tensorflow/Keras.
Project description
<img id="logo" src="https://github.com/OmPanchal/Bren/blob/main/bren/B.png" ></img>
bren is a custom numpy based library, powered by automatic differentiation and inspired by Tensorflow/Keras, which allows users to build small-scale neural networks. Its design is analogous to, yet simpler than, the Keras API, allowing users to produce, train and save their own models, with custom components, without having to learn an entirely new structure.
bren is part of a sequence of neural-network-from-scratch projects and a successor to neural-network-from-scratch-v2, with one major update being the integration of automatic differentiation. Automatic differentiation allows derivatives to be determined in real time during backpropagation (through the computation graphs produced by br.autodiff and br.Variable), and removes the need for users to couple mathematical computation with pre-written derivatives, as was required in the previous projects.
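To make the idea concrete, here is a minimal sketch of reverse-mode automatic differentiation in plain Python. This is purely illustrative and is not bren's actual implementation: each operation records its inputs and local derivatives, and `backward` walks the recorded graph applying the chain rule.

```python
class Var:
    """Scalar value that records the operations applied to it."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent Var, local gradient)
        self.grad = 0.0

    def __mul__(self, other):
        # d(a*b)/da = b and d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __add__(self, other):
        # d(a+b)/da = d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def backward(self, seed=1.0):
        # Accumulate gradients backwards through the recorded graph.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = x * x + x      # y = x^2 + x
y.backward()
print(x.grad)      # 7.0, i.e. 2x + 1 evaluated at x = 3
```

bren applies the same principle over numpy arrays rather than scalars, which is what makes whole-tensor gradients cheap to compute.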
Install
To install the latest version of bren, run:

```shell
pip install bren
```
Your first bren program (examples tend to import bren as br):

```python
import bren as br

A = br.Variable([1, 2, 3])
print(A + 2)  # <Variable value=[3. 4. 5.] dtype=float64>
```
br.autodiff (Automatic Differentiation)
bren is an automatic differentiation driven neural network library: backpropagation uses br.Graph to find the derivatives of the trainable parameters with respect to the loss. This is governed by br.Variable, which keeps track of any operations performed on the Variable object. br.autodiff produces a computation graph of these recorded operations, and this graph can be traversed backwards to determine the derivative of a given dy value with respect to a given dx value. The gradient is returned as a br.Constant (the derivative of a br.Constant is always 0). br.Variable and br.Constant owe their performance to numpy's swift array computation.
```python
import bren as br

A = br.Variable([1, 2, 3], dtype="float64")

with br.Graph() as g:
	B = A ** 2  # any computation performed on a Variable will be tracked

print(g.grad(B, [A]))  # [<Constant value=[2. 4. 6.] dtype=float64>]
```
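The analytic gradient above (2x) can be sanity-checked with a central finite difference, entirely independently of bren:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
eps = 1e-6
# Central difference approximation of d(x^2)/dx
numeric = ((x + eps) ** 2 - (x - eps) ** 2) / (2 * eps)
print(numeric)  # ~[2. 4. 6.], matching the Constant returned by g.grad
```

This kind of numerical check is a handy way to validate any autodiff result.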
br.nn (Neural Networks)
bren's modular design is heavily inspired by the Keras API, allowing users to produce networks comprising a variety of customisable components. Users can also produce their own custom components (layers, activations, initialisers, losses, metrics) through the respective base classes. br.nn allows ready-made components to be imported and custom components to be produced.
Your first neural network with bren. First, prepare the dataset:

```python
import bren as br

# Test data - the XOR dataset
X = br.Variable([[0, 0], [0, 1], [1, 0], [1, 1]])
Y = br.Variable([0, 1, 1, 0])
```
Next, initialise the Sequential model which will be trained to fit the data. It takes a list of layers through which the data is passed consecutively. In the example below, a neural network with three fully connected (FC) layers is initialised, each with the "tanh" activation.
```python
# Initialise the model
model = br.nn.models.Sequential(layers=[
	br.nn.layers.FC(2, activation="tanh"),
	br.nn.layers.FC(32, activation="tanh"),
	br.nn.layers.FC(1, activation="tanh"),
])
```
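For intuition, each FC layer computes an affine transform followed by its activation. The numpy sketch below shows the forward pass such a stack performs (the weight shapes and random initialisation here are illustrative assumptions, not bren's internals):

```python
import numpy as np

rng = np.random.default_rng(0)

def fc_forward(x, W, b):
    # One fully connected layer: affine transform, then tanh activation.
    return np.tanh(x @ W + b)

x = np.array([[0.0, 1.0]])                     # one XOR sample, shape (1, 2)
W1, b1 = rng.normal(size=(2, 32)), np.zeros(32)   # hidden layer of 32 units
W2, b2 = rng.normal(size=(32, 1)), np.zeros(1)    # output layer of 1 unit
out = fc_forward(fc_forward(x, W1, b1), W2, b2)
print(out.shape)  # (1, 1) -- one prediction per sample
```

Training then amounts to adjusting each W and b so the outputs match the labels.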
model.assemble is then called to set the optimiser used during gradient descent, the loss function, and the metrics to be displayed during training.
```python
model.assemble(
	optimiser="Adam",
	loss=br.nn.losses.MeanSquaredError(),
	metrics=["accuracy"]
)
```
To train the model, call model.fit. Here the training features and labels are specified, as well as other arguments such as epochs (the number of iterations over the data while training the network).
```python
model.fit(X, Y, epochs=100)
```
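Conceptually, each epoch is one full pass over the data: compute predictions, measure the loss, and nudge every parameter against its gradient. The toy numpy loop below shows this for a single weight fitted by gradient descent on a mean squared error (an illustration of what fit does, not bren's code):

```python
import numpy as np

X = np.array([1.0, 2.0, 3.0])
Y = 2.0 * X                  # target relationship: y = 2x
w, lr = 0.0, 0.05            # initial weight and learning rate

for epoch in range(100):     # one epoch = one full pass over the data
    pred = w * X
    grad = np.mean(2 * (pred - Y) * X)   # d(MSE)/dw
    w -= lr * grad                       # gradient descent step

print(round(w, 3))  # ~2.0 -- the weight has converged to the target
```

In bren, the gradients in this loop come from br.autodiff instead of a hand-written derivative, and the update rule is supplied by the chosen optimiser (Adam above).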
Now that the model is trained, you can evaluate its performance on test data, and save it to a file for future use.
```python
pred = model.predict(X)  # test data would be inputted here
model.save("model")
```
Custom Components
Custom components can be produced either by subclassing the component's base class or as a plain function; both approaches are compatible with the existing components which br.nn provides.
```python
# Custom activation as a function
def linear(x): return x

# Custom activation as a class
class Linear(br.nn.activations.Activation):
	def __init__(self, name="linear", **kwargs):
		super().__init__(linear, name, **kwargs)
```
File details
Details for the file bren-0.1.5.tar.gz.
File metadata
- Download URL: bren-0.1.5.tar.gz
- Upload date:
- Size: 33.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ec350d35ebe38a221b2b86d79b58729efc7a1ba90a5f27a823406b0c12854b3c |
| MD5 | 91720ab1c217e8a0997bd5b94f0a2573 |
| BLAKE2b-256 | 7782e4c63d4299bd3d55c6adc02e578803b60668229ec88c83284224246891c7 |
File details
Details for the file bren-0.1.5-py3-none-any.whl.
File metadata
- Download URL: bren-0.1.5-py3-none-any.whl
- Upload date:
- Size: 46.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.4
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ede8f1f4f25d26415c851e8953b485a2503e65fe247c6f9d1aa904395ef6abae |
| MD5 | 0b71086b7849c57f55983a83aee7b788 |
| BLAKE2b-256 | 891990e3ac17e461e939b8a3e42330ec640bc409e8a1dd5b6cab8c58c2901d48 |