BaseNet: A simpler way to build AI models.
Basenet API Package - 1.5.4
This package implements an API over Keras and TensorFlow to build Deep Learning models easily without losing the framework's flexibility. The BaseNet API tries to let you implement almost everything in a few lines of code.
About
Author: A.Palomo-Alonso (a.palomo@uah.es)
Universidad de Alcalá.
Escuela Politécnica Superior.
Departamento de Teoría De la Señal y Comunicaciones (TDSC).
ISDEFE Chair of Research.
Features
- Feature 1: Real-time logging.
- Feature 2: Automatic and random segmentation of the database into train, validation and test subsets.
- Feature 3: Training in a real separate process (multiprocessing).
- Feature 4: Automatic and custom GPU usage.
- Feature 5: Easy-to-use classes.
- Feature 6: Model merging.
- Feature 7: Multiple model inputs.
- Feature 8: API documentation.
- Feature 9: Python packaging and PyPI indexing.
- Feature 10: Automatic GPU configuration and assignment.
Basic and fast usage
BaseNetDatabase
BaseNetDatabase is an easy-to-use database wrapper for the API. You can build your database with the BaseNetDatabase class.
Example of building a BaseNetDatabase.
from basenet_api import BaseNetDatabase
my_data_x, my_data_y = load_my_data()
print(my_data_y)
# > array([[0.], [1.], ...], dtype=float32)
print(my_data_x)
# > array([[255., 0., 255., ..., dtype=float32)
distribution = {'train': 60, 'test': 5, 'val': 35}
mydb = BaseNetDatabase(my_data_x, my_data_y,
                       distribution=distribution)
print(mydb)
# > BaseNetDatabase with 32000 instances.
mydb.save('./mydb.db')
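The snippet above assumes a load_my_data() helper. As a reference, a minimal NumPy stand-in that produces arrays with the dtypes shown above might look like this (purely illustrative, not part of the API; replace it with your own data loading):
import numpy as np

def load_my_data(n_instances: int = 32000, n_features: int = 8):
    # Random inputs in [0, 255) and binary targets, matching the dtypes printed above.
    my_data_x = (np.random.rand(n_instances, n_features) * 255.).astype('float32')
    my_data_y = np.random.randint(0, 2, size=(n_instances, 1)).astype('float32')
    return my_data_x, my_data_y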
BaseNetCompiler
BaseNetCompiler takes the model architecture and builds a BaseNetModel with the given parameters. You can build your BaseNetCompiler from Python code only or from a .yaml file.
Example of building a BaseNetCompiler from Python code only.
from basenet_api import BaseNetDatabase, BaseNetCompiler
mydb = BaseNetDatabase.load('./mydb.db')
print(mydb)
# > BaseNetDatabase with 32000 instances.
layers = [
    {'Dense': ((255,), {})},
    {'Dense': ((64,), {'activation': 'relu'})},
    {'Dropout': ((0.5,), {})}
]
my_devs = BaseNetCompiler.show_devs()
print(my_devs)
# > {'/device:CPU:0': 'Idle',
# '/device:GPU:0': 'Train'}
my_first_model = BaseNetCompiler(
    io_shape=((8,), 8),
    compile_options={'loss': 'mean_squared_error', 'optimizer': 'adam'},
    devices=my_devs,
    layers=layers,
    name='my_first_model'
).compile()
my_first_model.add_database(mydb)
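Each entry of the layers list maps a layer name to a tuple of positional arguments and a dictionary of keyword arguments; the Dense and Dropout entries above suggest these are forwarded to the Keras layer of the same name. Under that assumption (not confirmed here by the official documentation), a convolutional stack could be declared the same way:
conv_layers = [
    {'Conv2D': ((32, (3, 3)), {'activation': 'relu'})},
    {'MaxPooling2D': (((2, 2),), {})},
    {'Flatten': ((), {})},
    {'Dense': ((8,), {'activation': 'softmax'})}
]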
You can also use the BaseNetCompiler.add() method to add layers.
my_first_compiler = BaseNetCompiler(
    io_shape=((8,), 8),
    compile_options={'loss': 'mean_squared_error', 'optimizer': 'adam'},
    devices=my_devs,
    name='my_first_model'
)

for layer in layers:
    my_first_compiler.add(layer)
my_first_model = my_first_compiler.compile()
You can also load the database directly from a path.
my_first_model.add_database('./mydb.db')
Example of building a BaseNetCompiler from a .yaml file.
Suppose you have a .yaml file with the proper format in the ./my_model.yaml location. You can then load your compiler with the method BaseNetCompiler.build_from_yaml(yaml_path) and skip loading the parameters into the compiler manually.
from basenet_api import BaseNetDatabase, BaseNetCompiler
mydb = BaseNetDatabase.load('./mydb.db')
print(mydb)
# > BaseNetDatabase with 32000 instances.
yaml_path = './my_model.yaml'
my_first_model = BaseNetCompiler.build_from_yaml(yaml_path).compile()
my_first_model.add_database(mydb)
An example of a .yaml file that replicates the model of the section Example of building a BaseNetCompiler from Python code only would be the following:
compiler:
  name: "my_first_model"
  input_shape:
    - 8
  output_shape: 8
  compile_options:
    loss: "mean_squared_error"
    optimizer: "adam"
  devices:
    - cpu:
        name: "/device:CPU:0"
        state: "Idle"
    - gpu:
        name: "/device:GPU:0"
        state: "Train"
  layers:
    - layer:
        name: "Dense"
        shape:
          - 255
        options:
    - layer:
        name: "Dense"
        shape:
          - 64
        options:
          - option:
              name: "activation"
              value: "relu"
    - layer:
        name: "Dropout"
        shape:
          - 0.5
        options:
If you want to learn more about building a model from a .yaml file, please check the API documentation.
Example of usage of the BaseNetModel.
Once you build and compile a BaseNetModel with the BaseNetCompiler.compile() method, you can make use of all the methods that the BaseNetModel provides:
- BaseNetModel.load(): loads a tf.keras.model and the compiler from the given path.
- BaseNetModel.save(): saves the tf.keras.model and the compiler into the given path.
- BaseNetModel.print(): renders a .png image of the model into the given path.
- BaseNetModel.add_database(): the BaseNetModel keeps a list of all the previously loaded databases. This method adds a database, either from a path or from a BaseNetDatabase object.
- BaseNetModel.predict(): performs a prediction given an input (see the sketch after this list).
- BaseNetModel.evaluate(): evaluates the model with the test subset of the selected database (see the sketch after this list).
- BaseNetModel.fit(): trains the model with the selected database.
- BaseNetModel.call(): merges two models into one; the model can be called as a function.
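As a hedged sketch only, evaluate() and predict() could be called as shown below, assuming the same database-index convention that fit() uses and that predict() accepts raw input arrays (both conventions are assumptions; check the API documentation for the exact signatures):
# Evaluate on the test subset of the database with index 0 (index convention assumed from fit()).
test_metrics = my_first_model.evaluate(0)

# Predict from raw inputs (input format assumed to match the training data).
predictions = my_first_model.predict(my_data_x)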
Printing and fitting a model.
from basenet_api import BaseNetDatabase, BaseNetCompiler
mydb = BaseNetDatabase.load('./mydb.db')
my_first_model = BaseNetCompiler.build_from_yaml('./my_model.yaml').compile()
my_first_model.add_database(mydb)
# Select database with index 0.
my_first_model.fit(0, epochs=6, tensorboard=False)
# > Tensorflow fitting info vomiting.
# Print the model.
my_first_model.print('./my_model.png')
Fitting a model in another process.
Important: debugging does not work properly when fitting in a new process.
Imagine you are working on a GUI. If the training process of your model runs in the same process as your GUI, it will block the parent process. The API implements a solution: just set avoid_lock=True in the BaseNetModel.fit() method and check the results whenever you want.
from basenet_api import BaseNetDatabase, BaseNetCompiler
mydb = BaseNetDatabase.load('./mydb.db')
my_first_model = BaseNetCompiler.build_from_yaml('./my_model.yaml').compile()
my_first_model.add_database(mydb)
# Select database with index 0.
my_results = my_first_model.fit(0, epochs=6, tensorboard=False, avoid_lock=True)
while my_results.is_training:
    do_my_main_activity(update_gui, collect_data, run_server, or_whatever)
    current_loss_curve = my_results.get()
    # my_first_model.recover()  # Only needed in versions < 1.5.0.
keep_doing_my_main_activity(update_gui, collect_data, run_server, or_whatever)
Outdated (versions < 1.5.0): note that if you do not make use of the method BaseNetModel.recover(), the model will be empty, because the trained model stays in the child process until the parent process is able to recover it.
From 1.5.0 onwards: the model recovers itself; there is no need (or way) to recover it manually.
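On a related note, the changelog for version 1.4.0 mentions a BaseNetModel.fit_stop() method with a force-stop callback. A hedged sketch of how it might be combined with avoid_lock=True, assuming a no-argument call (the exact signature is not documented here):
# Hypothetical flag set by your application, e.g. a GUI "Stop" button.
user_requested_stop = True

if user_requested_stop:
    # Force-stop the background fitting process (method name taken from the 1.4.0 changelog;
    # the no-argument call is an assumption).
    my_first_model.fit_stop()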
Using TensorBoard.
The API also implements automatic TensorBoard initialization and opening. You can watch the training process and the Keras output in real time while the model trains.
my_first_model.fit(0, epochs=6, tensorboard=True)
Merging two models into one with several inputs.
You can merge two BaseNetModels by calling the object as a function:
from basenet_api import BaseNetDatabase, BaseNetCompiler
mydb = BaseNetDatabase.load('./mydb.db')
my_first_model = BaseNetCompiler.build_from_yaml('./my_model.yaml', verbose=True).compile()
my_second_model = BaseNetCompiler.build_from_yaml('./my_model_2.yaml', verbose=True).compile()
my_first_model.add_database(mydb)
my_first_model(my_second_model, parallel=True, name='merged_model')
my_first_model.print('./')
If parallel=True, the two models are merged into a single model with two outputs; otherwise, the second model is appended at the bottom of the first one.
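For completeness, a sequential merge uses the same call with parallel=False (a sketch based on the description above; the resulting topology appends the second model at the bottom):
# Append my_second_model at the bottom of my_first_model instead of merging in parallel.
my_first_model(my_second_model, parallel=False, name='stacked_model')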
Obtaining training results from the fitting process.
Once you train the model, you get a BaseNetResults object with the training results. You can obtain the values as follows:
my_results = my_first_model.fit(0, epochs=6)
losses = my_results.get()
print(losses)
# > {'loss': [1., 0.7, 0.6, 0.5, 0.4, 0.3],
# 'val_loss': [1., 0.8, 0.7, 0.6, 0.5, 0.4]}
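Since get() returns plain per-epoch lists, you can plot the curves with any tool you like; for example, with matplotlib (not a dependency of this API, shown here only as an illustration):
import matplotlib.pyplot as plt

# Plot the training and validation loss curves returned by BaseNetResults.get().
plt.plot(losses['loss'], label='loss')
plt.plot(losses['val_loss'], label='val_loss')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()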
What's new?
< 0.1.0
- BaseNetModel included.
- BaseNetDatabase included.
- BaseNetCompiler included.
- Inheritance from CorNetAPI project.
- Multi-processing fitting.
- Tensorboard launching.
0.2.0
- BaseNetResults included (working).
- Now the model is callable.
- Switched print to logging.
- Project documentation.
1.0.0 - 1.0.3
- Python packaging.
- 1.0.x: fixes for upload bugs.
1.1.0
- Functional package.
- PyPi indexing.
1.2.0
- Loss results included in the BaseNetResults while multiprocessing.
- GPU auto set-up to avoid TensorFlow memory errors.
- The method BaseNetCompiler.set_up_devices() configures the GPUs according to the free RAM to be used in the API.
1.3.0
- Included WindowDiff in the project scope.
1.4.0
- Solved Python packaging problems.
- Included a force-stop callback in the BaseNetModel.fit_stop() method.
1.5.0
- BaseNetDatabase now has the attributes BaseNetDatabase.size and BaseNetDatabase.distribution.
- Solved forced-stopping bugs with multiprocessing in the BaseNetModel.fit_stop() method.
- The private method BaseNetModel._threshold() now takes a set of outputs instead of only one. This was only for optimization.
- Solved wrong BaseNetModel.recover().
- Auto-recover implemented: BaseNetModel.recover() is now a private method, BaseNetModel._recover(). The user no longer needs to recover it: "the model recovers by itself" (-- Hans Niemann, 2022). Note: recovering is necessary when the model is early-stopped; consider always recovering the model.
1.5.1 - 1.5.3
- Solved a bug where BaseNetDatabase modified the incoming list of instances of the database, which prevented checkpoints for large database generators.
- Added an exception handler for the nvml library when the NVIDIA drivers are not installed in the machine.
1.5.4
- Added some BaseNetDatabase utils: merging and splitting databases.
- Added a BaseNetDatabase equality check.
- Added BaseNetDatabase._reversion(), BaseNetCompiler._reversion() and BaseNetModel.__version__, which rebuild the classes to the current version of the API.
Cite as
Please cite this library as:
@misc{basenetapi,
    title={CorNet: Correlation clustering solving methods based on Deep Learning Models},
    author={A. Palomo-Alonso},
    booktitle={PhD in TIC: Machine Learning and NLP.},
    year={2022}
}