
This is a high-level API for data visualization. It provides static and interactive visualizations.

Project description

OneFlow

A package providing an implementation of an artificial neural network (ANN) with training callbacks.

🔗 Project Link

Check out the PyPI package here

Run Locally

Create two files in your working directory:

  • config.yaml
  • training.py

config.yaml

params:
  epochs: 3
  batch_size: 32
  num_classes: 10
  input_shape: [28, 28]
  loss_function: sparse_categorical_crossentropy
  metrics: accuracy
  optimizer: SGD
  validation_datasize: 5000
  es_patience: 5

artifacts:
  artifacts_dir: artifacts
  model_dir: model
  plots_dir: plots
  model_name: model.h5
  plot_name: results_plot.png
  model_ckpt_dir: ModelCheckpoints
  callbacked_model_name: model_ckpt.h5

logs:
  logs_dir: logs_dir
  general_logs: general_logs
  tensorboard_root_log_dir: tensorboard_logs
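As a quick sanity check, the config above can be parsed with PyYAML (presumably what read_config wraps internally; that is an assumption, and this standalone sketch does not depend on OneFlow):

```python
import yaml  # pip install pyyaml

# A trimmed copy of the params section from config.yaml above.
CONFIG_TEXT = """
params:
  epochs: 3
  batch_size: 32
  validation_datasize: 5000
  es_patience: 5
"""

def read_config_sketch(text):
    """Parse a YAML config string into a nested dict (illustrative stand-in
    for OneFlow's read_config, which reads from a file path)."""
    return yaml.safe_load(text)

config = read_config_sketch(CONFIG_TEXT)
print(config["params"]["epochs"])               # 3
print(config["params"]["validation_datasize"])  # 5000
```

The nested-dict access pattern (`config["params"]["validation_datasize"]`) matches how training.py below consumes the parsed config.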

training.py

from OneFlow.utils.common import read_config
from OneFlow.utils.data_mgmt import get_data
from OneFlow.utils.model import StepFlow
import argparse

def training(config_path):
    config = read_config(config_path)
    validation_datasize = config["params"]["validation_datasize"]
    # get_data loads the MNIST dataset; to run custom training, bring your
    # own data and split it into train/validation/test sets instead.
    (X_train, y_train), (X_valid, y_valid), (X_test, y_test) = get_data(validation_datasize)
    sp = StepFlow(config, X_train, y_train, X_valid, y_valid)
    sp.create_model()
    sp.fit_model()
    sp.save_final_model()
    sp.save_plot()

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("-c", "--config", default="config.yaml")

    parsed_args = parser.parse_args()
    training(config_path=parsed_args.config)
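get_data is OneFlow's helper for loading MNIST. If you bring your own data, an equivalent split (carving validation_datasize examples off the training set) might look like the sketch below; the function name and the dummy arrays are illustrative, and only the split size comes from the config above.

```python
import numpy as np

def split_data(X, y, validation_datasize):
    """Hold out the first `validation_datasize` examples for validation,
    mirroring the (X_train, y_train), (X_valid, y_valid) pairs used above."""
    X_valid, X_train = X[:validation_datasize], X[validation_datasize:]
    y_valid, y_train = y[:validation_datasize], y[validation_datasize:]
    return (X_train, y_train), (X_valid, y_valid)

# Dummy data shaped like MNIST: 60000 grayscale 28x28 images.
X = np.zeros((60000, 28, 28), dtype=np.float32)
y = np.zeros(60000, dtype=np.int64)

(X_train, y_train), (X_valid, y_valid) = split_data(X, y, 5000)
print(X_train.shape, X_valid.shape)  # (55000, 28, 28) (5000, 28, 28)
```

Keep a separate test set aside before this split so final evaluation is done on data the callbacks never saw.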

Then run the following commands in the terminal:

pip install OneFlow-Hassi34
python training.py

When training completes, run the following command in the terminal and inspect the metrics in TensorBoard:

tensorboard --logdir=logs_dir/tensorboard_logs/
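The es_patience setting in config.yaml controls early stopping (in Keras terms, the EarlyStopping callback's patience). The core idea, sketched in plain Python without any framework dependency, is:

```python
def should_stop(val_losses, patience):
    """Return True once the validation loss has failed to improve
    for `patience` consecutive epochs after its best value."""
    if len(val_losses) <= patience:
        return False
    best_epoch = val_losses.index(min(val_losses))
    return len(val_losses) - 1 - best_epoch >= patience

# With es_patience = 5, training stops after 5 epochs with no improvement:
losses = [0.9, 0.7, 0.6, 0.61, 0.62, 0.63, 0.64, 0.65]
print(should_stop(losses, 5))  # True: best epoch was index 2, five epochs ago
```

This is why the config also saves a checkpointed model (callbacked_model_name): the weights from the best epoch, not the last one, are what you typically want to keep.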

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

vizpool-0.0.1.tar.gz (14.7 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

vizpool-0.0.1-py3-none-any.whl (14.4 kB)

Uploaded Python 3

File details

Details for the file vizpool-0.0.1.tar.gz.

File metadata

  • Download URL: vizpool-0.0.1.tar.gz
  • Upload date:
  • Size: 14.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.9.13

File hashes

Hashes for vizpool-0.0.1.tar.gz
Algorithm Hash digest
SHA256 e39371bef684ebec4069bf6e5a09424544a6d433ef2d39630ca51ed7dde107ec
MD5 7a93b3e7c20f271990c531d98f3cdc49
BLAKE2b-256 2b981a988d84245a620cd8b24d710a4d510c5b498b13f83bc009491f07dbbad6

See more details on using hashes here.
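To verify a downloaded file against the hashes listed here, you can compute its SHA-256 digest with Python's standard hashlib. The sketch below uses an in-memory example; for a real check, substitute the bytes of the downloaded archive and compare against the SHA256 value above.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of the given bytes."""
    return hashlib.sha256(data).hexdigest()

# For a real download:
#   sha256_of(open("vizpool-0.0.1.tar.gz", "rb").read())
digest = sha256_of(b"hello")
print(digest)  # 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```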

File details

Details for the file vizpool-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: vizpool-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 14.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.9.13

File hashes

Hashes for vizpool-0.0.1-py3-none-any.whl
Algorithm Hash digest
SHA256 f9ad1e4ed1d9862e63240a0af31fb62f9640974d9319f654be27892a8d293cef
MD5 bfdafe8f85bb4ec6d46a90621ac59b0f
BLAKE2b-256 32fc8c0cd325d212366404536c7e50bfc68fc776c0b4c7ea6829faff88d11734

See more details on using hashes here.
