This package is built on pytorch and automates the standard steps of training a neural network.
simpleTorch
This repo aims to make training neural networks with pytorch simple. The data are normalized inside the class, and validation is performed automatically. The trained model takes unscaled inputs and returns unscaled outputs (the scaling happens internally), so the user never has to interact with the scaling. Alternatively, the user can opt out of the default scaling and scale the data before training.
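For users who opt out of the built-in scaling and prepare the data themselves, the usual choice is column-wise standardization. This is a minimal numpy sketch of that preprocessing step (it illustrates the general technique, not simpleTorch's exact internals):

```python
import numpy as np

def standardize(X):
    """Scale each column to zero mean and unit variance."""
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    std = np.where(std == 0, 1.0, std)  # guard against constant columns
    return (X - mean) / std, mean, std

X = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0]])
X_scaled, mean, std = standardize(X)
# each column of X_scaled now has mean 0 and standard deviation 1
```

Keeping `mean` and `std` around is important: the same statistics must be applied to any new inputs at prediction time, which is exactly the bookkeeping the default in-class scaling handles for you.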
Installation
Use the package manager pip to install simpleTorch.
pip install simpleTorch
Usage
A more detailed example is available in example_of_usage.ipynb.
# X: numpy array of inputs
# F: numpy array of labels
# model: the neural net written in pytorch
from simpleTorch import train_ann  # import the function itself, not just the module

train_ann(model, X, F, plot=True)
Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
Please make sure to update tests as appropriate.
License
Hashes for simpleTorch-0.0.3-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 647bf963acdee67b58e52b8504ef812722d44c42ce68f1577a8849481ed0c6e8 |
| MD5 | a406bb82922211c6e797fe94ac9c1e92 |
| BLAKE2b-256 | d51ee42fcaedb7670a4619ad92c908531a49aaf2459c601bfa52de8ecb463f7a |