Batching
Batching is a set of tools to format data for training sequence models.
Installation
$ pip install batching
Example usage
An example script is provided in sample.py.
# Metadata for batch info - including batch IDs and mappings to storage resources like filenames
storage_meta = StorageMeta(validation_split=0.2)

# Storage for batch data - Memory, Files, S3
storage = BatchStorageMemory(storage_meta)

# Create batches - configuration contains feature names, windowing config, timeseries spacing
batch_generator = Builder(storage,
                          feature_set,
                          look_back,
                          look_forward,
                          batch_seconds,
                          batch_size=128)
batch_generator.generate_and_save_batches(list_of_dataframes)

# Generators for feeding batches to training - tf.keras.Model.fit_generator
train_generator = BatchGenerator(storage)
validation_generator = BatchGenerator(storage, is_validation=True)

model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(1, activation='sigmoid'))
model.compile(loss=tf.keras.losses.binary_crossentropy,
              optimizer=tf.keras.optimizers.Adam(),
              metrics=['accuracy'])

model.fit_generator(train_generator,
                    validation_data=validation_generator,
                    epochs=epochs)
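
The snippet above leaves several inputs undefined (feature_set, look_back, look_forward, batch_seconds, epochs, list_of_dataframes). A minimal sketch of how they might be populated is shown below; the column names, window sizes, time spacing, and the "y" label column are illustrative assumptions, not values taken from the library's documentation.

import numpy as np
import pandas as pd

# Hypothetical feature columns and windowing parameters (illustrative values only)
feature_set = ["temperature", "pressure"]   # feature column names present in each DataFrame
look_back = 10                              # timesteps of history in each window
look_forward = 5                            # timesteps of future context in each window
batch_seconds = 1                           # assumed spacing between consecutive rows, in seconds
epochs = 10

# Each input DataFrame is assumed to carry a time column, the feature columns, and a binary label "y"
def make_dataframe(n_rows=1000):
    return pd.DataFrame({
        "time": pd.date_range("2019-01-01", periods=n_rows, freq="1S"),
        "temperature": np.random.normal(size=n_rows),
        "pressure": np.random.normal(size=n_rows),
        "y": np.random.randint(0, 2, size=n_rows),
    })

list_of_dataframes = [make_dataframe() for _ in range(3)]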
License
- MIT license
- Copyright 2015 © FVCproductions.
File details
Details for the file batching-1.0.6.tar.gz.
File metadata
- Download URL: batching-1.0.6.tar.gz
- Upload date:
- Size: 8.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/42.0.1 requests-toolbelt/0.9.1 tqdm/4.39.0 CPython/3.7.1
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 037497ad0f89c0a83a8c8e31b2544cd2369a668e3d3ec15c352f61812458f8f5 |
| MD5 | 0aff794d63ccdf29d0ffb8c511c5ad2d |
| BLAKE2b-256 | 985f980241c1bfdcf98315c7a31f10e24ac9fc5f5232839f31587dfcf42066b4 |
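
To verify a downloaded archive against the digests above, the SHA256 hash can be recomputed locally. The sketch below assumes batching-1.0.6.tar.gz has already been downloaded into the working directory.

import hashlib

# Expected SHA256 digest from the table above
EXPECTED_SHA256 = "037497ad0f89c0a83a8c8e31b2544cd2369a668e3d3ec15c352f61812458f8f5"

# Recompute the digest of the local sdist and compare against the published value
with open("batching-1.0.6.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED_SHA256, "SHA256 mismatch: file does not match the published digest"
print("SHA256 verified:", digest)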