A helper package for hdf5 data handling
lose
lose, and in particular lose.LOSE(), is a helper class for handling data in the hdf5 file format using PyTables.
>>> from lose import LOSE
>>> l = LOSE()
>>> l
<lose hdf5 data handler, fname=None, atom=Float32Atom(shape=(), dflt=0.0)>
generator parameters: iterItems=None, iterOutput=None, batch_size=1, limit=None, loopforever=False, shuffle=False
installation
pip3 install -U lose
or
pip install -U lose
structure
vars
LOSE.fname is the path to the .h5 file, including the name and extension; default is None.
LOSE.atom is the dtype the data is stored as; recommended to be left at the default, tables.Float32Atom(), which results in arrays with dtype == np.float32.
LOSE.batch_obj specifies the amount of data to be loaded by LOSE.load(); it works like Python list slicing and must be a string. Recommended to be left at the default, '[:]', which loads everything.
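The batch_obj string mirrors ordinary Python list slicing; a quick sketch of what different slice strings would select (using a plain list as a stand-in, not LOSE itself):

```python
rows = list(range(10))  # stand-in for a stored group

print(rows[:])      # what the default '[:]' selects: everything
print(rows[:5])     # what '[:5]' would select: the first 5 rows, [0, 1, 2, 3, 4]
print(rows[2:8:2])  # what '[2:8:2]' would select: a strided slice, [2, 4, 6]
```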
vars related to LOSE.generator():
LOSE.batch_size is the batch size of data pulled from the .h5 file; default is 1.
LOSE.limit limits the amount of data loaded by the generator; default is None, in which case all available data is loaded.
LOSE.loopforever is a bool that allows infinite looping over the data; default is False.
LOSE.iterItems is a list of X group names and a list of Y group names; default is None. It has to be user-defined for LOSE.generator() to work.
LOSE.iterOutput is a list of X output names and a list of Y output names; default is None. It has to be user-defined for LOSE.generator() to work.
LOSE.shuffle is a bool that enables shuffling of the data; default is False. Shuffling is affected by LOSE.limit and LOSE.batch_size.
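Together, limit, batch_size, and loopforever determine how many batches one pass over the data yields; a hedged sketch of the arithmetic (plain Python, not LOSE code):

```python
import math

limit = 10000    # samples the generator is allowed to yield
batch_size = 20  # samples per yielded batch

# with loopforever=False, one pass over the data is exhausted
# after ceil(limit / batch_size) batches
steps_per_epoch = math.ceil(limit / batch_size)
print(steps_per_epoch)  # 500
```

With loopforever=True the generator wraps around instead of stopping, so steps_per_epoch can be chosen freely.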
methods
Help on LOSE in module lose.dataHandler object:
class LOSE(builtins.object)
| Methods defined here:
|
| __init__(self)
| Initialize self. See help(type(self)) for accurate signature.
|
| __repr__(self)
| Return repr(self).
|
| __str__(self)
| Return str(self).
|
| generator(self)
|
| getShape(self, arrName)
|
| load(self, *args)
|
| makeGenerator(self, layerNames, limit=None, batch_size=1, shuffle=False, **kwards)
|
| newGroup(self, fmode='a', **kwards)
|
| removeGroup(self, *args)
|
| save(self, **kwards)
|
| ----------------------------------------------------------------------
LOSE.newGroup(fmode='a', **groupNames) is used to append or write group(s) to a .h5 file (which one depends on the fmode keyword argument; default is 'a').
LOSE.removeGroup(*groupNames) is used to remove group(s) from a file, given the group name(s).
LOSE.save(**groupNamesAndShapes) is used to save data (in append mode only) to group(s) in a .h5 file. The data needs to have the same shape as group.shape[1:] of the group it is passed to; LOSE.getShape(groupName) can be used to get the group.shape.
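That shape rule can be checked without touching a file; a minimal sketch with made-up shapes:

```python
# group created by newGroup(x=(0, 20)): 0 rows of length 20
group_shape = (0, 20)
# a batch of 2 samples to append with save()
batch_shape = (2, 20)

# save() appends along axis 0, so only the trailing dims must match
assert batch_shape[1:] == group_shape[1:]
print("shapes compatible")
```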
LOSE.load(*groupNames) is used to load data (the whole group or a slice; to load a slice, change LOSE.batch_obj to a string with the desired slice, default is '[:]') from group(s); each group has to be present in the .h5 file.
LOSE.getShape(groupName) is used to get the shape of a single group; the group has to be present in the .h5 file.
LOSE.generator(): see the LOSE.generator() details section; LOSE.iterItems and LOSE.iterOutput have to be defined.
LOSE.makeGenerator(layerNames, limit=None, batch_size=1, shuffle=False, **data): again, see the LOSE.generator() details section for more details.
example usage
creating/adding new group(s) to a file
import numpy as np
from lose import LOSE
l = LOSE()
l.fname = 'path/to/your/save/file.h5' # path to the .h5 file, has to be user defined before any methods can be used, default is None
exampleDataX = np.arange(20, dtype=np.float32)
exampleDataY = np.arange(3, dtype=np.float32)
l.newGroup(fmode='w', x=(0, *exampleDataX.shape), y=(0, *exampleDataY.shape)) # creating new groups (ready for data to be saved to) in a file; if fmode is 'w' all groups in the file will be overwritten
saving data to a group(s)
import numpy as np
from lose import LOSE
l = LOSE()
l.fname = 'path/to/your/save/file.h5' # path to the .h5 file, has to be user defined before any methods can be used, default is None
exampleDataX = np.arange(20, dtype=np.float32)
exampleDataY = np.arange(3, dtype=np.float32)
l.save(x=[exampleDataX, exampleDataX], y=[exampleDataY, exampleDataY]) # saving data into groups defined in the previous example
l.save(y=[exampleDataY], x=[exampleDataX]) # the same thing
loading data from a group(s) within a file
import numpy as np
from lose import LOSE
l = LOSE()
l.fname = 'path/to/your/save/file.h5' # path to the .h5 file, has to be user defined before any methods can be used, default is None
x, y = l.load('x', 'y') # loading data from the .h5 file (has to be an existing file) populated by previous examples
y2compare, x2compare = l.load('y', 'x') # the same thing
print (np.all(x == x2compare), np.all(y == y2compare)) # True True
getting the shape of a group
import numpy as np
from lose import LOSE
l = LOSE()
l.fname = 'path/to/your/save/file.h5' # path to the .h5 file (populated by previous examples), has to be user defined before any methods can be used, default is None
print (l.getShape('x')) # (3, 20)
print (l.getShape('y')) # (3, 3)
removing group(s) from a file
from lose import LOSE
l = LOSE()
l.fname = 'path/to/your/save/file.h5' # path to the .h5 file (populated by previous examples), has to be user defined before any methods can be used, default is None
l.removeGroup('x', 'y') # removing the group(s)
x = l.load('x') # now this will result in an error because group 'x' was removed from the file
LOSE.generator() details
LOSE.generator() is a Python generator used to access data from a hdf5 file in LOSE.batch_size pieces without loading the whole file/group into memory. It also works with tf.keras.Model.fit_generator(), and it has to be used within a with context statement (see examples below). LOSE.iterItems and LOSE.iterOutput have to be defined by the user first.
LOSE.makeGenerator(layerNames, limit=None, batch_size=1, shuffle=False, **data) follows the same rules as LOSE.generator(); however, the data needs to be passed to it each time it is initialized and is only stored temporarily, and the parameters are passed on initialization. layerNames acts like LOSE.iterOutput and LOSE.iterItems, but every name in it has to match a name of the data passed (see examples below). If a file temp.h5 exists, it will be overwritten and then deleted.
example LOSE.generator() usage
For this example, let's say the file has the requested data in it and the model's input/output layer names are present.
import numpy as np
from lose import LOSE
l = LOSE()
l.fname = 'path/to/your/save/file.h5' # path to data
l.iterItems = [['x1', 'x2'], ['y']] # names of X and Y groups, all group names need to have batch dim the same and be present in the .h5 file
l.iterOutput = [['input_1', 'input_2'], ['dense_5']] # names of model's layers the data will be cast on, group.shape[1:] needs to match the layer's input shape
l.loopforever = True
l.batch_size = 20 # some batch size, can be bigger than the dataset, but won't output more data; it will just loop over or stop the iteration if LOSE.loopforever is False
l.limit = 10000 # let's say the file has more data, but you only want to train on the first 10000 samples
l.shuffle = True # enable data shuffling for the generator, costs memory and time
with l.generator() as gen:
    some_model.fit_generator(gen(), steps_per_epoch=50, epochs=1000, shuffle=False) # model.fit_generator() still can't shuffle the data, but LOSE.generator() can
example LOSE.makeGenerator(layerNames, limit=None, batch_size=1, shuffle=False, **data) usage
For this example, let's say the model's input/output layer names are present and the shapes match the data.
import numpy as np
from lose import LOSE
l = LOSE()
num_samples = 1000
x1 = np.zeros((num_samples, 200)) # example data for the model, x1.shape[1:] == model.get_layer('input_1').output_shape[1:]
x2 = np.zeros((num_samples, 150)) # example data for the model, x2.shape[1:] == model.get_layer('input_2').output_shape[1:]
y = np.zeros((num_samples, 800)) # example data for the model, y.shape[1:] == model.get_layer('dense_5').output_shape[1:]
with l.makeGenerator([['input_1', 'input_2'], ['dense_5']], batch_size=10, shuffle=True, input_2=x2, input_1=x1, dense_5=y) as gen:
    del x1 # remove from memory
    del x2 # remove from memory
    del y # remove from memory
    some_model.fit_generator(gen(), steps_per_epoch=100, epochs=10000, shuffle=False) # again, data can't be shuffled by model.fit_generator(); shuffling should be done by the generator
bugs/problems/issues
report them.