
Lazily loading mixed sequences using Keras Sequence, focused on multi-task models.

How do I install this package?

As usual, just install it using pip:

pip install keras_mixed_sequence

Tests Coverage

Since different coverage tools sometimes report slightly different results, here are three of them:

Coveralls Coverage, SonarCloud Coverage, Code Climate Coverage

Usage examples

Example for traditional single-task models

First of all, let's create a simple single-task model:

from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

model = Sequential([
    Dense(1, activation="relu")
])
model.compile(
    optimizer="nadam",
    loss="relu"
)

Then we proceed to load or otherwise create the training data. Custom Sequence objects designed to be used alongside this library will be listed here in the future.

X = either_a_numpy_array_or_sequence_for_input
y = either_a_numpy_array_or_sequence_for_output
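For instance, purely as an illustration, X and y could be random NumPy arrays (the shapes and batch size below are arbitrary and only serve this example):

import numpy as np

# Illustrative toy data: 1000 samples with 10 features each
# and a single scalar target per sample.
X = np.random.uniform(size=(1000, 10))
y = np.random.uniform(size=(1000, 1))
batch_size = 32  # any reasonable batch size works here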

Now we combine the training data using the MixedSequence object.

from keras_mixed_sequence import MixedSequence

sequence = MixedSequence(
    X, y,
    batch_size=batch_size
)
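
Since MixedSequence implements the Keras Sequence interface, it can also be indexed one batch at a time, which is handy as a sanity check. A minimal sketch, assuming the standard Keras (inputs, outputs) batch convention and the toy data above:

# Each index should yield one batch as an (inputs, outputs) pair.
x_batch, y_batch = sequence[0]
print(x_batch.shape, y_batch.shape)  # e.g. (32, 10) and (32, 1)
print(len(sequence))                 # number of batches per epoch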

Finally, we can train the model:

from multiprocessing import cpu_count

model.fit_generator(
    sequence,
    steps_per_epoch=sequence.steps_per_epoch,
    epochs=2,
    verbose=0,
    use_multiprocessing=True,
    workers=cpu_count(),
    shuffle=True
)
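
Note that fit_generator is deprecated in recent TensorFlow 2.x releases, where model.fit accepts Sequence objects directly. An equivalent call would look roughly as follows (a sketch, not taken from the library's documentation):

# On TensorFlow 2.x versions that deprecate fit_generator,
# model.fit accepts Keras Sequence objects directly.
model.fit(
    sequence,
    steps_per_epoch=sequence.steps_per_epoch,
    epochs=2,
    verbose=0,
    use_multiprocessing=True,
    workers=cpu_count(),
    shuffle=True
)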

Example for multi-task models

First of all, let's create a simple multi-task model:

from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, Input

inputs = Input(shape=(10,))

output1 = Dense(
    units=10,
    activation="relu",
    name="output1"
)(inputs)
output2 = Dense(
    units=10,
    activation="relu",
    name="output2"
)(inputs)

model = Model(
    inputs=inputs,
    outputs=[output1, output2],
    name="my_model"
)

model.compile(
    optimizer="nadam",
    loss="MSE"
)

Then we proceed to load or otherwise create the training data. As above, custom Sequence objects designed to be used alongside this library will be listed here in the future.

X = either_a_numpy_array_or_sequence_for_input
y1 = either_a_numpy_array_or_sequence_for_output1
y2 = either_a_numpy_array_or_sequence_for_output2
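Again, purely as an illustration, these could be random NumPy arrays whose shapes match the model defined above:

import numpy as np

# Illustrative toy data matching the model above: 10 input features
# and two 10-dimensional output heads.
X = np.random.uniform(size=(1000, 10))
y1 = np.random.uniform(size=(1000, 10))
y2 = np.random.uniform(size=(1000, 10))
batch_size = 32  # any reasonable batch size works here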

Now we combine the training data using the MixedSequence object.

from keras_mixed_sequence import MixedSequence

sequence = MixedSequence(
    x=X,
    y={
        "output1": y1,
        "output2": y2
    },
    batch_size=batch_size
)
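
As in the single-task example, the sequence can be indexed one batch at a time; with a dictionary of outputs, each batch should pair the input array with a dictionary keyed by the output layer names. A hedged sketch, assuming the toy data above:

# Each batch should pair the inputs with a dictionary of outputs,
# keyed by the same names used when building the MixedSequence.
x_batch, y_batch = sequence[0]
print(x_batch.shape)             # e.g. (32, 10)
print(y_batch["output1"].shape)  # e.g. (32, 10)
print(y_batch["output2"].shape)  # e.g. (32, 10)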

Finally, we can train the model:

from multiprocessing import cpu_count

model.fit_generator(
    sequence,
    steps_per_epoch=sequence.steps_per_epoch,
    epochs=2,
    verbose=0,
    use_multiprocessing=True,
    workers=cpu_count(),
    shuffle=True
)
