A library to automate developing Keras models

Project description

PyPI Library: dl-assistant Documentation

Table of Contents

  • Installation
  • Usage - Image classification
  • Examples

It can be easily installed by typing the following in the terminal:

pip install dl-assistant
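
A quick way to confirm the install worked is to import the class used throughout the usage examples below (this assumes Keras/TensorFlow is already available in the environment):

            from dl_assistant.image_cnn import classification

            x = classification()
            print(x)  # prints a classification instance if the package imported correctly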

Usage - Image classification

  1. create_dataframe - This function takes a directory containing training or test data and converts it into a pandas DataFrame, the first step of the deep-learning workflow.

    Simple Usage

                            from dl_assistant.image_cnn import classification
                            x=classification()
                            df=x.create_dataframe('train') # DataFrame with image paths in one column and labels in another
                            #Continue on your own
    
  2. prep_x_train - This function takes the image paths and the height and width to resize to, and returns an array of training features. It divides the pixel values by 255.0 itself.

    Simple Usage

                            from dl_assistant.image_cnn import classification
                            x=classification()
                            df=x.create_dataframe('train') # DataFrame with image paths in one column and labels in another
                            x_train = x.prep_x_train(df['Images'],100,100) # features resized to 100x100 and divided by 255.0
    
  3. prep_y_train - This function takes the labels and the number of classes as input and converts the labels into a binary class matrix (one-hot encoding).

    Simple Usage

                            from dl_assistant.image_cnn import classification
                            x=classification()
                            df=x.create_dataframe('train') # DataFrame with image paths in one column and labels in another
                            x_train = x.prep_x_train(df['Images'],100,100)
                            y_train = x.prep_y_train(df['Labels'],7) # one-hot encode 7 classes
    
  4. make_model - This function takes the number of units in the first Conv2D layer, the number of classes, and the input shape, and returns a basic, untrained model.

    Simple Usage

                            from dl_assistant.image_cnn import classification
                            x=classification()
                            df=x.create_dataframe('train') # DataFrame with image paths in one column and labels in another
                            x_train = x.prep_x_train(df['Images'],100,100)
                            y_train = x.prep_y_train(df['Labels'],7)
                            shape=x_train[0].shape
                            model = x.make_model(128,7,shape)
    
  5. expert_make_model - This function is intended for expert users who want finer control over the layer stack.

    • Syntax: expert_make_model(layers, unit, num_classes, input_shape, dr)

      • layers = a list of strings, each entry one of 'Conv2D', 'MaxPooling2D', or 'Dropout'
      • unit = an integer, the number of units of the first Conv2D layer
      • num_classes = the number of labels
      • input_shape = the input shape, i.e. x_train[0].shape
      • dr = the dropout rate

    Simple Usage

                            from dl_assistant.image_cnn import classification
                            x=classification()
                            df=x.create_dataframe('train') # DataFrame with image paths in one column and labels in another
                            x_train = x.prep_x_train(df['Images'],100,100)
                            y_train = x.prep_y_train(df['Labels'],7)
                            shape=x_train[0].shape
                            model = x.expert_make_model(['Conv2D','Conv2D','MaxPooling2D','Dropout'],128,7,shape,0.5)
    
  6. train_model - This function takes the model to train, x_train, y_train, the batch size and the number of epochs, and returns a trained model. (A rough sketch of the plain pandas/Keras steps these helpers correspond to follows this list.)

    Simple Usage

                            from dl_assistant.image_cnn import classification
                            x=classification()
                            df=x.create_dataframe('train') # DataFrame with image paths in one column and labels in another
                            x_train = x.prep_x_train(df['Images'],100,100)
                            y_train = x.prep_y_train(df['Labels'],7)
                            shape=x_train[0].shape
                            model = x.make_model(128,7,shape)
                            batch_size=32
                            epochs=70
                            model = x.train_model(model,x_train,y_train,batch_size,epochs)
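
For orientation, the sketch below shows roughly what the six helpers above correspond to in plain pandas/Keras. It is an assumed equivalent, not the library's actual source: the directory layout (one sub-folder per class under 'train'), the 'Images'/'Labels' column names, the 100x100 size and the 7 classes follow the usage shown above; everything else is an assumption, and TensorFlow/Keras must be installed.

            import os
            import numpy as np
            import pandas as pd
            from tensorflow.keras.preprocessing.image import load_img, img_to_array
            from tensorflow.keras.utils import to_categorical
            from tensorflow.keras.models import Sequential
            from tensorflow.keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense

            # create_dataframe: assumes 'train' holds one sub-folder per class
            rows = [
                {'Images': os.path.join('train', label, fname), 'Labels': label}
                for label in os.listdir('train')
                for fname in os.listdir(os.path.join('train', label))
            ]
            df = pd.DataFrame(rows)

            # prep_x_train: load, resize to 100x100 and scale pixel values to [0, 1]
            x_train = np.array([
                img_to_array(load_img(p, target_size=(100, 100))) / 255.0
                for p in df['Images']
            ])

            # prep_y_train: map string labels to integers, then one-hot encode (7 classes)
            label_to_idx = {lab: i for i, lab in enumerate(sorted(df['Labels'].unique()))}
            y_train = to_categorical(df['Labels'].map(label_to_idx), num_classes=7)

            # make_model(128, 7, shape): a small Conv2D stack of the kind described above
            model = Sequential([
                Conv2D(128, (3, 3), activation='relu', input_shape=x_train[0].shape),
                MaxPooling2D((2, 2)),
                Dropout(0.5),
                Flatten(),
                Dense(7, activation='softmax'),
            ])
            model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

            # train_model(model, x_train, y_train, batch_size, epochs)
            model.fit(x_train, y_train, batch_size=32, epochs=70)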
    

Examples

Create an emotion-classification model

            from dl_assistant.image_cnn import classification
            model = classification()
            model = model.create('TRAIN',7)
            model.predict([x_test[0]])
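
Note that x_test is not defined in the snippet above. One plausible way to build it is to reuse the helpers documented earlier; the 'TEST' directory name below is hypothetical and assumed to be organised like 'TRAIN':

            from dl_assistant.image_cnn import classification

            helper = classification()
            test_df = helper.create_dataframe('TEST')  # hypothetical test directory, organised like 'TRAIN'
            x_test = helper.prep_x_train(test_df['Images'],100,100)
            # model.predict([x_test[0]]) can then be called as in the example above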

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dl-assistant-0.0.6.tar.gz (4.8 kB)

Uploaded Source

Built Distribution

dl_assistant-0.0.6-py3-none-any.whl (5.0 kB)

Uploaded Python 3

File details

Details for the file dl-assistant-0.0.6.tar.gz.

File metadata

  • Download URL: dl-assistant-0.0.6.tar.gz
  • Upload date:
  • Size: 4.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.0rc2

File hashes

Hashes for dl-assistant-0.0.6.tar.gz

  • SHA256: e6a4b65835dbd32644c256eca73bfb9548af2d833a828b60677c7aa9acb431f2
  • MD5: d9e96edc7815815aeb6cca8ca2e16aee
  • BLAKE2b-256: e61da4401a59302d968d4245a76b9e872e31c7499b129dc9c10ef106d0cb5a78

See more details on using hashes here.
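
For example, a downloaded copy of the sdist can be checked against the SHA256 digest above with Python's standard hashlib module (a generic sketch, not a PyPI-specific tool):

            import hashlib

            # Compare a downloaded dl-assistant-0.0.6.tar.gz against the SHA256 listed above
            expected = "e6a4b65835dbd32644c256eca73bfb9548af2d833a828b60677c7aa9acb431f2"
            with open("dl-assistant-0.0.6.tar.gz", "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            print("OK" if digest == expected else "MISMATCH")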

File details

Details for the file dl_assistant-0.0.6-py3-none-any.whl.

File metadata

  • Download URL: dl_assistant-0.0.6-py3-none-any.whl
  • Upload date:
  • Size: 5.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.0rc2

File hashes

Hashes for dl_assistant-0.0.6-py3-none-any.whl

  • SHA256: 0a7a2659cb1bdce9abb1d5b2d8407dd47ba621c50f3c4d9005af95f9700f4605
  • MD5: 46b987fd77e109e3709eeead8a99c621
  • BLAKE2b-256: 371407b905afe4883c253d8dc6cdbc657b7b7bc86b31656aa337495ea534c4d1

See more details on using hashes here.
