
FeatureSelectionGA

Feature Selection using Genetic Algorithm (DEAP Framework)

Data scientists often find it difficult to choose the right features to maximize accuracy, especially when dealing with many features. There are currently many ways to select features, but most of them struggle when the feature space is large. A genetic algorithm is one solution: it searches over candidate feature subsets to find one that attains high accuracy.
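The core idea: each GA individual is a binary mask over the features, and its fitness is the model's performance on the masked subset. A minimal illustration of the encoding (independent of this package; the data and mask here are made up for demonstration):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 15))  # 100 samples, 15 features

# One GA "individual": a 0/1 mask with one gene per feature.
individual = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0]
mask = np.array(individual, dtype=bool)

X_subset = X[:, mask]  # the feature subset this individual encodes
print(X_subset.shape)  # (100, 6)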

Installation:

$ pip install feature-selection-ga

Usage:

from sklearn.datasets import make_classification
from sklearn import linear_model
from feature_selection_ga import FeatureSelectionGA, FitnessFunction

X, y = make_classification(n_samples=100, n_features=15, n_classes=3,
                           n_informative=4, n_redundant=1, n_repeated=2,
                           random_state=1)

model = linear_model.LogisticRegression(solver='lbfgs', multi_class='auto')
fsga = FeatureSelectionGA(model, X, y, ff_obj=FitnessFunction())
pop = fsga.generate(100)

# print(pop)
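generate evolves a population of the given size and returns the final population. A sketch for extracting the fittest individual, assuming the return value is a DEAP population of 0/1 feature masks with fitnesses already evaluated (this uses DEAP's tools.selBest; check the API of your installed version):

import numpy as np
from deap import tools

# Assumption: `pop` is a DEAP population whose individuals are 0/1 masks,
# one gene per feature.
best = tools.selBest(pop, k=1)[0]
mask = np.array(best, dtype=bool)

print("Selected feature indices:", np.flatnonzero(mask))
X_reduced = X[:, mask]  # dataset restricted to the selected features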

Usage (Advanced):

By default, FeatureSelectionGA ships with its own fitness function class. We can also define our own FitnessFunction class with the same interface:

class FitnessFunction:
    def __init__(self, n_splits=5, *args, **kwargs):
        """
        Parameters
        ----------
        n_splits : int
            Number of splits for cross-validation.
        verbose : 0 or 1
        """
        self.n_splits = n_splits

    def calculate_fitness(self, model, x, y):
        pass
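For instance, a minimal concrete fitness based on mean cross-validated accuracy might look like this (a sketch only; the class name AccuracyFitness is hypothetical and not part of the package):

import numpy as np
from sklearn.model_selection import cross_val_score

class AccuracyFitness:
    """Hypothetical fitness: mean k-fold accuracy on the candidate subset."""

    def __init__(self, n_splits=5, *args, **kwargs):
        self.n_splits = n_splits

    def calculate_fitness(self, model, x, y):
        # x contains only the columns selected by the GA individual.
        scores = cross_val_score(model, x, y, cv=self.n_splits)
        return np.mean(scores)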

With this, we can design our own fitness function by defining our own `calculate_fitness`. Consider the following example from Vieira, Mendonça, Sousa, et al. (2013): $f(X) = \alpha(1-P) + (1-\alpha) \left(1 - \dfrac{N_f}{N_t}\right)$
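For example, with $\alpha = 0.01$, cross-validated accuracy $P = 0.9$, and a subset of $N_f = 5$ features out of $N_t = 15$: $f(X) = 0.01 \times 0.1 + 0.99 \times \left(1 - \tfrac{5}{15}\right) = 0.001 + 0.66 = 0.661$.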

Define the constructor `__init__` with the needed parameters: `alpha` and the total number of features `n_total_features` ($N_t$).

class FitnessFunction:
    def __init__(self, n_total_features, n_splits=5, alpha=0.01, *args, **kwargs):
        """
        Parameters
        ----------
        n_total_features : int
            Total number of features N_t.
        n_splits : int, default = 5
            Number of splits for cross-validation.
        alpha : float, default = 0.01
            Tradeoff between the classifier performance P and the size of the
            feature subset N_f with respect to the total number of features N_t.
        verbose : 0 or 1
        """
        self.n_splits = n_splits
        self.alpha = alpha
        self.n_total_features = n_total_features

Next, we define the fitness function; its name must be `calculate_fitness`. The method below assumes `import numpy as np`, `from sklearn.model_selection import StratifiedKFold`, and `from sklearn.metrics import accuracy_score` at the top of the module:

    def calculate_fitness(self, model, x, y):
        alpha = self.alpha
        total_features = self.n_total_features

        # Collect out-of-fold predictions for every sample.
        cv_set = np.repeat(-1.0, x.shape[0])
        skf = StratifiedKFold(n_splits=self.n_splits)
        for train_index, test_index in skf.split(x, y):
            x_train, x_test = x[train_index], x[test_index]
            y_train, y_test = y[train_index], y[test_index]
            if x_train.shape[0] != y_train.shape[0]:
                raise ValueError("x_train and y_train have mismatched lengths")
            model.fit(x_train, y_train)
            predicted_y = model.predict(x_test)
            cv_set[test_index] = predicted_y

        # P is the cross-validated accuracy; x.shape[1] is the subset size N_f.
        P = accuracy_score(y, cv_set)
        fitness = (alpha * (1.0 - P)
                   + (1.0 - alpha) * (1.0 - (x.shape[1]) / total_features))
        return fitness

Example (you may also refer to example2.py):

X, y = make_classification(n_samples=100, n_features=15, n_classes=3,
                           n_informative=4, n_redundant=1, n_repeated=2,
                           random_state=1)

# Define the model

model = linear_model.LogisticRegression(solver='lbfgs', multi_class='auto')

# Define the fitness function object

ff = FitnessFunction(n_total_features=X.shape[1], n_splits=3, alpha=0.05)
fsga = FeatureSelectionGA(model, X, y, ff_obj=ff)
pop = fsga.generate(100)
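To sanity-check the fitness function outside the GA loop, you can call it directly on the full feature set. Here $N_f = N_t$, so the subset-size term vanishes and the fitness reduces to $\alpha(1 - P)$:

# Direct evaluation on all 15 features; the size term (1 - N_f/N_t) is 0 here.
score = ff.calculate_fitness(model, X, y)
print("Fitness on the full feature set:", score)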

Example adapted from pyswarms.
