What is Feature Selection and Its Techniques

July 27, 2018 Author: munishmishra04_3od47tgp

Suppose there is a real-life problem: putting a nail in a wall. What do we need to complete the task? A nail and a hammer. This selection of tools according to the problem is the same idea as feature selection in machine learning. How much a problem can be reduced depends on which variables are included in the process, and thus feature selection plays an important role in building a model of a problem.

This article covers the following topics:

  • Introduction
  • Techniques of Feature Selection
  • Advantages and uses of feature selection

Introduction to Feature Selection

In machine learning, feature selection is the process of selecting the variables that are essential and useful for a particular problem model. Feature selection largely determines the complexity and performance of the model: if the features are not selected properly, the model will be complex, slow, and bulky, and its performance will suffer. Feature selection compares the candidate features, keeps those that are relevant to the problem at hand, and removes the unwanted ones based on that comparison. After feature selection, only the essential and related variables remain, so the model is much simpler to understand and more accurate [1].

Figure 1: Feature selection

Techniques of Feature Selection

There are three techniques of feature selection, and all of them share the same goal: finding the most relevant and useful features for the corresponding problem. Each technique has its own way of searching for the best features and selecting the properly correlated ones.

These are the three feature selection techniques:

  1. Filter Methods
  2. Wrapper Methods
  3. Embedded Methods

Filter Methods of Feature Selection

Filter methods do not generate any new subset of features; they only compare the available features, select the most relevant ones among them, and proceed with those. Features are filtered according to a relative score: if a feature is strongly related to the problem, its score is high, and if it is weakly related, its score is low. In the end, the highest-scoring features are selected for the problem model.

For example, take our nail-in-the-wall problem. Suppose we have a toolbox full of tools and need only the essential ones to hammer the nail into the wall. We pick the right tools by filtering them: between a screwdriver and a hammer, the hammer scores higher because it is easier to drive a nail with a hammer, so the hammer is selected.
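To make this concrete, below is a minimal sketch of a filter method using scikit-learn's SelectKBest with the ANOVA F-test as the scoring function; the iris dataset and the choice of k = 2 are just assumptions for illustration:

```python
# Minimal filter-method sketch (assumes scikit-learn is installed).
# SelectKBest scores each feature independently with the ANOVA F-test
# and keeps only the k highest-scoring ones; no model is trained.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)                  # 150 samples, 4 features

selector = SelectKBest(score_func=f_classif, k=2)  # k=2 is an arbitrary choice
X_selected = selector.fit_transform(X, y)

print(selector.scores_)        # relevance score of every feature
print(selector.get_support())  # boolean mask: True = feature kept
print(X_selected.shape)        # (150, 2): only the 2 best features remain
```

Because the scores come from the data alone and no model is ever trained, this runs fast, but it also cannot verify how the selected features affect model performance, which is exactly the con noted below.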

Pros:

  • The filter method only performs comparisons, so it is fast.
  • It selects the features most relevant to the problem.

Cons:

  • Subset generation is not available in the filter method, and sometimes the pre-existing features are not sufficient, so the model has to work with only the available features.
  • The filter method does not check the model's performance after selecting the features [2].

Wrapper Methods of Feature Selection

Wrapper methods work in an iterative manner, which means they keep working until a proper set of features is found for the particular problem. A wrapper method is quite different from a filter method: its operation depends on the problem model itself, which makes it more accurate. A wrapper method selects features, trains the problem model on them, and if the model fails, keeps trying with other features or newly generated subsets, depending on the mode of the method. There are three modes of operation of a wrapper method:

  1. Forward Selection
  2. Backward Elimination
  3. Recursive Feature Elimination

  1. Forward Selection: In forward selection, the available features are inserted into the model one by one, each tested in turn, and the features that succeed are selected (see the sketch after the example below).
  2. Backward Elimination: This is just the opposite of forward selection: in backward elimination, all the variables are used at first, and the unwanted variables are removed as the process cycles.

We can relate this to our example: given a toolbox, we try to hammer the nail with different tools. First we choose a screwdriver; when it fails, we keep trying other tools until we find the hammer, or an even more suitable tool, that can drive the nail.
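Below is a minimal sketch of both modes, assuming scikit-learn 0.24 or later, which provides SequentialFeatureSelector; the k-nearest-neighbours model and n_features_to_select=2 are arbitrary choices for illustration:

```python
# Minimal wrapper-method sketch (assumes scikit-learn >= 0.24).
from sklearn.datasets import load_iris
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
model = KNeighborsClassifier(n_neighbors=3)    # the "problem model" being wrapped

# Forward selection: start empty, add the feature that helps most, repeat.
forward = SequentialFeatureSelector(model, n_features_to_select=2,
                                    direction='forward')
forward.fit(X, y)
print(forward.get_support())   # mask of features chosen by forward selection

# Backward elimination: start with all features, drop the least useful, repeat.
backward = SequentialFeatureSelector(model, n_features_to_select=2,
                                     direction='backward')
backward.fit(X, y)
print(backward.get_support())  # mask of features kept by backward elimination
```

Note how every candidate subset requires retraining and re-evaluating the model, which is why wrapper methods are more accurate but slower than filter methods.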

  3. Recursive Feature Elimination: In this mode, the process repeatedly builds models with different subsets of features, setting the best-performing and worst-performing features aside at each step, and finally selects the properly working features on the basis of performance [3]. A sketch follows below.
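A minimal sketch of recursive feature elimination, assuming scikit-learn's RFE with a logistic-regression estimator (both arbitrary choices for illustration):

```python
# Minimal RFE sketch (assumes scikit-learn is installed).
# RFE repeatedly fits the estimator and discards the feature with the
# smallest weight until the desired number of features remains.
from sklearn.datasets import load_iris
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
estimator = LogisticRegression(max_iter=1000)  # arbitrary choice of model

rfe = RFE(estimator, n_features_to_select=2)
rfe.fit(X, y)

print(rfe.support_)   # True for the features that survived elimination
print(rfe.ranking_)   # 1 = selected; larger numbers were eliminated earlier
```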
Embedded Methods of Feature Selection

Embedded methods are a combination of filter methods and wrapper methods. They are implemented by algorithms that have their own built-in feature selection. Some of the most popular examples of these methods are LASSO and ridge regression, which have built-in penalization functions to reduce overfitting [4].
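As an illustration of the embedded approach, here is a minimal sketch using scikit-learn's Lasso, whose built-in L1 penalty drives the coefficients of irrelevant features to exactly zero during training; the diabetes dataset and alpha=0.5 are assumptions for the example:

```python
# Minimal embedded-method sketch (assumes scikit-learn is installed).
# LASSO's L1 penalty performs feature selection as a side effect of
# fitting: the coefficients of irrelevant features become exactly zero.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Lasso

X, y = load_diabetes(return_X_y=True)   # 10 features

lasso = Lasso(alpha=0.5)                # alpha controls penalty strength
lasso.fit(X, y)

selected = np.nonzero(lasso.coef_)[0]   # features with nonzero weight
print(selected)                         # indices of the features LASSO kept
```

Here selection happens inside the training procedure itself, rather than before it (filter) or around it (wrapper).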

References

[1] Wikipedia, "Feature selection", https://en.wikipedia.org/wiki/Feature_selection

[2] Satish Kaushik, "Introduction to Feature Selection methods with an example (or how to select the right variables?)", https://www.analyticsvidhya.com/blog/2016/12/introduction-to-feature-selection-methods-with-an-example-or-how-to-select-the-right-variables/

[3] Sebastian Raschka, "Machine Learning FAQ", https://sebastianraschka.com/faq/docs/feature_sele_categories.html
