Support Vector Machine (SVM)

September 1, 2017 Author: munishmishra04_3od47tgp

In recent years, Artificial Neural Networks (ANNs) have played a significant role in a variety of data mining tasks and remain an extensively popular and active research area. The intent of a neural network is to mimic the human ability to adapt to changing circumstances and the current environment. The widespread use of the Support Vector Machine (SVM) in data mining applications has made it an indispensable tool in the development of products with implications for society. SVMs are computationally powerful tools for supervised learning and are widely used in classification, clustering, and regression problems. They have been successfully applied to a variety of real-world problems such as particle identification, face recognition, text categorization, bioinformatics, civil engineering, and electrical engineering.



SVMs have attracted a great deal of attention in the last decade and have been actively applied to various application domains. They are typically used to learn classification, regression, or ranking functions. SVMs are based on statistical learning theory and the structural risk minimization principle, and aim to determine the location of decision boundaries, also known as hyperplanes, that produce the optimal separation of classes.

Maximizing the margin, and thereby creating the largest possible distance between the separating hyperplane and the instances on either side of it, has been proven to reduce an upper bound on the expected generalization error. SVM-based classification is attractive because its efficiency does not directly depend on the dimension of the classified entities. Although SVM is among the most robust and accurate classification techniques, it has several problems. Training an SVM requires solving a convex quadratic program, which is computationally expensive: quadratic programming methods involve large matrix operations as well as time-consuming numerical computations. Training time scales quadratically in the number of examples, so researchers continually strive for more efficient training algorithms, resulting in several variant algorithms.
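As a concrete illustration of this training process, the following is a minimal sketch using scikit-learn's SVC (an assumed library choice, not named in the article), which delegates the quadratic program to libsvm internally:

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable clusters in 2-D
X = np.array([[1, 1], [2, 1], [1, 2],   # class 0
              [5, 5], [6, 5], [5, 6]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

# A linear SVM; the convex quadratic program is solved internally
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

print(clf.support_vectors_)                    # points that define the margin
print(clf.predict([[1.5, 1.5], [5.5, 5.5]]))   # one point near each cluster
```

Only the support vectors (the instances closest to the separating hyperplane) influence the final decision function; the remaining training points could be discarded without changing the classifier.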

SVMs can also be extended to learn non-linear decision functions by first projecting the input data onto a high-dimensional feature space using kernel functions and then formulating a linear classification problem in that feature space. The resulting feature space can be much larger than the dataset itself and may be impossible to store on ordinary computers. Investigation of this issue has led to several decomposition-based algorithms. The basic idea of a decomposition method is to split the variables into two parts: a set of free variables, called the working set, which is updated at each iteration, and a set of fixed variables, which are temporarily fixed at particular values. This procedure is repeated until the termination conditions are met.
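The classic demonstration of this kernel-based extension is XOR-like data, which no straight line can separate. The sketch below (again assuming scikit-learn; the kernel parameters are illustrative choices) shows an RBF kernel handling a pattern a linear SVM cannot:

```python
import numpy as np
from sklearn.svm import SVC

# XOR-like data: no straight line separates the two classes in 2-D
X = np.array([[0, 0], [1, 1], [0, 1], [1, 0]], dtype=float)
y = np.array([0, 0, 1, 1])

# The RBF kernel implicitly maps the points into a high-dimensional
# feature space in which they become linearly separable
clf = SVC(kernel="rbf", gamma=10.0, C=10.0).fit(X, y)
print(clf.score(X, y))  # 1.0: all four training points classified correctly
```

Crucially, the feature space is never materialized; the kernel evaluates inner products in it directly, which is what makes the approach tractable despite the space's size.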



Basic Concept of SVM

Basically, the main idea behind the SVM is the construction of an optimal hyperplane that can be used to classify linearly separable patterns. The optimal hyperplane is the one, among the set of hyperplanes that classify the patterns, that maximizes the margin, i.e. the distance from the hyperplane to the nearest point of each pattern class. The main objective of the SVM is to maximize this margin so that it can correctly classify the given patterns: the larger the margin, the more reliably the patterns are classified.

The equation shown below is the hyper plane representation:

\[ aX + bY = C \]

Figure 1 below illustrates the basic idea of the hyperplane, showing how two different patterns are separated by a hyperplane in three dimensions. The plane comprises three lines separating the two classes in 3-D space: the central marginal line and two other lines on either side of it, on which the support vectors are located.


Figure 1: Hyper Plane

For patterns that are not linearly separable, the given pattern is mapped into a new space, usually a higher-dimensional space, so that in that space the pattern becomes linearly separable. The given pattern can be mapped into the higher-dimensional space using a mapping function \( \Phi(x) \), i.e.

\[ x \rightarrow \Phi(x) \]

Selecting the kernel function is an important aspect of SVM-based classification; commonly used kernels include the linear, polynomial (poly), RBF, and sigmoid kernels. For example, the polynomial kernel is given by:

\[ K(x, y) = \langle x, y \rangle^{p} \]
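This formula is simple enough to verify numerically. The sketch below (assuming scikit-learn; its polynomial kernel is the more general \( (\gamma \langle x, y \rangle + c_0)^p \), which reduces to the formula above when \( \gamma = 1 \) and \( c_0 = 0 \)) compares a direct evaluation against the library's:

```python
import numpy as np
from sklearn.metrics.pairwise import polynomial_kernel

x = np.array([1.0, 2.0])
y = np.array([3.0, 1.0])
p = 3

# Direct evaluation of K(x, y) = <x, y>^p
direct = np.dot(x, y) ** p  # (1*3 + 2*1)^3 = 5^3 = 125

# Library evaluation with gamma=1, coef0=0 matches the same formula
lib = polynomial_kernel(x.reshape(1, -1), y.reshape(1, -1),
                        degree=p, gamma=1.0, coef0=0.0)[0, 0]
print(direct, lib)  # 125.0 125.0
```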

Different kernel functions create different mappings and hence different non-linear separation surfaces. Another important parameter in SVM is C, also called the complexity parameter, which weights the sum of the distances of all points that lie on the wrong side of the hyperplane. In effect, the complexity parameter controls how much classification error is tolerated during training. Its value can be neither too large nor too small: if C is too large the model fits the training errors too aggressively and generalizes poorly, and if it is too small too many errors are tolerated, so classification performance suffers in both cases.
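The effect of C can be seen in how many support vectors the trained model keeps. A minimal sketch on synthetic overlapping clusters (scikit-learn assumed; the cluster centers and C values are arbitrary illustrative choices):

```python
import numpy as np
from sklearn.svm import SVC

# Two overlapping Gaussian clusters, so some error is unavoidable
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(50, 2) - 1.5, rng.randn(50, 2) + 1.5])
y = np.array([0] * 50 + [1] * 50)

for C in (0.01, 1.0, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    # small C -> wide, tolerant margin with many support vectors;
    # large C -> narrow margin that penalizes errors heavily
    print(C, len(clf.support_vectors_))
```

In practice C is usually chosen by cross-validation rather than set by hand.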



Application of SVM

Since the Support Vector Machine is a supervised machine learning algorithm that learns from training data, it has frequently been applied in networking, e.g. to classify different network applications such as FTP, HTTP, and P2P traffic. Other applications of SVMs include text classification, speech recognition, image clustering for image compression, image classification, handwritten digit recognition, and many other applications that require pattern recognition techniques. SVMs have also been applied to botnet detection, isolating malicious traffic to improve network security, and to filtering network traffic to enhance QoS (Quality of Service). Recent work has shown that the algorithm can be used to recognize shapes and hand gestures in both static and dynamic environments.

