Geometric Insights into Support Vector Machine Behavior using the KKT Conditions

04/03/2017
by Iain Carmichael et al.

The Support Vector Machine (SVM) is a powerful and widely used classification algorithm. Its performance is well known to depend on a tuning parameter, which is frequently selected by cross-validation. This paper uses the Karush-Kuhn-Tucker (KKT) conditions to rigorously prove new insights into the behavior of SVM in the large and small tuning parameter regimes. These insights reveal perhaps unexpected relationships between SVM and the naive Bayes and maximal data piling directions. We explore how characteristics of the training data affect the behavior of SVM in a number of settings, including balanced vs. unbalanced classes, low vs. high dimension, and separable vs. non-separable data. These results yield a simple explanation of SVM's behavior as a function of the tuning parameter. We also elaborate on the geometry of complete data piling directions in high-dimensional space. The results proved in this paper have important implications for tuning SVM with cross-validation.
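
To make the tuning parameter's role concrete, the sketch below (an illustrative example, not code from the paper) fits a linear SVM on toy Gaussian data over a wide range of values of the soft-margin parameter C, here assumed to be the C of scikit-learn's SVC, and prints how the unit normal vector of the separating hyperplane and the number of support vectors change between the small- and large-C regimes.

    import numpy as np
    from sklearn.svm import SVC

    # Toy two-class data: two Gaussian clouds (hypothetical example,
    # not a data set from the paper).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1.0, 1.0, size=(50, 2)),
                   rng.normal(1.0, 1.0, size=(50, 2))])
    y = np.array([0] * 50 + [1] * 50)

    # Fit a linear SVM for several values of the tuning parameter C and
    # inspect how the direction of the separating hyperplane's normal
    # vector stabilizes in the small- and large-C regimes.
    for C in [1e-3, 1e-1, 1e1, 1e3]:
        clf = SVC(kernel="linear", C=C).fit(X, y)
        w = clf.coef_.ravel()
        print("C=%g  direction=%s  support vectors=%d"
              % (C, np.round(w / np.linalg.norm(w), 3), clf.n_support_.sum()))

Running this shows the fitted direction changing little within each extreme regime, which is the qualitative behavior the paper characterizes rigorously via the KKT conditions.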
