Chapter 16: Linear Support Vector Machines
This chapter introduces the linear support vector machine (SVM), a linear classifier that finds its decision boundary by maximizing the margin to the closest data points, possibly allowing a limited number of margin violations.
-
Chapter 16.01: Linear Hard Margin SVM
Hard margin SVMs seek to separate the data perfectly. We formulate the linear hard margin SVM as a quadratic optimization problem.
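The hard margin primal can be written as a quadratic program; the sketch below assumes the common notation with parameters $\theta$, intercept $\theta_0$, and labels $y^{(i)} \in \{-1, +1\}$:

```latex
\min_{\theta,\,\theta_0}\ \frac{1}{2}\|\theta\|^2
\quad \text{s.t.} \quad y^{(i)}\big(\theta^\top x^{(i)} + \theta_0\big) \ge 1,
\quad i = 1, \dots, n
```

Minimizing $\|\theta\|^2$ maximizes the margin $1 / \|\theta\|$, while the constraints enforce that every training point lies on the correct side of the margin.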
-
Chapter 16.02: Hard Margin SVM Dual
In this section, we derive the dual form of the linear hard margin SVM problem, a computationally favorable formulation.
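Under the same notational assumptions as before (parameters $\theta, \theta_0$, labels $y^{(i)} \in \{-1, +1\}$), the Lagrangian dual takes the following standard form with multipliers $\alpha_i$:

```latex
\max_{\alpha}\ \sum_{i=1}^{n} \alpha_i
- \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n}
\alpha_i \alpha_j\, y^{(i)} y^{(j)}\, {x^{(i)}}^\top x^{(j)}
\quad \text{s.t.} \quad \alpha_i \ge 0,\quad \sum_{i=1}^{n} \alpha_i y^{(i)} = 0
```

The data enter only through inner products ${x^{(i)}}^\top x^{(j)}$, which is what makes this formulation computationally favorable and, later, kernelizable.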
-
Chapter 16.03: Soft Margin SVM
Hard margin SVMs are often inapplicable in practice because they fail when the data are not linearly separable. Moreover, for the sake of generalization, we often accept some margin violations to keep the margin large enough for robust class separation. We therefore introduce the soft margin linear SVM.
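Allowing violations is usually expressed via non-negative slack variables $\zeta_i$ and a cost parameter $C$ that trades off margin size against the total amount of violation; a standard formulation, in the same assumed notation, is:

```latex
\min_{\theta,\,\theta_0,\,\zeta}\ \frac{1}{2}\|\theta\|^2 + C \sum_{i=1}^{n} \zeta_i
\quad \text{s.t.} \quad y^{(i)}\big(\theta^\top x^{(i)} + \theta_0\big) \ge 1 - \zeta_i,
\quad \zeta_i \ge 0
```

A point with $0 < \zeta_i \le 1$ lies inside the margin but on the correct side; $\zeta_i > 1$ means it is misclassified.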
-
Chapter 16.04: SVMs and Empirical Risk Minimization
In this section, we show how the SVM problem can be understood as an instance of empirical risk minimization.
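Eliminating the slack variables turns the soft margin problem into an unconstrained regularized empirical risk minimization with the hinge loss; in the assumed notation:

```latex
\min_{\theta,\,\theta_0}\ \frac{1}{2}\|\theta\|^2
+ C \sum_{i=1}^{n} \max\!\Big(0,\ 1 - y^{(i)}\big(\theta^\top x^{(i)} + \theta_0\big)\Big)
```

Here $\max(0, 1 - y f(x))$ is the hinge loss and $\frac{1}{2}\|\theta\|^2$ acts as an $L2$ regularizer, so the SVM fits the general "loss + regularizer" ERM template.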
-
Chapter 16.05: SVM Training
The linear SVM problem is challenging to solve because its ERM formulation involves the non-differentiable hinge loss. In this section, we present suitable optimization methods.
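Although the hinge loss is not differentiable at the kink, it admits sub-gradients, so plain sub-gradient descent already works. Below is a minimal NumPy sketch of this idea (not the specific algorithm of this section; hyperparameters `lam`, `lr`, and `epochs` are illustrative choices) on a toy two-class problem:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.5, epochs=500):
    """Sub-gradient descent on the regularized hinge-loss objective
    lam/2 * ||theta||^2 + (1/n) * sum_i max(0, 1 - y_i (theta^T x_i + b))."""
    n, d = X.shape
    theta, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ theta + b)
        viol = margins < 1  # only margin violators contribute a sub-gradient
        g_theta = lam * theta - (y[viol, None] * X[viol]).sum(axis=0) / n
        g_b = -y[viol].sum() / n
        theta -= lr * g_theta
        b -= lr * g_b
    return theta, b

# Toy linearly separable data: two Gaussian blobs with labels -1 / +1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])

theta, b = train_linear_svm(X, y)
acc = np.mean(np.sign(X @ theta + b) == y)
```

At the kink (margin exactly 1) any value in $[-1, 0]$ times $y_i x_i$ is a valid sub-gradient component; the code simply picks the one-sided choice implied by the strict inequality `margins < 1`.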