Chapter 17: Nonlinear Support Vector Machines
Many classification problems warrant nonlinear decision boundaries. This chapter introduces nonlinear support vector machines as a crucial extension of the linear variant.
- Chapter 17.01: Feature Generation for Nonlinear Separation

    We show how nonlinear feature maps project the input data into transformed spaces where it becomes linearly separable.
- Chapter 17.02: The Kernel Trick

    In this section, we show how nonlinear SVMs introduce nonlinearity efficiently via the kernel trick, which computes inner products in the transformed feature space without ever constructing the feature map explicitly.
- Chapter 17.03: The Polynomial Kernel

    In this section, we introduce the polynomial kernel in the context of SVMs and demonstrate how different polynomial degrees affect decision boundaries.
- Chapter 17.04: Reproducing Kernel Hilbert Space and Representer Theorem

    In this section, we introduce the reproducing kernel Hilbert space and the representer theorem, which allow us to express the SVM solution as a weighted sum of kernel basis functions centered at the training points.
- Chapter 17.05: The Gaussian RBF Kernel

    In this section, we introduce the popular Gaussian RBF kernel and discuss its properties.
- Chapter 17.06: SVM Model Selection

    In this section, we discuss how SVM hyperparameters, such as the cost parameter and the kernel parameters, influence the solution and why careful tuning is essential.
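The key consequence of the representer theorem in Chapter 17.04 can be stated compactly; the notation below is the standard one and is assumed here rather than taken from the chapter. The minimizer of the regularized risk admits a finite expansion over the $n$ training points:

```latex
f(\mathbf{x}) = \sum_{i=1}^{n} \alpha_i \, k\left(\mathbf{x}, \mathbf{x}^{(i)}\right)
```

so the solution lives in the span of the kernel basis functions centered at the training data, even though the underlying RKHS may be infinite-dimensional.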
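The feature-generation idea of Chapter 17.01 can be illustrated with a minimal numpy sketch. The toy data and the threshold value are illustrative assumptions, not taken from the chapter: a one-dimensional problem that no linear boundary can solve becomes linearly separable after mapping each point through $\phi(x) = (x, x^2)$.

```python
import numpy as np

# 1D toy data: class +1 lies inside [-1, 1], class -1 outside -- not linearly
# separable on the real line, since the positive class sits between two
# negative regions.
x = np.array([-3.0, -2.0, -0.5, 0.0, 0.5, 2.0, 3.0])
y = np.array([-1, -1, 1, 1, 1, -1, -1])

# Feature map phi(x) = (x, x^2): in the transformed 2D space the classes
# can be split by the horizontal line x^2 = 2.25, i.e. a linear boundary.
phi = np.column_stack([x, x ** 2])

separable = np.all(phi[y == 1, 1] < 2.25) and np.all(phi[y == -1, 1] > 2.25)
print(separable)  # True
```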
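The kernel trick of Chapter 17.02 can be verified numerically on a small example. The degree-2 feature map below is a standard textbook construction, used here as an illustrative assumption: the kernel value $(x^\top z)^2$ matches the inner product of the explicitly mapped vectors, so the map never needs to be materialized.

```python
import numpy as np

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

# Explicit degree-2 feature map for 2D input: phi(v) = (v1^2, sqrt(2) v1 v2, v2^2).
def phi(v):
    return np.array([v[0] ** 2, np.sqrt(2) * v[0] * v[1], v[1] ** 2])

# Kernel trick: k(x, z) = (x . z)^2 yields the same inner product without
# ever computing phi -- this is what makes nonlinear SVMs efficient.
explicit = phi(x) @ phi(z)
kernel = (x @ z) ** 2
print(np.isclose(explicit, kernel))  # True
```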
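The Gaussian RBF kernel of Chapter 17.05 is simple enough to evaluate directly; the sample points and bandwidth values below are illustrative assumptions. The snippet shows two of its characteristic properties: $k(x, x) = 1$, and the value decays with distance at a rate controlled by $\gamma$.

```python
import numpy as np

# Gaussian RBF kernel: k(x, z) = exp(-gamma * ||x - z||^2).
def rbf_kernel(x, z, gamma=1.0):
    return np.exp(-gamma * np.sum((x - z) ** 2))

x = np.array([0.0, 0.0])
z = np.array([1.0, 0.0])

# k(x, x) = 1 for any gamma; values decay toward 0 with distance, and a
# larger gamma makes the decay (hence the decision boundary) more local.
print(rbf_kernel(x, x))                 # 1.0
print(rbf_kernel(x, z, gamma=0.5))      # exp(-0.5) ~ 0.6065
print(rbf_kernel(x, z, gamma=5.0))      # exp(-5)   ~ 0.0067
```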
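The model-selection step of Chapter 17.06 can be sketched with scikit-learn's actual `SVC` and `GridSearchCV` APIs, assuming that library is available; the synthetic data and the grid values are illustrative assumptions, not recommendations. The cost parameter `C` and the RBF width `gamma` jointly control over- versus underfitting, so they are tuned together by cross-validation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 2))
# Nonlinear ground truth: label by squared distance from the origin.
y = (np.sum(X ** 2, axis=1) > 2.0).astype(int)

# Joint grid search over C and gamma with 5-fold cross-validation.
grid = {"C": [0.1, 1.0, 10.0], "gamma": [0.1, 1.0, 10.0]}
search = GridSearchCV(SVC(kernel="rbf"), grid, cv=5).fit(X, y)
print(search.best_params_)
```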