Selection of kernel parameters

Selection of kernel parameters is usually done by cross-validation on a grid of parameter values. We provide here some code to do it by gradient descent instead. It is mostly based on the following paper:
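For contrast with the gradient-based approach, the usual grid-search procedure can be sketched as follows. This is an illustrative Python/NumPy example, not the released Matlab code: it uses kernel ridge regression with an RBF bandwidth grid as a stand-in, and all function names (`rbf_kernel`, `cv_error`, `grid_search`) are made up for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    # Gaussian RBF kernel matrix between the rows of A and the rows of B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def cv_error(X, y, sigma, lam=1e-2, folds=5):
    # k-fold cross-validation MSE of kernel ridge regression
    n = len(y)
    idx = np.arange(n)
    errs = []
    for f in range(folds):
        test = idx[f::folds]
        train = np.setdiff1d(idx, test)
        K = rbf_kernel(X[train], X[train], sigma)
        alpha = np.linalg.solve(K + lam * np.eye(len(train)), y[train])
        pred = rbf_kernel(X[test], X[train], sigma) @ alpha
        errs.append(np.mean((pred - y[test]) ** 2))
    return float(np.mean(errs))

def grid_search(X, y, grid=(0.1, 0.3, 1.0, 3.0, 10.0)):
    # evaluate every candidate bandwidth and keep the best one
    return min(grid, key=lambda s: cv_error(X, y, s))
```

Note that the cost grows with the size of the grid, and exponentially with the number of parameters searched jointly, which is what motivates the gradient-descent alternative below.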
O. Chapelle, V. Vapnik, O. Bousquet and S. Mukherjee, Choosing Multiple Parameters for Support Vector Machines, Machine Learning, 46, 2002.

The code implements several kernel methods. The kernel parameters are found with gradient descent by minimizing either the radius/margin bound or the span estimate of the leave-one-out error described in this paper.
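The released code implements the paper's gradient approach for SVMs in Matlab. As a self-contained Python/NumPy sketch of the same idea, here is gradient descent (with a numerical gradient, for brevity) on the log-bandwidth of an RBF kernel, minimizing a closed-form leave-one-out criterion for kernel ridge regression; the choice of criterion and the function names are illustrative assumptions, not the actual code.

```python
import numpy as np

def rbf_kernel(X, sigma):
    # Gaussian RBF kernel matrix on the rows of X
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def loo_error(X, y, sigma, lam=1e-2):
    # closed-form leave-one-out MSE of kernel ridge regression:
    # e_i = (y_i - yhat_i) / (1 - H_ii), with H = K (K + lam I)^-1
    K = rbf_kernel(X, sigma)
    n = len(y)
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    resid = (y - H @ y) / (1.0 - np.diag(H))
    return float(np.mean(resid ** 2))

def tune_sigma(X, y, sigma0=1.0, lr=0.1, steps=50, eps=1e-4):
    # gradient descent on t = log(sigma), so sigma stays positive;
    # the gradient is estimated by central finite differences
    t = np.log(sigma0)
    for _ in range(steps):
        g = (loo_error(X, y, np.exp(t + eps))
             - loo_error(X, y, np.exp(t - eps))) / (2.0 * eps)
        t -= lr * g
    return float(np.exp(t))
```

A single descent run replaces the whole grid, and the same loop extends to many parameters at once (one bandwidth per input dimension, say), where a grid would be infeasible.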
Learning a linear combination of kernels

A special case of this general framework is when the kernel parameters are the coefficients of a convex combination of base kernels, K = sum_i d_i K_i with d_i >= 0 and sum_i d_i = 1.
Here is some code to select these coefficients by Newton optimization of the variance/margin estimate. It makes use of svqp, a quadratic programming solver written by Léon Bottou. Details about the algorithm can be found in this paper. One can show that this is a convex optimization problem.
Fast leave-one-out error estimate

Matlab code for estimating the generalization performance of an SVM, as described in Chapter 3 of my PhD thesis.