Abstract:
Kernel functions are used in support vector machines (SVMs) to compute inner products in a higher-dimensional feature space. The performance of classification or approximation depends on the chosen kernel function. Popular kernel functions include the linear, polynomial, and radial basis function (RBF) kernels; however, these common kernels may not be sufficient for complex or large problems. This research proposes to improve the performance of SVMs by using non-negative linear combinations of these common kernel functions. The resulting kernels are more flexible and allow better discrimination or approximation in the feature space. Evolutionary strategies (ESs) are then used to adjust the parameters of the SVM and of the proposed kernel functions. To avoid overfitting, the objective function of the evolutionary process is carefully designed: training error, subset cross-validation, a bound on the generalization error, and the stability of the SVM are considered as objective functions, and their experimental results are compared in this research. The proposed methods are evaluated on benchmark datasets and on real-world problems, namely sentiment classification and handwriting recognition. Furthermore, more flexible combined kernel functions are represented as trees. An algorithm for creating these tree kernels, called GPES, is presented; it applies genetic programming (GP) and an evolutionary strategy (ES) to evolve the hybrid kernel functions and their parameters. The experimental results are compared with those of a standard SVM classifier using the polynomial and RBF kernels with various parameter settings.
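The core idea of the non-negative linear combination can be sketched briefly: a weighted sum of positive semi-definite kernels with non-negative weights is itself a valid Mercer kernel. The following is a minimal illustration using scikit-learn, not the thesis's implementation; the weights and kernel parameters shown here are arbitrary placeholders that, in the proposed method, would be tuned by the ES.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import linear_kernel, polynomial_kernel, rbf_kernel
from sklearn.svm import SVC

def combined_kernel(X, Y, w=(0.5, 0.3, 0.2), degree=3, gamma=0.1):
    """Non-negative linear combination of linear, polynomial, and RBF kernels.

    With w_i >= 0, the weighted sum of PSD Gram matrices stays PSD,
    so the combination remains a valid kernel for the SVM.
    """
    w1, w2, w3 = w
    assert min(w) >= 0, "weights must be non-negative to keep the kernel PSD"
    return (w1 * linear_kernel(X, Y)
            + w2 * polynomial_kernel(X, Y, degree=degree, gamma=gamma)
            + w3 * rbf_kernel(X, Y, gamma=gamma))

# Toy data; SVC accepts a callable kernel(X, Y) returning a Gram matrix.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
clf = SVC(kernel=combined_kernel).fit(X, y)
print(clf.score(X, y))
```

In the proposed framework, the weight vector `w` together with `degree` and `gamma` (and the SVM regularization parameter) would form the chromosome optimized by the evolutionary strategy against one of the objective functions listed above.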