Kernel Methods
Manjesh K. Hanawal
Limitation of Halfspaces
Consider the domain points {−10, −9, . . . , −1, 0, 1, . . . , 9, 10},
where the labels are:
▶ +1 for all x such that |x| > 2.
▶ −1 otherwise.
This data cannot be separated by a usual halfspace.
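One way to see how embeddings help (a sketch; the specific map ψ(x) = (x, x²) and the separating halfspace below are illustrative choices, not taken from the slides): after embedding, the 1D data above becomes linearly separable.

```python
import numpy as np

# 1D data from the example: label +1 iff |x| > 2, -1 otherwise.
xs = np.arange(-10, 11)
ys = np.where(np.abs(xs) > 2, 1, -1)

# Illustrative embedding psi(x) = (x, x^2); in the embedded space the
# halfspace with w = (0, 1), b = -5 separates the two classes.
phi = np.stack([xs, xs ** 2], axis=1)
w, b = np.array([0.0, 1.0]), -5.0
preds = np.sign(phi @ w + b)
print(np.all(preds == ys))  # True: the embedded data is linearly separable
```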
Also consider the following 2D dataset
[Figure: a 2D dataset plotted in the (x1, x2) plane, with both axes ranging from −2 to 2.]
Given an embedding ψ of the instances into a feature space, we learn a halfspace over the transformed sample
Ŝ = ((ψ(x1), y1), . . . , (ψ(xm), ym)).
Writing the predictor as w = ∑_{j=1}^{m} α_j ψ(x_j) and letting K(x, x′) = ⟨ψ(x), ψ(x′)⟩ denote the kernel function, the learning problem over Ŝ becomes an optimization over the coefficient vector α:

$$\min_{\alpha \in \mathbb{R}^m} \; f\Big(\sum_{j=1}^{m} \alpha_j K(x_j, x_1), \ldots, \sum_{j=1}^{m} \alpha_j K(x_j, x_m)\Big) \;+\; R\Big(\sqrt{\sum_{i,j} \alpha_i \alpha_j K(x_j, x_i)}\Big) \qquad (8)$$

where f and R are the loss term and the (nondecreasing) regularization function of the original problem.
To solve the optimization problem given in Equation 8, we do not need any direct access to elements of the feature space. The only thing we need is the ability to compute inner products in the feature space, or equivalently, to evaluate the kernel function.
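To make this concrete, here is a minimal sketch (using a polynomial kernel and made-up data, both assumptions for illustration) showing that every quantity appearing in Equation 8 can be computed from kernel evaluations alone, without ever forming ψ(x):

```python
import numpy as np

def poly_kernel(x, xp, degree=2):
    # Polynomial kernel K(x, x') = (1 + <x, x'>)^degree (illustrative choice).
    return (1.0 + np.dot(x, xp)) ** degree

# Hypothetical training inputs, just to have something to evaluate on.
X = np.array([[0.5, 1.0], [-1.0, 0.3], [2.0, -0.5]])
m = len(X)

# Gram matrix G[i, j] = K(x_i, x_j): the only access to the feature space we need.
G = np.array([[poly_kernel(X[i], X[j]) for j in range(m)] for i in range(m)])

alpha = np.array([0.2, -0.4, 0.1])       # some candidate coefficient vector
inner_products = G @ alpha               # <w, psi(x_i)> = sum_j alpha_j K(x_j, x_i)
w_norm = np.sqrt(alpha @ G @ alpha)      # ||w|| = sqrt(sum_{i,j} alpha_i alpha_j K(x_j, x_i))
print(inner_products, w_norm)            # enough to evaluate the objective in (8)
```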
Polynomial Kernel
Consider the degree-k polynomial kernel K(x, x′) = (1 + ⟨x, x′⟩)^k on R^n. We will now show that this is indeed a kernel function. That is, we will show that there exists a mapping ψ from the original space to some higher-dimensional space for which K(x, x′) = ⟨ψ(x), ψ(x′)⟩.
For simplicity, denote x₀ = x′₀ = 1. Then

$$(1 + \langle x, x' \rangle)^k \;=\; \Big(\sum_{j=0}^{n} x_j x'_j\Big)^k \;=\; \sum_{J \in \{0,1,\ldots,n\}^k} \prod_{i=1}^{k} x_{J_i} x'_{J_i} \;=\; \sum_{J \in \{0,1,\ldots,n\}^k} \Big(\prod_{i=1}^{k} x_{J_i}\Big)\Big(\prod_{i=1}^{k} x'_{J_i}\Big).$$

Now, if we define ψ : R^n → R^{(n+1)^k} such that for each J ∈ {0, 1, . . . , n}^k there is an element of ψ(x) that equals ∏_{i=1}^{k} x_{J_i}, we obtain that K(x, x′) = ⟨ψ(x), ψ(x′)⟩.
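A quick numerical check of the argument above (a sketch; the explicit feature map is built exactly as described, one coordinate per multi-index J ∈ {0, 1, . . . , n}^k, with x₀ = 1):

```python
import itertools
import numpy as np

def poly_feature_map(x, k):
    # Explicit (n+1)^k-dimensional map: prepend x_0 = 1, then one coordinate
    # per multi-index J in {0, 1, ..., n}^k, equal to prod_{i=1}^k x_{J_i}.
    x_aug = np.concatenate(([1.0], x))
    return np.array([np.prod(x_aug[list(J)])
                     for J in itertools.product(range(len(x_aug)), repeat=k)])

def poly_kernel(x, xp, k):
    # Degree-k polynomial kernel K(x, x') = (1 + <x, x'>)^k.
    return (1.0 + np.dot(x, xp)) ** k

rng = np.random.default_rng(0)
x, xp, k = rng.normal(size=3), rng.normal(size=3), 3
lhs = poly_kernel(x, xp, k)
rhs = np.dot(poly_feature_map(x, k), poly_feature_map(xp, k))
print(lhs, rhs)  # the two values agree up to floating-point round-off
```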
Gaussian Kernel
The Gaussian kernel is also called the RBF kernel, for “Radial
Basis Functions.”
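For reference, a minimal sketch of the Gaussian kernel under one common parameterization, K(x, x′) = exp(−‖x − x′‖² / (2σ²)); the exact normalization used in the lecture may differ.

```python
import numpy as np

def gaussian_kernel(x, xp, sigma=1.0):
    # Gaussian (RBF) kernel under one common parameterization:
    # K(x, x') = exp(-||x - x'||^2 / (2 * sigma^2)).
    diff = np.asarray(x) - np.asarray(xp)
    return np.exp(-np.dot(diff, diff) / (2.0 * sigma ** 2))

print(gaussian_kernel([1.0, 0.0], [0.0, 1.0], sigma=0.5))
```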