EE 559 Homework 7 solved


1. In the parts below, you will set up the primal Lagrangian and derive the dual Lagrangian for a support vector machine classifier, for the case of data that is separable in the expanded feature space.

For SVM learning, we stated the optimization problem for the linearly separable (in $u$-space) case as:

$$\text{Minimize } J(w) = \tfrac{1}{2}\|w\|^2$$
$$\text{s.t. } z_i \left( w^T u_i + w_0 \right) - 1 \ge 0 \quad \forall i$$

Assume our data is linearly separable in the expanded feature space. Also, nonaugmented notation is used throughout this problem.

(a) If the above set of constraints (second line of equations above) is satisfied, will all the training data be correctly classified?

(b) Write the Lagrangian function $L(w, w_0, \lambda)$ for the minimization problem stated above. Use $\lambda_i$, $i = 1, 2, \ldots, N$ for the Lagrange multipliers. Also state the KKT conditions. (Hint: there are 3 KKT conditions.)

(c) Derive the dual Lagrangian $L_D$, by proceeding as follows:

(i) Minimize $L$ w.r.t. the weights. Hint: solve $\nabla_w L = 0$ for the optimal weight $w^*$ (in terms of $\lambda_i$ and other variables); and set $\frac{\partial L}{\partial w_0} = 0$ and simplify.

(ii) Substitute your expressions from part (i) into $L$, and use your expression from $\frac{\partial L}{\partial w_0} = 0$ as a new constraint, to derive $L_D$ as:

$$L_D(\lambda) = -\frac{1}{2} \sum_{i=1}^{N} \sum_{j=1}^{N} \lambda_i \lambda_j z_i z_j u_i^T u_j + \sum_{i=1}^{N} \lambda_i$$

subject to the (new) constraint $\sum_{i=1}^{N} \lambda_i z_i = 0$. Also give the other two KKT conditions on $\lambda_i$, which carry over from the primal form.
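
The short sketch below (not part of the assignment) numerically illustrates the two results asked for in part (c). It assumes NumPy and scikit-learn are available and uses a made-up separable toy set: with a linear kernel and a very large C, `SVC` approximates the hard-margin SVM, and its `dual_coef_` attribute stores $z_i \lambda_i$ for the support vectors, so both the new constraint $\sum_i \lambda_i z_i = 0$ and the stationarity result $w^* = \sum_i \lambda_i z_i u_i$ can be checked directly.

```python
# Numerical illustration of the Problem 1(c) results (a sketch, not part of
# the assignment). Assumes scikit-learn and NumPy; the toy data is made up.
import numpy as np
from sklearn.svm import SVC

# A small linearly separable toy set in u-space, labels z_i in {+1, -1}.
U = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -1.0], [-2.0, -3.0]])
z = np.array([1, 1, -1, -1])

# Linear kernel with very large C approximates the hard-margin (separable) SVM.
clf = SVC(kernel="linear", C=1e6).fit(U, z)

# dual_coef_ holds z_i * lambda_i for the support vectors, so:
lam_z = clf.dual_coef_.ravel()
print("sum_i lambda_i z_i :", lam_z.sum())                   # ~ 0 (new constraint)
print("w* from multipliers:", lam_z @ clf.support_vectors_)  # sum_i lambda_i z_i u_i
print("w* from sklearn    :", clf.coef_.ravel())             # should match
```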
2. In this problem you will do SVM learning on 2 data points, using the result from
Problem 1 above.
You are given a training data set consisting of 2 data points, in a 2-class problem. In the expanded feature space, the data points are:

$$u_1 = \begin{bmatrix} -1 \\ 0 \end{bmatrix} \in S_1; \qquad u_2 = \begin{bmatrix} 0 \\ 1 \end{bmatrix} \in S_2.$$

(a) Solve, by hand, for the SVM decision boundary and regions. To do this, use Lagrangian techniques to optimize $L_D$ w.r.t. $\lambda$ subject to its equality constraint $\sum_{i=1}^{N} \lambda_i z_i = 0$, in order to find the optimal weight vector $w^*$, and also find the optimal $w_0^*$. Plot the training data and the decision boundary in 2D (nonaugmented) $u$-space, and show which side of the boundary is positive ($\Gamma_1$).

Hints: Start from:

$$L_D'(\lambda, \mu) = \sum_{i=1}^{N} \lambda_i - \frac{1}{2} \sum_{i=1}^{N} \sum_{j=1}^{N} \lambda_i \lambda_j z_i z_j u_i^T u_j + \mu \sum_{i=1}^{N} z_i \lambda_i$$

in which the last term has been added to incorporate the equality constraint stated above. Once finished, check that the resulting $\lambda$ satisfies the KKT conditions on $\lambda$ that you stated in Problem 1(c)(ii); one of these can also help you find $w_0^*$.

(b) Use the data points and solution you found in part (a), and call its solution decision boundary $H$. Calculate in 2D space the distance between $H$ and $u_1$, and the distance between $H$ and $u_2$. Is there any other possible linear boundary in $u$-space that would give larger values for both distances ("margins") than $H$ gives? (No proof required.)
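
As a sanity check on part (a) (again, a sketch rather than the required hand derivation): the equality constraint forces $\lambda_1 = \lambda_2 = \lambda$, and since $u_1^T u_1 = u_2^T u_2 = 1$ and $u_1^T u_2 = 0$, the dual reduces to $L_D = 2\lambda - \lambda^2$, maximized at $\lambda = 1$. The snippet below, assuming the same NumPy/scikit-learn setup as before, confirms the resulting boundary and margins numerically.

```python
# Numerical sanity check for Problem 2 (a sketch, not the required hand
# derivation). Assumes scikit-learn and NumPy. By hand, lambda1 = lambda2 = 1
# gives w* = (+1)(1)u1 + (-1)(1)u2 = [-1, -1]^T and w0* = 0, with both
# point-to-boundary distances equal to 1/sqrt(2) ~ 0.707.
import numpy as np
from sklearn.svm import SVC

U = np.array([[-1.0, 0.0], [0.0, 1.0]])  # u1 in S1 (z = +1), u2 in S2 (z = -1)
z = np.array([1, -1])

clf = SVC(kernel="linear", C=1e6).fit(U, z)
w = clf.coef_.ravel()
w0 = clf.intercept_[0]
print("w* :", w)   # ~ [-1, -1]
print("w0*:", w0)  # ~ 0
# Distance from each point to H: |w^T u + w0| / ||w||, ~ 0.707 for both,
# consistent with H being the maximum-margin boundary asked about in (b).
print("margins:", np.abs(U @ w + w0) / np.linalg.norm(w))
```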