EECS 440: Machine Learning Written Problems Week 5


19. Consider the primal linear program (LP): min cᵀx s.t. Ax ≥ b, x ≥ 0, and its dual: max bᵀu s.t. Aᵀu ≤ c, u ≥ 0. Carefully prove that for any feasible pair (x, u) (i.e., x and u satisfying the constraints of the two LPs), bᵀu ≤ cᵀx.
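
For orientation, the weak-duality argument typically chains two inequalities that come straight from feasibility; a minimal sketch of one possible write-up (not necessarily the expected one):

```latex
% Weak duality, sketched: both inequalities follow from feasibility.
% u >= 0 and Ax >= b  give  u^T(Ax - b) >= 0;
% x >= 0 and A^T u <= c  give  x^T(c - A^T u) >= 0.
\begin{align*}
  b^T u = u^T b &\le u^T A x
      && \text{since } Ax \ge b \text{ and } u \ge 0 \\
                &= (A^T u)^T x \\
                &\le c^T x
      && \text{since } A^T u \le c \text{ and } x \ge 0.
\end{align*}
```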
20. Derive the backpropagation weight updates for hidden-to-output and input-to-hidden weights
when the loss function is cross entropy with a weight decay term. Cross entropy is defined as
L(w) = −∑ᵢ [yᵢ log(oᵢ) + (1−yᵢ) log(1−oᵢ)], where yᵢ is the true label (assumed 0/1) and oᵢ is the estimated label for the i-th example.
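
As a rough guide, the updates take the following shape. This is a sketch under assumptions not fixed by the problem statement: sigmoid units, a single output o = σ(netₒ), and an L2 penalty (λ/2)∑w² added to L; hⱼ denotes the j-th hidden activation, xᵢ the i-th input, and η the learning rate.

```latex
% Sketch assuming sigmoid units, one output o = sigma(net_o), and an
% L2 decay term (lambda/2) * sum of squared weights added to L.
% Cross entropy with a sigmoid output gives dL/d(net_o) = o - y.
\begin{align*}
  \Delta w_{jo} &= -\eta \left[ (o - y)\, h_j + \lambda w_{jo} \right]
      && \text{(hidden-to-output)} \\
  \Delta w_{ij} &= -\eta \left[ (o - y)\, w_{jo}\, h_j (1 - h_j)\, x_i
      + \lambda w_{ij} \right]
      && \text{(input-to-hidden)}
\end{align*}
```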
21. Draw an artificial neural network structure which can perfectly classify the examples shown
in the table below. Treat attributes as continuous. Show all of the weights on the edges. For
this problem, assume that the activation functions are sign functions instead of sigmoids.
Propagate each example through your network and show that the classification is indeed
correct.
x1   x2   Class
−4   −4   −
−1   −1   +
 1    1   +
 4    4   −
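
One workable topology (a construction of ours, not the only correct answer) uses two sign-activated hidden units that carve out the band −4 < x1 + x2 < 4, feeding a single sign output that ANDs them. The check can be done mechanically:

```python
# Sketch of one possible answer: a 2-2-1 network with sign activations.
# Hidden unit 1 fires (+1) when x1 + x2 > -4; hidden unit 2 when x1 + x2 < 4.
# The output fires (+1) only when both hidden units fire, i.e. -4 < x1+x2 < 4.

def sign(z):
    return 1 if z > 0 else -1

def network(x1, x2):
    h1 = sign(1 * x1 + 1 * x2 + 4)      # weights (1, 1), bias +4
    h2 = sign(-1 * x1 - 1 * x2 + 4)     # weights (-1, -1), bias +4
    return sign(1 * h1 + 1 * h2 - 1.5)  # AND of the two hidden units

examples = [(-4, -4, -1), (-1, -1, 1), (1, 1, 1), (4, 4, -1)]
for x1, x2, label in examples:
    assert network(x1, x2) == label
print("all four examples classified correctly")
```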
22. When learning the weights for the perceptron, we dropped the sign() activation function to
make the objective smooth. Show that the same strategy does not work for an arbitrary ANN.
(Hint: consider the shape of the decision boundary if we did this.)
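
The intuition behind the hint can be seen numerically: with every activation dropped, a multi-layer network collapses to a single linear map, so its decision boundary remains a hyperplane no matter how many layers are stacked. A small numpy check (our illustration, not part of the assignment):

```python
# Dropping all activations makes a "deep" network purely linear:
# W2 @ (W1 @ x) == (W2 @ W1) @ x, a single linear map, so the decision
# boundary stays a hyperplane and nothing is gained over one perceptron.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 2))   # input-to-hidden weights (no activation)
W2 = rng.standard_normal((1, 3))   # hidden-to-output weights (no activation)
x = rng.standard_normal(2)

two_layer = W2 @ (W1 @ x)          # forward pass with activations removed
collapsed = (W2 @ W1) @ x          # equivalent single-layer network
assert np.allclose(two_layer, collapsed)
print("two linear layers reduce to one:", collapsed)
```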