
EE 559 Homework 2


1. In this 3-class problem, you will use the one vs. one method for multiclass classification. Let
the discriminant functions be:
g12(x) = −x1 − x2 + 5, g13(x) = −x1 + 3, g23(x) = −x1 + x2 − 1,
and gji(x) = −gij(x).
The decision rule is: x ∈ Sk iff gkj(x) > 0 for all j ≠ k.
Draw the decision boundaries and label decision regions Γi and any indeterminate regions.
Classify the points x = (4.1, 1), (1.5, 3), and (0.0, 1). If there is an indeterminate region,
prove it by finding a point that doesn’t get classified according to the above rule. If there is no
indeterminate region, so state.
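As a quick check on how the one vs. one rule operates (not required by the assignment), the short Python sketch below evaluates the three pairwise discriminants and applies the decision rule; the test point is arbitrary and is not one of the assigned points.

import numpy as np

# Pairwise discriminants from the problem statement.
def g12(x): return -x[0] - x[1] + 5
def g13(x): return -x[0] + 3
def g23(x): return -x[0] + x[1] - 1

def classify_ovo(x):
    # Return k if g_kj(x) > 0 for all j != k, else None (indeterminate).
    g = {(1, 2): g12(x), (1, 3): g13(x), (2, 3): g23(x)}
    for (i, j), v in list(g.items()):   # g_ji(x) = -g_ij(x)
        g[(j, i)] = -v
    for k in (1, 2, 3):
        if all(g[(k, j)] > 0 for j in (1, 2, 3) if j != k):
            return k
    return None   # no class wins all of its pairwise comparisons

print(classify_ovo(np.array([2.0, 2.0])))   # arbitrary test point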
2. For the wine dataset, code up a nearest-means classifier with the following multiclass
approach: one vs. rest. Use the original unnormalized data. Note that the class means should
always be defined by the training data. Run the one vs. rest classifier using only the following
two features: 1 and 2.
Note that the same guidelines as HW1 apply on coding the classifier(s) yourself vs. using
available packages or routines, with one possible exception*.
Give the following:
(a) Classification accuracy on training set and on testing set.
(b) Plots showing each resulting 2-class decision boundary and regions (Sk vs. Sk′).
(c) A plot showing the final decision boundaries and regions (Γ1, Γ2, Γ3, and indeterminate).
Hint 1: For (b) and (c), you can use PlotDecBoundaries(). Modify it if necessary.
Hint 2: *If using Python, you may optionally use scipy.spatial.distance.cdist in calculating
Euclidean distance between matrix elements.
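For orientation only, here is a minimal Python sketch of one way the one vs. rest nearest-means classifier could be organized. It assumes the wine data are already loaded and split into X_train, y_train, X_test, y_test arrays (hypothetical names), uses only the first two feature columns, and uses scipy.spatial.distance.cdist as permitted by Hint 2; PlotDecBoundaries() is the course-provided helper and is not reproduced here.

import numpy as np
from scipy.spatial.distance import cdist  # optional, per Hint 2

def fit_one_vs_rest_means(X_train, y_train):
    # For each class k, store (mean of class k, mean of all other classes),
    # computed from the training data only.
    means = {}
    for k in np.unique(y_train):
        means[k] = (X_train[y_train == k].mean(axis=0),   # mean of S_k
                    X_train[y_train != k].mean(axis=0))    # mean of the "rest"
    return means

def predict_one_vs_rest(X, means):
    # A point gets label k if it wins exactly one of the 2-class (k vs. rest)
    # problems; otherwise it is marked indeterminate (-1).
    classes = sorted(means)
    votes = np.zeros((len(X), len(classes)), dtype=bool)
    for j, k in enumerate(classes):
        mu_k, mu_rest = means[k]
        d = cdist(X, np.vstack([mu_k, mu_rest]))  # Euclidean distances
        votes[:, j] = d[:, 0] < d[:, 1]
    labels = np.full(len(X), -1)
    one_winner = votes.sum(axis=1) == 1
    labels[one_winner] = np.array(classes)[votes[one_winner].argmax(axis=1)]
    return labels

# Hypothetical usage with features 1 and 2 (columns 0 and 1):
# means = fit_one_vs_rest_means(X_train[:, :2], y_train)
# train_acc = np.mean(predict_one_vs_rest(X_train[:, :2], means) == y_train)
# test_acc = np.mean(predict_one_vs_rest(X_test[:, :2], means) == y_test)

Counting indeterminate points as errors in the accuracy is one possible convention; check the HW1 guidelines for the convention the course expects.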
3. (a) Derive an expression for the discriminant function g(x) for a 2-class nearest-means
classifier, based on Euclidean distance, for class means µ1 and µ2. Is the classifier linear?
(b) Continuing from part (a), for the following class means:
µ1 = (0, −2)ᵀ, µ2 = (0, 1)ᵀ
Plot the decision boundaries and label the decision regions.
(c) Repeat part (a) except for a 3-class classifier, using the maximal value method (MVM):
find the three discriminant functions g1(x), g2(x), g3(x), given three class means
µ1, µ2, and µ3. Express in simplest form. Is the classifier linear?
(d) Continuing from part (c) using MVM, for the following class means:
µ1 = (0, −2)ᵀ, µ2 = (0, 1)ᵀ, µ3 = (2, 0)ᵀ
Plot the decision boundaries and label the decision regions.
Hint: Refer to Lecture 5 and (upcoming) Lecture 6 if you have trouble with this.
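For part (a), a common starting point (this is a sketch of the usual algebra; the lectures may expect a particular form) is to compare squared Euclidean distances to the two class means and take their difference:

g(x) = ||x − µ2||² − ||x − µ1||² = 2(µ1 − µ2)ᵀx + ||µ2||² − ||µ1||², and decide S1 iff g(x) > 0.

The ||x||² terms cancel in the expansion, which is the key observation for the linearity question. The same expansion, done per class, suggests MVM discriminants of the form gk(x) = 2µkᵀx − ||µk||² (equivalent to −||x − µk||² after dropping the common ||x||² term), with x assigned to the class whose gk(x) is largest.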
4. Extra credit. DHS Problem 5.9. (Note that DHS has a set of “Problems”, and a set of
“Computer Exercises”, both at the end of each chapter. This is “Problem” 9 of Chapter 5.)
The problem statement starts “The convex hull of a set of vectors…”. Some versions of the
DHS text may have a slightly different numbering of problems, so it’s best to check every time
that you are going to solve an assigned problem.
Additional hint: Classify the point x twice, once based on x in the convex hull of S1 data
points, and a second time based on x in the convex hull of S2 data points.