EE 559 Homework 9 solved


1. In this problem, consider a Bayes minimum-error classifier. Three classes are described by
normal densities as follows:

$$p(x \mid S_i) = N(x, m_i, \Sigma_i), \quad i = 1, 2, 3$$

$$P(S_1) = \pi_1, \quad P(S_2) = \pi_2, \quad P(S_3) = \pi_3$$

in which each $\pi_i$ is a variable denoting the prior for class $S_i$ (to save on writing). Assume
the $\pi_i$ are given.

In this problem, the algebra should be done by hand. The plots may be done by computer
or by hand.

(a) Write an expression for the discriminant functions $g_i(x)$, in two forms:

(i) Expressed in terms of the Mahalanobis distance $d_M(x, m_i)$ and other given terms
($x$, $m_i$, $\Sigma_i$, and $\Sigma_i^{-1}$);

(ii) Expressed only in terms of $x$, grouped as follows:

$$g_i(x) = x^T W_i x + w_i^T x + w_0^{(i)}$$

and give expressions for $W_i$, $w_i$, and $w_0^{(i)}$ in terms of the same quantities.

(b) State the decision rule in terms of the discriminant functions $g_i(x)$. Is the classifier
linear, quadratic, or neither?

(c) For this part you are also given:

$$m_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \quad m_2 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}, \quad m_3 = \begin{bmatrix} -2 \\ 2 \end{bmatrix}, \quad \Sigma_1 = \Sigma_2 = \Sigma_3 = \begin{bmatrix} 1 & -1 \\ -1 & 2 \end{bmatrix}$$

(i) Give expressions for the discriminant functions $g_1(x)$, $g_2(x)$, and $g_3(x)$ in terms
of the given numbers, in simplest form. Hint: before plugging in numbers, you can
simplify the $g_i(x)$ by dropping some terms. Is the classifier linear, quadratic, or
neither?

(ii) Let $\pi_i = \frac{1}{3} \ \forall i$. Give equations for the 3 decision boundaries, in simplest form.

(iii) In (non-augmented) feature space, plot the class means, the 3 curves for
$d_M^2(x, m_i) = 1$, and the decision boundaries. Show clearly the decision regions
and final boundaries.
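A minimal numerical sketch of how the plot in 1(c)(iii) could be checked by computer, assuming Python with numpy and matplotlib; the grid bounds and variable names are illustrative choices, not part of the assignment. It relies on the fact that with equal priors and a shared covariance, the minimum-error rule reduces to choosing the class with the smallest squared Mahalanobis distance $d_M^2(x, m_i) = (x - m_i)^T \Sigma^{-1} (x - m_i)$.

```python
# Sketch for Problem 1(c)(iii): decision regions and d_M^2 = 1 ellipses.
import numpy as np
import matplotlib.pyplot as plt

means = [np.array([1.0, 2.0]), np.array([1.0, -1.0]), np.array([-2.0, 2.0])]
Sigma = np.array([[1.0, -1.0], [-1.0, 2.0]])          # shared covariance
Sigma_inv = np.linalg.inv(Sigma)

# Evaluate d_M^2(x, m_i) for each class on a grid.
x1, x2 = np.meshgrid(np.linspace(-6, 6, 601), np.linspace(-6, 6, 601))
pts = np.stack([x1, x2], axis=-1)                     # shape (601, 601, 2)
d2 = np.stack([np.einsum('...i,ij,...j->...', pts - m, Sigma_inv, pts - m)
               for m in means])                       # shape (3, 601, 601)

# Decision regions: smallest d_M^2 wins (equal priors, shared covariance).
regions = np.argmin(d2, axis=0)
plt.contourf(x1, x2, regions, levels=[-0.5, 0.5, 1.5, 2.5], alpha=0.3)

# Class means and the three d_M^2(x, m_i) = 1 curves.
for i, m in enumerate(means):
    plt.contour(x1, x2, d2[i], levels=[1.0])
    plt.plot(*m, 'k+', markersize=12)
plt.gca().set_aspect('equal')
plt.xlabel('$x_1$'); plt.ylabel('$x_2$')
plt.title('Problem 1(c)(iii): numerical check')
plt.show()
```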
2. A Naïve Bayes classifier is a Bayes classifier in which the features, conditioned on class, are
assumed independent; thus:

$$p(x \mid S_i) = \prod_{j=1}^{D} p(x_j \mid S_i), \quad \forall i = 1, \ldots, C.$$

Most of the below is similar to your work for Problem 1.

For a Naïve Bayes (minimum error) classifier, let the given class-conditional densities and
priors be:

$$p(x \mid S_i) = N(x, m_i, \Sigma_i), \quad i = 1, 2, 3, \qquad \Sigma_i = \begin{bmatrix} \left(\sigma_1^{(i)}\right)^2 & 0 \\ 0 & \left(\sigma_2^{(i)}\right)^2 \end{bmatrix}$$

$$P(S_1) = \pi_1, \quad P(S_2) = \pi_2, \quad P(S_3) = \pi_3$$

(a) Write an expression for the discriminant functions $g_i(x)$, in the form:

$$g_i(x) = w_{11}^{(i)} x_1^2 + w_{12}^{(i)} x_1 x_2 + w_{22}^{(i)} x_2^2 + w_1^{(i)} x_1 + w_2^{(i)} x_2 + w_0^{(i)}$$

and give expressions for all the weights in terms of given scalar quantities, in simplest
form. Let the mean vector of class $S_i$ be denoted $m_i = \begin{bmatrix} m_1^{(i)} & m_2^{(i)} \end{bmatrix}^T$.

(b) Is the classifier quadratic, linear, or neither?

(c) For this part you are also given:

$$m_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \quad m_2 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}, \quad m_3 = \begin{bmatrix} -2 \\ 2 \end{bmatrix}, \quad \left(\sigma_1^{(i)}\right)^2 = \sigma_1^2 = 1, \quad \left(\sigma_2^{(i)}\right)^2 = \sigma_2^2 = 2, \quad \forall i$$

(i) Give expressions for the discriminant functions $g_1(x)$, $g_2(x)$, and $g_3(x)$ in terms
of the given numbers, in simplest form. Hint: before plugging in numbers, you can
simplify the $g_i(x)$ by dropping some terms. Is the classifier linear, quadratic, or
neither?

(ii) Let $\pi_i = \frac{1}{3} \ \forall i$. Give equations for the 3 decision boundaries, in simplest form.

(iii) In (non-augmented) feature space, plot the class means, the 3 curves for
$d_M^2(x, m_i) = 1, \ i = 1, 2, 3$, and the decision boundaries. Show clearly the
decision regions and final boundaries.
(d) Compare your plot of 2(c)(iii) with your plot of 1(c)(iii). Does the dependence vs.
independence of features make a substantial difference in the decision boundaries and
regions?
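As with Problem 1, a minimal numerical sketch for 2(c)(iii), assuming Python with numpy and matplotlib (names and grid bounds again illustrative, not part of the assignment). Under the naive Bayes model here the shared covariance is diagonal, so with equal priors the rule reduces to a variance-weighted nearest-mean rule and the $d_M^2(x, m_i) = 1$ curves are axis-aligned ellipses. Overlaying this figure on the Problem 1 figure gives a direct visual basis for answering 2(d).

```python
# Sketch for Problem 2(c)(iii): naive Bayes (diagonal covariance) version.
import numpy as np
import matplotlib.pyplot as plt

means = [np.array([1.0, 2.0]), np.array([1.0, -1.0]), np.array([-2.0, 2.0])]
var = np.array([1.0, 2.0])                  # (sigma_1^2, sigma_2^2) = (1, 2)

x1, x2 = np.meshgrid(np.linspace(-6, 6, 601), np.linspace(-6, 6, 601))
pts = np.stack([x1, x2], axis=-1)

# With a diagonal covariance, d_M^2 is a per-feature weighted sum of squares.
d2 = np.stack([(((pts - m) ** 2) / var).sum(axis=-1) for m in means])

regions = np.argmin(d2, axis=0)             # minimum-error rule, equal priors
plt.contourf(x1, x2, regions, levels=[-0.5, 0.5, 1.5, 2.5], alpha=0.3)
for i, m in enumerate(means):
    plt.contour(x1, x2, d2[i], levels=[1.0])    # axis-aligned ellipses
    plt.plot(*m, 'k+', markersize=12)
plt.gca().set_aspect('equal')
plt.xlabel('$x_1$'); plt.ylabel('$x_2$')
plt.title('Problem 2(c)(iii): naive Bayes numerical check')
plt.show()
```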
3. This problem uses the minimum risk criterion (instead of the minimum error criterion) for
classification (introduced in Lecture 21, covered in Discussion 12 and DHS 2.2). In this
problem, please use our notation ($S_i$ instead of $\omega_i$ for class $i$).

This problem is based on our 2-class tumor classification example. Suppose the
class-conditional densities are known (or assumed) to be Gaussian:

$$p(x \mid S_i) = N(x, m_i, \Sigma_i), \quad i = 1, 2$$

For a Bayes minimum risk classifier, answer the parts below.

(a) Write the decision rule for a 2-class Bayes minimum risk classifier, in terms of the
conditional risks $R(\alpha_1 \mid x)$ and $R(\alpha_2 \mid x)$. Then, write the decision rule in terms of
$d_M^2(x, m_i)$, $\Sigma_1$, $\Sigma_2$, $\lambda_{11}$, $\lambda_{12}$, $\lambda_{21}$, $\lambda_{22}$, $P(S_1)$, and $P(S_2)$.

For parts (b)-(d) below, suppose also that, on average, 80% of the tumors that are to be
classified (based on their MRI images) are benign (class $S_1$), and the other 20% are
cancerous (class $S_2$).

(b) Give estimates of the class priors $P(S_1)$ and $P(S_2)$.

For parts (c)-(e) below, suppose also that misclassifying a cancerous tumor as benign is
considered much worse than misclassifying a benign tumor as cancerous. Thus we will use
for our cost coefficients:

$$\begin{bmatrix} \lambda_{11} & \lambda_{12} \\ \lambda_{21} & \lambda_{22} \end{bmatrix} = \begin{bmatrix} 0 & 10 \\ 1 & 0 \end{bmatrix}$$

Also, you are given that the mean vectors and covariance matrices for each class are known
(or estimated) as:

$$m_1 = \begin{bmatrix} 1 \\ 4 \end{bmatrix}, \quad m_2 = \begin{bmatrix} 4 \\ 2 \end{bmatrix}, \quad \Sigma_1 = \Sigma_2 = \begin{bmatrix} 0.5 & -0.5 \\ -0.5 & 2 \end{bmatrix}$$

(c) Solve for the decision boundary and regions; that is, give an expression (in simplest
form, based on the given numbers) for the decision rule.
(d) Plot, in 2D feature space, the class means, the 2 curves for $d_M^2(x, m_i) = 1, \ i = 1, 2$,
and the decision boundary, and show the decision regions (by a small arrow at the
boundary pointing into $\Gamma_1$, or by labelling $\Gamma_1$ and $\Gamma_2$).

(e) If the guidelines for ordering an MRI change, and it is estimated that the data points
coming in will be, on average, 50% benign and 50% cancerous, repeat part (d) (or add
it to your part (d) plot, clearly labeled as (e)). How have the boundary and regions
changed?
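A minimal numerical sketch for checking 3(d) and 3(e), assuming the standard two-class minimum-risk rule, decide $S_1$ when $(\lambda_{21} - \lambda_{11})\, p(x \mid S_1)\, P(S_1) > (\lambda_{12} - \lambda_{22})\, p(x \mid S_2)\, P(S_2)$, with the convention that $\lambda_{jk}$ is the cost of deciding class $j$ when the true class is $k$; it uses Python with numpy, matplotlib, and scipy, and the grid bounds and names are illustrative choices.

```python
# Sketch for Problem 3(d)-(e): minimum-risk boundaries for two prior settings.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import multivariate_normal

means = [np.array([1.0, 4.0]), np.array([4.0, 2.0])]
Sigma = np.array([[0.5, -0.5], [-0.5, 2.0]])
lam = np.array([[0.0, 10.0], [1.0, 0.0]])   # lam[j-1, k-1] = lambda_jk

x1, x2 = np.meshgrid(np.linspace(-4, 9, 601), np.linspace(-4, 9, 601))
pts = np.stack([x1, x2], axis=-1)
p1 = multivariate_normal(means[0], Sigma).pdf(pts)   # p(x | S1)
p2 = multivariate_normal(means[1], Sigma).pdf(pts)   # p(x | S2)

# h > 0 <=> R(alpha_1 | x) < R(alpha_2 | x) <=> decide S1 (region Gamma_1).
for (P1, P2), color in [((0.8, 0.2), 'tab:blue'), ((0.5, 0.5), 'tab:red')]:
    h = (lam[1, 0] - lam[0, 0]) * p1 * P1 - (lam[0, 1] - lam[1, 1]) * p2 * P2
    plt.contour(x1, x2, h, levels=[0.0], colors=[color])

# Class means and the two d_M^2(x, m_i) = 1 curves.
Sigma_inv = np.linalg.inv(Sigma)
for m in means:
    d2 = np.einsum('...i,ij,...j->...', pts - m, Sigma_inv, pts - m)
    plt.contour(x1, x2, d2, levels=[1.0], linestyles='dashed')
    plt.plot(*m, 'k+', markersize=12)
plt.gca().set_aspect('equal')
plt.xlabel('$x_1$'); plt.ylabel('$x_2$')
plt.title('Problem 3: blue = 80/20 priors (d), red = 50/50 priors (e)')
plt.show()
```

Plotting both boundaries on one set of axes makes part (e) direct: raising $P(S_2)$ from 0.2 to 0.5 should move the boundary toward $m_1$ and enlarge the region $\Gamma_2$ assigned to the cancerous class.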