## Description

1. Consider the table below, which represents a dataset by listing each unique example together with the number of times it appears in the dataset. Construct the decision tree learned from this data by finding the most discriminating attribute at each step. Show precisely how you decided on the most discriminating attribute at each step by computing the expected entropies of the remaining attributes.

| Example | A | B | C | Class (D) | # |
|---------|---|---|---|-----------|---|
| x1      | T | T | T | Yes       | 1 |
| x2      | T | T | F | Yes       | 6 |
| x3      | T | F | T | No        | 3 |
| x4      | T | F | F | No        | 1 |
| x5      | F | T | T | Yes       | 1 |
| x6      | F | T | F | No        | 6 |
| x7      | F | F | T | Yes       | 2 |
| x8      | F | F | F | No        | 3 |
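As a sanity check on the hand computation, the per-attribute expected entropies can be verified numerically. The sketch below is one possible Python encoding of the table (assuming the count column weights each example); it is illustrative, not the required hand derivation:

```python
from math import log2

# Dataset from the table: (A, B, C, class, count), transcribing x1..x8.
data = [
    (True,  True,  True,  "Yes", 1),
    (True,  True,  False, "Yes", 6),
    (True,  False, True,  "No",  3),
    (True,  False, False, "No",  1),
    (False, True,  True,  "Yes", 1),
    (False, True,  False, "No",  6),
    (False, False, True,  "Yes", 2),
    (False, False, False, "No",  3),
]

def entropy(rows):
    """Entropy of the class label over count-weighted rows."""
    total = sum(n for *_, n in rows)
    if total == 0:
        return 0.0
    h = 0.0
    for label in ("Yes", "No"):
        p = sum(n for *_, cls, n in rows if cls == label) / total
        if p > 0:
            h -= p * log2(p)
    return h

def expected_entropy(rows, attr):
    """Weighted entropy after splitting on attribute index attr (0=A, 1=B, 2=C)."""
    total = sum(n for *_, n in rows)
    h = 0.0
    for value in (True, False):
        subset = [r for r in rows if r[attr] == value]
        weight = sum(n for *_, n in subset) / total
        h += weight * entropy(subset)
    return h

for i, name in enumerate("ABC"):
    print(name, round(expected_entropy(data, i), 4))
```

The attribute with the smallest expected entropy (equivalently, the largest information gain) is the most discriminating one at that step.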

2. Create a two-layer neural network that uses the step function to implement (A ∨ ¬B) ⊕ (¬C ∨ D), where ⊕ is the XOR function. You may either use the network structure provided below or another structure you construct. After drawing your network, clearly show the weights and activation function for each node. Assume inputs in {0, 1} for each input variable. Note that solutions with more than two layers will still receive partial credit.
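A candidate network can be checked against the target truth table programmatically. In the sketch below, the hidden-unit weights are one hypothetical choice for realizing the two disjunctions; the output layer implementing the XOR of the hidden activations is left to the exercise:

```python
from itertools import product

def target(a, b, c, d):
    """Reference for (A ∨ ¬B) ⊕ (¬C ∨ D) over {0, 1} inputs."""
    left = a or (1 - b)          # A ∨ ¬B
    right = (1 - c) or d         # ¬C ∨ D
    return int(bool(left) != bool(right))

def step(z, threshold=0):
    """Step activation: 1 if z >= threshold, else 0."""
    return int(z >= threshold)

def hidden(a, b, c, d):
    """One hypothetical first layer (weights are a choice, not the only one)."""
    h1 = step(a - b)             # fires exactly when A ∨ ¬B is true
    h2 = step(d - c)             # fires exactly when ¬C ∨ D is true
    return h1, h2

# Print the 16-row truth table a correct network must reproduce.
for a, b, c, d in product((0, 1), repeat=4):
    h1, h2 = hidden(a, b, c, d)
    print(a, b, c, d, "->", target(a, b, c, d), "(hidden:", h1, h2, ")")
```

Note that the XOR of the two hidden values is not linearly separable, so a single step unit reading only h1 and h2 cannot realize it; the remaining layer(s) must introduce additional units (for example, one detecting h1 ∧ h2) or extra connections.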
