Final Exam Paper Fall 2020
FINAL EXAMINATION
Fall 2020
Machine Learning (3+0)
__________________________________________________________________________________
Note: Attempt all Questions.
Question 1a:
For a fully-connected neural network with one hidden layer, what effect should increasing
the number of hidden units have on bias and variance? (3 marks)
Question 1b:
Is it true that a decision tree, when grown to full depth, is more likely to fit the noise
in the data? (3 marks)
Question 1c:
If the hypothesis space is richer and the feature space is larger, what happens to the
model? (3 marks)
Question 1d:
What is the advantage of the rectified linear (ReLU) activation compared to the logistic
sigmoid activation? (3 marks)
Question 1e:
As the number of training examples approaches infinity, what effect will this have on a
model trained on that data? (3 marks)
Question 1f:
What is a kernel (as used in clustering algorithms)? What is the primary motivation for
using the kernel trick in machine learning algorithms? (3 marks)
Question 2: (4.5 marks)
The weather in Peshawar is notoriously fickle. For simplicity we will consider only sun
and rain, and we assume that the weather changes once per day. The weather has the
following transition probabilities:
• When it rains, the probability of sun the following day is 0.6.
• When the sun shines, the probability of rain the following day is 0.3.
Draw a transition diagram, stating which states are observable and which are hidden.
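For reference, the transition probabilities above can be encoded as a 2x2 matrix. The sketch below (not part of the required answer) uses NumPy and assumes the state ordering sun, rain; it checks the long-run behaviour of the chain by repeated multiplication of the transition matrix.

```python
import numpy as np

# States: 0 = sun, 1 = rain (illustrative ordering).
# Row i gives P(tomorrow's state | today's state i).
P = np.array([
    [0.7, 0.3],  # sun -> sun 0.7, sun -> rain 0.3
    [0.6, 0.4],  # rain -> sun 0.6, rain -> rain 0.4
])

# After many days the distribution over states converges to the
# stationary distribution, regardless of the starting weather.
Pn = np.linalg.matrix_power(P, 50)
print(Pn[0])  # long-run P(sun), P(rain): roughly two-thirds sun, one-third rain
```

Each row of `P` sums to 1, as every row of a transition matrix must.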
Question 3: (4.5 marks)
Suppose we clustered a set of N data points using two different clustering algorithms:
k-means and Gaussian mixtures. In both cases we obtained 5 clusters, and in both cases
the centers of the clusters are exactly the same. Can 3 points that are assigned to
different clusters in the k-means solution be assigned to the same cluster in the
Gaussian mixture solution? If not, explain why. If so, sketch an example or explain
in 1-2 sentences.