ML QP
Module 1 Questions.
1. Define the following terms:
a. Learning
b. LMS weight update rule
c. Version Space
d. Consistent Hypothesis
e. General Boundary
f. Specific Boundary
g. Concept
2. What are the important objectives of machine learning?
3. Explain the Find-S algorithm with the given example. Give its applications.
Table 1
Example Sky AirTemp Humidity Wind Water Forecast EnjoySport
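For question 3, a minimal Python sketch of Find-S: it starts from the most specific hypothesis and generalizes each attribute only as far as the positive examples force it. Since Table 1's data rows are not reproduced here, the training set below is the classic EnjoySport data from Mitchell, used purely as an illustration.

```python
def find_s(examples):
    """examples: list of (attribute_tuple, label) pairs; returns the
    maximally specific hypothesis consistent with the positive examples."""
    positives = [x for x, label in examples if label == "Yes"]
    h = list(positives[0])              # start with the first positive example
    for x in positives[1:]:
        for i, value in enumerate(x):
            if h[i] != value:           # generalize any mismatched attribute to '?'
                h[i] = "?"
    return h

# Illustrative data: the classic EnjoySport examples (Table 1 may differ)
training = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   "Yes"),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   "Yes"),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), "No"),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), "Yes"),
]
print(find_s(training))  # ['Sunny', 'Warm', '?', 'Strong', '?', '?']
```

Note that Find-S simply ignores negative examples, which is one of its limitations and a useful contrast point for question 7.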
4. What do you mean by a well-posed learning problem? Explain the important features
that are required to define a learning problem well.
5. Explain the biased hypothesis space and the unbiased learner.
6. What are the basic design issues and approaches to machine learning?
7. How is the Candidate Elimination algorithm different from the Find-S algorithm?
8. How do you design a checkers learning problem?
9. Explain the various stages involved in designing a learning system.
10. Trace the Candidate Elimination Algorithm for the hypothesis space H’ given the
sequence of training examples from Table 1.
H’ = <?, Cold, High, ?, ?, ?> v <Sunny, ?, High, ?, ?, Same>
11. Differentiate between Training data and Testing Data
12. Differentiate between Supervised, Unsupervised and Reinforcement Learning
13. What are the issues in Machine Learning?
14. Explain the List Then Eliminate Algorithm with an example
15. What is the difference between the Find-S and Candidate Elimination algorithms?
16. Explain the concept of Inductive Bias
17. With a neat diagram, explain how you can model inductive systems by equivalent
deductive systems
18. What do you mean by Concept Learning?
Module 2 Questions.
Instance Classification a1 a2
1 + T T
2 + T T
3 - T F
4 + F F
5 - F T
6 - F T
(a) What is the entropy of this collection of training examples with respect to the
target function classification?
(b) What is the information gain of a2 relative to these training examples?
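Parts (a) and (b) can be verified numerically. A short sketch, using the six (a1, a2) rows and class labels from the table above:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr_index, labels):
    """Entropy reduction from splitting on the attribute at attr_index."""
    n = len(labels)
    gain = entropy(labels)
    for value in set(r[attr_index] for r in rows):
        subset = [lab for r, lab in zip(rows, labels) if r[attr_index] == value]
        gain -= len(subset) / n * entropy(subset)
    return gain

# The six training examples from the table above: (a1, a2) -> class
rows   = [("T", "T"), ("T", "T"), ("T", "F"), ("F", "F"), ("F", "T"), ("F", "T")]
labels = ["+", "+", "-", "+", "-", "-"]

print(entropy(labels))                     # 1.0 (three positive, three negative)
print(information_gain(rows, 1, labels))   # ≈ 0.0 (a2 carries no information here)
```

With 3 positive and 3 negative examples the entropy is exactly 1 bit, and a2 splits the data into 2+/2- and 1+/1- subsets (each of entropy 1), so its information gain is zero.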
3. NASA wants to be able to discriminate between Martians (M) and Humans (H) based on
the following characteristics: Green ∈ {N, Y}, Legs ∈ {2, 3}, Height ∈ {S, T},
Smelly ∈ {N, Y}.
Our available training data is as follows:
No. Species Green Legs Height Smelly
1 M N 3 S Y
2 M Y 2 T N
3 M Y 3 T N
4 M N 2 S Y
5 M Y 3 T N
6 H N 2 T Y
7 H N 2 S N
8 H N 2 T N
9 H Y 2 S N
10 H N 2 T Y
a) Greedily learn a decision tree using the ID3 algorithm and draw the tree.
b) (i) Write the learned concept for Martian as a set of conjunctive rules (e.g., if
(green=Y and legs=2 and height=T and smelly=N), then Martian; else if ... then
Martian;...; else Human).
(ii) The solution of part (b)(i) above uses up to 4 attributes in each conjunction. Find a set
of conjunctive rules using only 2 attributes per conjunction that still results in zero error
on the training set. Can this simpler hypothesis be represented by a decision tree of depth 2? Justify.
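For question 3(a), a compact sketch of the greedy ID3 procedure run on the ten rows above; the nested-dict tree representation is an illustrative choice, not part of the question.

```python
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def id3(rows, labels, attrs):
    """rows: dicts mapping attribute -> value; returns a nested-dict tree or a leaf label."""
    if len(set(labels)) == 1:
        return labels[0]                              # pure node: stop
    if not attrs:
        return Counter(labels).most_common(1)[0][0]   # no attributes left: majority class
    def gain(a):
        g = entropy(labels)
        for v in set(r[a] for r in rows):
            sub = [lab for r, lab in zip(rows, labels) if r[a] == v]
            g -= len(sub) / len(labels) * entropy(sub)
        return g
    best = max(attrs, key=gain)                       # greedy choice: highest information gain
    tree = {best: {}}
    for v in set(r[best] for r in rows):
        branch_rows = [r for r in rows if r[best] == v]
        branch_labs = [lab for r, lab in zip(rows, labels) if r[best] == v]
        tree[best][v] = id3(branch_rows, branch_labs, [a for a in attrs if a != best])
    return tree

data = [  # the ten training examples from the table above
    ({"Green": "N", "Legs": 3, "Height": "S", "Smelly": "Y"}, "M"),
    ({"Green": "Y", "Legs": 2, "Height": "T", "Smelly": "N"}, "M"),
    ({"Green": "Y", "Legs": 3, "Height": "T", "Smelly": "N"}, "M"),
    ({"Green": "N", "Legs": 2, "Height": "S", "Smelly": "Y"}, "M"),
    ({"Green": "Y", "Legs": 3, "Height": "T", "Smelly": "N"}, "M"),
    ({"Green": "N", "Legs": 2, "Height": "T", "Smelly": "Y"}, "H"),
    ({"Green": "N", "Legs": 2, "Height": "S", "Smelly": "N"}, "H"),
    ({"Green": "N", "Legs": 2, "Height": "T", "Smelly": "N"}, "H"),
    ({"Green": "Y", "Legs": 2, "Height": "S", "Smelly": "N"}, "H"),
    ({"Green": "N", "Legs": 2, "Height": "T", "Smelly": "Y"}, "H"),
]
rows, labels = [d for d, _ in data], [lab for _, lab in data]
tree = id3(rows, labels, ["Green", "Legs", "Height", "Smelly"])
print(tree)
```

At the root, Legs has the highest information gain (Legs = 3 covers only Martians), so the tree branches on Legs first.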
4. Discuss Entropy in ID3 algorithm with an example
5. Compare Entropy and Information Gain in ID3 with an example.
6. Describe hypothesis Space search in ID3 and contrast it with Candidate-Elimination
algorithm.
7. Relate Inductive bias with respect to Decision tree learning.
8. Illustrate Occam’s razor and relate the importance of Occam’s razor with respect to
ID3 algorithm.
9. List the issues in Decision Tree Learning. Interpret the algorithm with respect to
Overfitting the data.
10. Discuss the effect of Reduced Error Pruning in the decision tree algorithm.
11. What type of problems are best suited for decision tree learning?
12. Write the steps of the ID3 algorithm.
13. What are the capabilities and limitations of ID3?
14. Define (a) Preference Bias (b) Restriction Bias
15. Explain the various issues in Decision tree Learning
16. Describe Reduced Error Pruning
17. What are the alternative measures for selecting attributes?
18. What is Rule Post Pruning?
Module 3 Questions.
Module 4 Questions.
5) Explain the k-Means Algorithm with an example.
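A minimal sketch of the k-Means (Lloyd's) procedure for question 5; the two-blob 2-D data, the value of k, the iteration count, and the seed are all illustrative choices.

```python
import random

def k_means(points, k, iters=20, seed=0):
    """Lloyd's algorithm on 2-D points: alternate assignment and mean-update steps."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)         # initialize from k distinct data points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                       # assignment step: join nearest centroid
            j = min(range(k),
                    key=lambda i: (p[0] - centroids[i][0]) ** 2
                                + (p[1] - centroids[i][1]) ** 2)
            clusters[j].append(p)
        for j, c in enumerate(clusters):       # update step: centroid = cluster mean
            if c:
                centroids[j] = (sum(p[0] for p in c) / len(c),
                                sum(p[1] for p in c) / len(c))
    return centroids, clusters

# two well-separated blobs (illustrative data)
points = [(1, 1), (1, 2), (2, 1), (2, 2), (8, 8), (8, 9), (9, 8), (9, 9)]
centroids, clusters = k_means(points, k=2)
print(sorted(centroids))
```

Each iteration alternates assigning points to their nearest centroid and moving each centroid to the mean of its cluster; on this data the centroids converge to the two blob means, (1.5, 1.5) and (8.5, 8.5).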
6) How do you classify text using Bayes' theorem?
7) Define (i) Prior Probability (ii) Conditional Probability (iii) Posterior Probability
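The three probabilities in question 7 are related by Bayes' rule: posterior = likelihood × prior / evidence. A small worked example (the disease-test numbers are illustrative):

```python
# Bayes' rule with illustrative numbers: a test that is 99% sensitive and
# 95% specific, for a condition with 1% prior prevalence.
prior = 0.01                  # prior probability P(disease)
sensitivity = 0.99            # conditional probability P(positive | disease)
false_pos = 0.05              # conditional probability P(positive | no disease)

evidence = sensitivity * prior + false_pos * (1 - prior)   # P(positive)
posterior = sensitivity * prior / evidence                 # P(disease | positive)
print(round(posterior, 3))    # 0.167
```

Despite the accurate test, the posterior is only about 1/6, because the low prior dominates: most positives come from the much larger healthy population.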
8) Explain Brute force Bayes Concept Learning
9) Explain the concept of EM Algorithm.
10) What is conditional Independence?
11) Explain Naïve Bayes Classifier with an Example.
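For question 11, a minimal categorical Naive Bayes classifier with Laplace smoothing; the weather-style data and attribute values are illustrative.

```python
from collections import Counter, defaultdict

def train_nb(rows, labels):
    """Count class frequencies and per-class attribute-value frequencies."""
    class_counts = Counter(labels)
    value_counts = defaultdict(Counter)   # (class, attr_index) -> value counts
    values = defaultdict(set)             # attr_index -> set of observed values
    for row, c in zip(rows, labels):
        for i, v in enumerate(row):
            value_counts[(c, i)][v] += 1
            values[i].add(v)
    return class_counts, value_counts, values

def predict_nb(model, row):
    """Pick the class maximizing P(c) * product of Laplace-smoothed P(v | c)."""
    class_counts, value_counts, values = model
    n = sum(class_counts.values())
    best, best_p = None, -1.0
    for c, cc in class_counts.items():
        p = cc / n                        # prior P(c)
        for i, v in enumerate(row):       # naive independence: multiply likelihoods
            p *= (value_counts[(c, i)][v] + 1) / (cc + len(values[i]))
        if p > best_p:
            best, best_p = c, p
    return best

# toy weather data (illustrative): (Outlook, Wind) -> Play
rows = [("Sunny", "Weak"), ("Sunny", "Strong"), ("Rain", "Weak"),
        ("Rain", "Strong"), ("Overcast", "Weak"), ("Overcast", "Strong")]
labels = ["No", "No", "Yes", "No", "Yes", "Yes"]
model = train_nb(rows, labels)
print(predict_nb(model, ("Rain", "Weak")))  # Yes
```

The classifier multiplies the class prior by one smoothed likelihood per attribute, relying on the conditional-independence assumption asked about in question 10.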
12) Describe the concept of MDL.
13) What are Consistent Learners?
14) Discuss Maximum Likelihood and Least Square Error Hypothesis.
15) Describe Maximum Likelihood Hypothesis for predicting probabilities.
16) Explain the Gradient Search to Maximize Likelihood in a neural Net.
Module 5 Questions.
14. Explain the Central Limit Theorem with an example.
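Question 14 can be illustrated empirically: sample means of n uniform(0, 1) draws concentrate around the population mean 0.5 with spread σ/√n, and their distribution approaches a normal curve regardless of the uniform shape of the individual draws. The sample size, trial count, and seed below are arbitrary choices.

```python
import random
import statistics

# Each trial averages n uniform(0, 1) draws; by the CLT the trial means are
# approximately normal with mean 0.5 and std (1/sqrt(12))/sqrt(n) ≈ 0.053.
rng = random.Random(42)
n, trials = 30, 2000
means = [statistics.mean(rng.random() for _ in range(n)) for _ in range(trials)]

print(round(statistics.mean(means), 2))   # close to 0.5
print(round(statistics.stdev(means), 2))  # close to 0.05
```

Increasing n tightens the spread as 1/√n and makes the histogram of `means` look more Gaussian, which is exactly the content of the theorem.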
15. Write the procedure for estimating the difference in error between two learning methods,
and give approximate confidence intervals for this estimate.