
EE5180 : Introduction to Machine Learning

Practice Problems
Topics: Linear Regression

1. Consider the system of equations in $x \in \mathbb{R}^m$:

$$A^T A x = A^T b$$

where $A \in \mathbb{R}^{n \times m}$, $b \in \mathbb{R}^n$. Show that there always exists a solution for the system of equations above.
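Though the problem asks for a proof, a quick numerical check can build intuition. This is a minimal sketch assuming NumPy; the rank-deficient $A$ is constructed deliberately so that $A^T A$ is singular, yet the normal equations remain consistent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a deliberately rank-deficient A (n x m) by repeating a column.
n, m = 8, 4
A = rng.standard_normal((n, m))
A[:, -1] = A[:, 0]          # rank(A) < m, so A^T A is singular
b = rng.standard_normal(n)

# lstsq minimizes ||Ax - b||; any minimizer satisfies A^T A x = A^T b.
x, *_ = np.linalg.lstsq(A, b, rcond=None)

# The normal equations hold even though A^T A is not invertible.
print(np.allclose(A.T @ A @ x, A.T @ b))   # True
```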

2. Let $B = A^T A + \lambda I$ for some $\lambda > 0$. Show that $B$ is invertible.
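The key fact is that $A^T A$ is positive semi-definite, so adding $\lambda I$ shifts every eigenvalue up to at least $\lambda$. A small numerical illustration (a sketch assuming NumPy; the zeroed column just forces $A^T A$ to be singular):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
A[:, 0] = 0.0                      # make A^T A singular on purpose
lam = 0.1

B = A.T @ A + lam * np.eye(4)
eigvals = np.linalg.eigvalsh(B)    # B is symmetric, so eigvalsh applies

print(eigvals.min() >= lam - 1e-12)    # True: every eigenvalue >= lam
print(np.isfinite(np.linalg.cond(B))) # True: B is invertible
```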

3. Let $x^*$ be some solution to

$$A^T A x = A^T b$$

Prove that $Ax^*$ is the projection of $b$ onto the column space of $A$. (Hint: Consider the vector joining $Ax^*$ and $b$, i.e. $b - Ax^*$, and show that it is orthogonal to the column space of $A$.)
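The orthogonality in the hint is easy to check numerically. A sketch assuming NumPy (random $A$ and $b$ for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((7, 3))
b = rng.standard_normal(7)

# Any least-squares solution satisfies the normal equations.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
residual = b - A @ x_star

# A^T (b - A x*) = A^T b - A^T A x* = 0: the residual is orthogonal
# to every column of A, so A x* is the projection of b onto col(A).
print(np.allclose(A.T @ residual, 0.0))    # True
```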

4. Find the line $y = wx + b$ that minimizes the squared error for the following 1-D regression problem:

x y
1 1
2 2
4 3
5 4
6 4

(Hint: Consider the least-squares solution with the feature vector $\phi : \mathbb{R} \to \mathbb{R}^2$ given as $\phi(x) = [x, 1]$.)
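The hinted computation can be carried out numerically. A sketch assuming NumPy; it builds the design matrix from $\phi(x) = [x, 1]$ and solves the normal equations for the data above:

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 5.0, 6.0])
y = np.array([1.0, 2.0, 3.0, 4.0, 4.0])

# Design matrix whose rows are phi(x_n) = [x_n, 1].
Phi = np.column_stack([x, np.ones_like(x)])

# Solve the normal equations Phi^T Phi [w, b]^T = Phi^T y.
w, b = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)
print(f"y = {w:.4f} x + {b:.4f}")
```

For this data set the fit works out to approximately $w \approx 0.616$, $b \approx 0.581$.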

5. Let $\phi : \mathbb{R}^d \to \mathbb{R}^D$ be the feature mapping for performing $k$-degree polynomial regression on a $d$-dimensional input, where $D$ is the number of resulting features. For example, if $k = 2$, $d = 2$, then $\phi([x_1, x_2]) = [1, x_1, x_2, x_1^2, x_2^2, x_1 x_2]$.

• Give ϕ for k = 2, d = 3
• Give ϕ for k = 3, d = 2
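One way to enumerate the required monomials programmatically, as a sketch assuming NumPy and itertools (the ordering of the features is a choice, not mandated by the problem):

```python
import itertools
import numpy as np

def poly_features(x, k):
    """All monomials of degree <= k in the entries of x (including the constant 1)."""
    feats = []
    for degree in range(k + 1):
        # Each combination with replacement picks which coordinates to multiply.
        for idx in itertools.combinations_with_replacement(range(len(x)), degree):
            feats.append(np.prod([x[i] for i in idx]))  # empty product = 1
    return np.array(feats)

# k = 2, d = 2 reproduces the six features from the example above.
print(poly_features(np.array([2.0, 3.0]), k=2))
# [1, x1, x2, x1^2, x1*x2, x2^2] -> [1. 2. 3. 4. 6. 9.]
```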

6. Consider a data set in which each data point $y_n$ is associated with a weighting factor $r_n > 0$, so that the sum-of-squares error function becomes

$$E_D(w) = \frac{1}{2} \sum_{n=1}^{N} r_n \{y_n - w^T \phi(x_n)\}^2 \tag{1}$$

Find an expression for the solution $w^*$ that minimizes this error function.
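A candidate solution can be sanity-checked numerically. This sketch assumes NumPy and uses the standard weighted least-squares form, with $R = \mathrm{diag}(r_1, \dots, r_N)$ and $\Phi$ the design matrix whose rows are $\phi(x_n)^T$; random data is used purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
N, d = 20, 3
Phi = rng.standard_normal((N, d))      # rows are phi(x_n)^T
y = rng.standard_normal(N)
r = rng.uniform(0.5, 2.0, size=N)      # positive weights r_n
R = np.diag(r)

# Candidate: w* = (Phi^T R Phi)^{-1} Phi^T R y  (weighted least squares).
w_star = np.linalg.solve(Phi.T @ R @ Phi, Phi.T @ R @ y)

# The gradient of E_D at w* should vanish: Phi^T R (Phi w* - y) = 0.
grad = Phi.T @ R @ (Phi @ w_star - y)
print(np.allclose(grad, 0.0))          # True
```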

7. Let $y = Xw + n$, where $n$ is Gaussian noise $\mathcal{N}(0, \sigma^2 I)$.

We know that the least-squares solution is $w_{ML} = (X^T X)^{-1} X^T y$.

• Show that $\mathbb{E}(w_{ML}) = w$

• Show that $\mathrm{cov}(w_{ML}) = \sigma^2 (X^T X)^{-1}$
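Both properties can be sanity-checked by Monte Carlo simulation before attempting the proof. A sketch assuming NumPy; the design $X$, true $w$, trial count, and tolerances are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(4)
n, d, sigma = 50, 3, 0.5
X = rng.standard_normal((n, d))        # fixed design across trials
w_true = rng.standard_normal(d)

# Draw many noisy data sets and refit w_ML each time.
trials = 20000
W = np.empty((trials, d))
XtX_inv_Xt = np.linalg.solve(X.T @ X, X.T)   # (X^T X)^{-1} X^T
for t in range(trials):
    y = X @ w_true + sigma * rng.standard_normal(n)
    W[t] = XtX_inv_Xt @ y

# Empirical mean and covariance should match w and sigma^2 (X^T X)^{-1}.
print(np.allclose(W.mean(axis=0), w_true, atol=0.02))
print(np.allclose(np.cov(W.T), sigma**2 * np.linalg.inv(X.T @ X), atol=0.02))
```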
