Unit 2 Regression Analysis


1.

Calculate the regression coefficient and obtain the lines of regression for the following data

Solution:

Regression coefficient of X on Y

(i) Regression equation of X on Y


(ii) Regression coefficient of Y on X

(iii) Regression equation of Y on X

Y = 0.929X – 3.716 + 11

Y = 0.929X + 7.284

The regression equation of Y on X is Y = 0.929X + 7.284.

2.

Calculate the two regression equations of X on Y and Y on X from the data given below, taking
deviations from the actual means of X and Y.

Estimate the likely demand when the price is Rs.20.

Solution:
Calculation of Regression equation

(i) Regression equation of X on Y

(ii) Regression Equation of Y on X


When X is 20, Y will be

= –0.25 (20)+44.25

= –5+44.25

= 39.25 (when the price is Rs. 20, the likely demand is 39.25)
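The arithmetic of the estimate can be checked with a quick sketch; the line Y = –0.25X + 44.25 is taken from the worked solution above:

```python
# Fitted regression line of Y (demand) on X (price), from the solution above.
def demand(price):
    return -0.25 * price + 44.25

print(demand(20))  # 39.25
```
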

3.

Obtain regression equation of Y on X and estimate Y when X=55 from the following

Solution:

(i) Regression coefficients of Y on X


(ii) Regression equation of Y on X

Y – 51.57 = 0.942(X – 48.29)

Y = 0.942X – 45.49 + 51.57

Y = 0.942X + 6.08

The regression equation of Y on X is Y = 0.942X + 6.08.

Estimation of Y when X = 55:

Y = 0.942(55) + 6.08 = 57.89

Example 9.12

Find the means of X and Y variables and the coefficient of correlation between them from the
following two regression equations:

2Y–X–50 = 0

3Y–2X–10 = 0.

Solution:

We are given

2Y–X–50 = 0 ... (1)

3Y–2X–10 = 0 ... (2)

Solving equation (1) and (2)

We get Y = 90
Putting the value of Y in equation (1)

We get X = 130

Calculating correlation coefficient

Let us assume equation (1) is the regression equation of Y on X:

2Y = X + 50, i.e. Y = 0.5X + 25, so byx = 0.5

Then equation (2) is the regression equation of X on Y:

2X = 3Y – 10, i.e. X = 1.5Y – 5, so bxy = 1.5

Since byx × bxy = 0.5 × 1.5 = 0.75 ≤ 1, the assumption is admissible, and

r = √(byx × bxy) = √0.75 = 0.866 (taken positive, since both regression coefficients are positive).
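The arithmetic can be cross-checked with a short Python sketch (a verification aid, not part of the original solution). It solves the two given equations for the means and recomputes r under the same assumption about which equation regresses Y on X:

```python
import numpy as np

# Means: solve 2Y - X - 50 = 0 and 3Y - 2X - 10 = 0 as a linear
# system in (X, Y):  -X + 2Y = 50,  -2X + 3Y = 10.
A = np.array([[-1.0, 2.0],
              [-2.0, 3.0]])
b = np.array([50.0, 10.0])
X_mean, Y_mean = np.linalg.solve(A, b)

# Assume (1) is the regression of Y on X: Y = 0.5X + 25  ->  byx = 0.5,
# so (2) is the regression of X on Y:     X = 1.5Y - 5  ->  bxy = 1.5.
byx, bxy = 0.5, 1.5

# r^2 = byx * bxy; r takes the common sign of the two coefficients.
r = (byx * bxy) ** 0.5
print(X_mean, Y_mean, round(r, 3))  # 130.0 90.0 0.866
```
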

Linear Regression

The linear regression algorithm models a linear relationship between a dependent variable (y) and one or
more independent variables (x), hence the name linear regression. Because the relationship is linear,
the model describes how the value of the dependent variable changes with the value of the
independent variable.

The linear regression model fits a sloped straight line representing the relationship between the
variables. Mathematically, we can represent a linear regression as:

y= a0+a1x+ ε

Here,

y = dependent variable (target variable)
x = independent variable (predictor variable)
a0 = intercept of the line (gives an additional degree of freedom)
a1 = linear regression coefficient (scale factor applied to each input value)
ε = random error

The x and y values from the training dataset are used to fit the linear regression model.
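The fit of a0 and a1 can be sketched with ordinary least squares in NumPy. The data here are synthetic, generated only for illustration (true values a0 = 4, a1 = 2.5 are an assumption of the example, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data from y = a0 + a1*x + noise, with a0 = 4, a1 = 2.5.
x = rng.uniform(0, 10, size=100)
y = 4.0 + 2.5 * x + rng.normal(0, 0.5, size=100)

# Least-squares fit of [a0, a1] using the design matrix [1, x].
X = np.column_stack([np.ones_like(x), x])
a0, a1 = np.linalg.lstsq(X, y, rcond=None)[0]
print(round(a0, 2), round(a1, 2))
```

The recovered intercept and slope land close to the true 4 and 2.5; the gap shrinks as the noise ε shrinks or the sample grows.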

Types of Linear Regression

Linear regression can be further divided into two types of the algorithm:

o Simple Linear Regression:


If a single independent variable is used to predict the value of a numerical dependent
variable, then such a Linear Regression algorithm is called Simple Linear Regression.

o Multiple Linear regression:


If more than one independent variable is used to predict the value of a numerical dependent
variable, then such a Linear Regression algorithm is called Multiple Linear Regression.

Linear Regression Line


A straight line showing the relationship between the dependent and independent variables is called
a regression line. A regression line can show two types of relationship:

o Positive Linear Relationship:


If the dependent variable increases as the independent variable increases on the X-axis, then
the relationship is termed a positive linear relationship.

o Negative Linear Relationship:


If the dependent variable decreases as the independent variable increases on the X-axis, then
the relationship is called a negative linear relationship.

Implementation of Simple Linear Regression Algorithm using Python

Step-1: Data Pre-processing

import numpy as nm
import matplotlib.pyplot as mtp
import pandas as pd

o Next, we will load the dataset into our code:

data_set= pd.read_csv('Salary_Data.csv')
o After that, we need to extract the dependent and independent variables from the given
dataset. The independent variable is years of experience, and the dependent variable is
salary. Below is code for it:

x= data_set.iloc[:, :-1].values
y= data_set.iloc[:, 1].values

This extracts the x (independent) and y (dependent) variables from the given dataset.

o Next, we will split both variables into the test set and training set. We have 30 observations,
so we will take 20 observations for the training set and 10 observations for the test set. We
are splitting our dataset so that we can train our model using a training dataset and then test
the model using a test dataset. The code for this is given below:

# Splitting the dataset into training and test set.
from sklearn.model_selection import train_test_split
x_train, x_test, y_train, y_test= train_test_split(x, y, test_size= 1/3, random_state=0)

By executing the above code, we obtain the x_train, x_test, y_train and y_test arrays.

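Since Salary_Data.csv is not included with this text, the split can be demonstrated on a synthetic stand-in of 30 observations; with test_size = 1/3, the split gives the 20 training and 10 test observations described above:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Stand-in for Salary_Data.csv: 30 synthetic observations (values assumed).
x = np.arange(30).reshape(-1, 1).astype(float)   # years of experience
y = 25000.0 + 9000.0 * x.ravel()                 # salary

x_train, x_test, y_train, y_test = train_test_split(
    x, y, test_size=1/3, random_state=0)

print(len(x_train), len(x_test))  # 20 10
```
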
Step-2: Fitting the Simple Linear Regression to the Training Set:

Now the second step is to fit our model to the training dataset. To do so, we will import
the LinearRegression class of the linear_model library from scikit-learn. After importing the class,
we will create an object of the class named regressor. The code for this is given below:

#Fitting the Simple Linear Regression model to the training dataset
from sklearn.linear_model import LinearRegression
regressor= LinearRegression()
regressor.fit(x_train, y_train)

Output:

Out[7]: LinearRegression(copy_X=True, fit_intercept=True, n_jobs=None, normalize=False)

Step-3: Prediction of test set result:

Our model is now fitted to the dependent variable (salary) and the independent variable (experience).
So the model is ready to predict the output for new observations. In this step, we will provide the
test dataset (new observations) to the model to check whether it can predict the correct output or not.

We will create two prediction vectors, y_pred and x_pred, which will contain the predictions for the
test dataset and the training dataset respectively.

#Prediction of Test and Training set result
y_pred= regressor.predict(x_test)
x_pred= regressor.predict(x_train)

Output:

You can inspect the variables through the variable explorer option in the IDE, and compare the
results by matching values from y_pred against y_test. By comparing these values, we can check
how well our model is performing.
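Beyond eyeballing y_pred against y_test, scikit-learn's metrics quantify the fit. Below is a self-contained sketch using synthetic salary data (assumed, since the CSV is not included) with the standard r2_score and mean_absolute_error functions:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_absolute_error

# Synthetic salary-vs-experience data as a stand-in for Salary_Data.csv.
rng = np.random.default_rng(0)
x = rng.uniform(1, 10, size=30).reshape(-1, 1)
y = 25000 + 9000 * x.ravel() + rng.normal(0, 2000, size=30)

x_train, x_test, y_train, y_test = train_test_split(
    x, y, test_size=1/3, random_state=0)

regressor = LinearRegression().fit(x_train, y_train)
y_pred = regressor.predict(x_test)

# R^2 near 1 and a small MAE (relative to the salary scale) indicate a good fit.
print("R^2:", r2_score(y_test, y_pred))
print("MAE:", mean_absolute_error(y_test, y_pred))
```
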

Step-4: Visualizing the Training set results:

mtp.scatter(x_train, y_train, color="green")
mtp.plot(x_train, x_pred, color="red")
mtp.title("Salary vs Experience (Training Dataset)")
mtp.xlabel("Years of Experience")
mtp.ylabel("Salary(In Rupees)")
mtp.show()

Output:

By executing the above lines of code, we get a scatter plot of the training data together with the fitted regression line.

Step-5: Visualizing the Test set results:

In the previous step, we visualized the performance of our model on the training set. Now we
will do the same for the test set. The code remains the same as above, except that we use
x_test and y_test instead of x_train and y_train.

Here we are also changing the color of observations and regression line to differentiate between the
two plots, but it is optional.

#visualizing the Test set results
mtp.scatter(x_test, y_test, color="blue")
mtp.plot(x_train, x_pred, color="red")
mtp.title("Salary vs Experience (Test Dataset)")
mtp.xlabel("Years of Experience")
mtp.ylabel("Salary(In Rupees)")
mtp.show()

Output:
By executing the above lines of code, we get the corresponding plot for the test set.

Multiple Linear Regression

Example: Multiple Linear Regression by Hand

Suppose we have the following dataset with one response variable y and two predictor variables
X1 and X2:

Use the following steps to fit a multiple linear regression model to this dataset.

Step 1: Calculate X1², X2², X1y, X2y and X1X2.


Step 2: Calculate Regression Sums.

Next, make the following regression sum calculations:

 Σx1² = ΣX1² – (ΣX1)² / n = 38,767 – (555)² / 8 = 263.875

 Σx2² = ΣX2² – (ΣX2)² / n = 2,823 – (145)² / 8 = 194.875

 Σx1y = ΣX1y – (ΣX1)(Σy) / n = 101,895 – (555 × 1,452) / 8 = 1,162.5

 Σx2y = ΣX2y – (ΣX2)(Σy) / n = 25,364 – (145 × 1,452) / 8 = –953.5

 Σx1x2 = ΣX1X2 – (ΣX1)(ΣX2) / n = 9,859 – (555 × 145) / 8 = –200.375

Step 3: Calculate b0, b1, and b2.

The formula to calculate b1 is: [(Σx2²)(Σx1y) – (Σx1x2)(Σx2y)] / [(Σx1²)(Σx2²) – (Σx1x2)²]

Thus, b1 = [(194.875)(1,162.5) – (–200.375)(–953.5)] / [(263.875)(194.875) – (–200.375)²] = 3.148

The formula to calculate b2 is: [(Σx1²)(Σx2y) – (Σx1x2)(Σx1y)] / [(Σx1²)(Σx2²) – (Σx1x2)²]

Thus, b2 = [(263.875)(–953.5) – (–200.375)(1,162.5)] / [(263.875)(194.875) – (–200.375)²] = –1.656

The formula to calculate b0 is: b0 = ȳ – b1x̄1 – b2x̄2

Thus, b0 = 181.5 – 3.148(69.375) – (–1.656)(18.125) = –6.867


Step 4: Place b0, b1, and b2 in the estimated linear regression equation.

The estimated linear regression equation is: ŷ = b0 + b1*x1 + b2*x2

In our example, it is ŷ = -6.867 + 3.148x1 – 1.656x2
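The hand calculation can be double-checked in Python from the regression sums of Step 2 and the sample means ΣX1/n = 69.375, ΣX2/n = 18.125, Σy/n = 181.5 used above. This is a verification sketch, not part of the worked example; small differences in the last decimal are rounding:

```python
# Regression sums from Step 2 and the sample means used in Step 3.
Sx1x1, Sx2x2 = 263.875, 194.875
Sx1y, Sx2y, Sx1x2 = 1162.5, -953.5, -200.375
x1_bar, x2_bar, y_bar = 69.375, 18.125, 181.5

# Normal-equation solution for two predictors.
den = Sx1x1 * Sx2x2 - Sx1x2 ** 2
b1 = (Sx2x2 * Sx1y - Sx1x2 * Sx2y) / den
b2 = (Sx1x1 * Sx2y - Sx1x2 * Sx1y) / den
b0 = y_bar - b1 * x1_bar - b2 * x2_bar

print(round(b1, 3), round(b2, 3), round(b0, 3))
```
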

How to Interpret a Multiple Linear Regression Equation

Here is how to interpret this estimated linear regression equation: ŷ = -6.867 + 3.148x1 – 1.656x2

b0 = -6.867. When both predictor variables are equal to zero, the mean value for y is -6.867.

b1 = 3.148. A one unit increase in x1 is associated with a 3.148 unit increase in y, on average, assuming
x2 is held constant.

b2 = -1.656. A one unit increase in x2 is associated with a 1.656 unit decrease in y, on average,
assuming x1 is held constant.
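To connect the interpretation back to the equation, here is a minimal sketch that scores a new observation with the fitted coefficients; the input values x1 = 70 and x2 = 18 are hypothetical, chosen only for illustration:

```python
# Estimated equation from the example: y_hat = -6.867 + 3.148*x1 - 1.656*x2
def predict(x1, x2):
    return -6.867 + 3.148 * x1 - 1.656 * x2

# Hypothetical new observation (x1 = 70, x2 = 18, assumed for illustration).
y_hat = predict(70, 18)
print(round(y_hat, 3))  # 183.685
```
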
