Binary Logistic Regression

Binary logistic regression is a regression model used when the target variable is binary (can only take two values, like 0 or 1) and is predicted by one or more independent variables that can be continuous or categorical. It assumes the dependent variable is dichotomous, there is a linear relationship between continuous independent variables and the logit of the dependent variable, and an absence of multicollinearity among predictors. Key differences from linear regression include logistic regression not requiring a linear relationship between dependent and independent variables or normality of residuals.


Binary Logistic Regression


• Binary logistic regression (LR) is a regression model where the target variable is binary, that is, it can take only two values, 0 or 1.
• The dependent variable is predicted from one or more independent variables that can be either continuous or categorical (see the sketch below).
• If the dependent variable has more than two categories, multinomial logistic regression is used instead.
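
A minimal sketch (not part of the original slides) of how the model turns a linear combination of predictors into a probability for the binary outcome; the coefficient values below are made up purely for illustration:

```python
import numpy as np

def predict_probability(x, intercept, coefficients):
    """Return P(y = 1 | x) under a binary logistic regression model."""
    logit = intercept + np.dot(coefficients, x)   # linear combination of predictors (the log-odds)
    return 1.0 / (1.0 + np.exp(-logit))           # logistic (sigmoid) link maps it into (0, 1)

# Two predictors with made-up coefficients:
p = predict_probability(x=np.array([2.5, 1.0]),
                        intercept=-1.2,
                        coefficients=np.array([0.8, -0.4]))
print(p)  # a probability between 0 and 1; classify as 1 if it exceeds a chosen cut-off (e.g. 0.5)
```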
Binary Logistic Regression
For example:
Drug use can be predicted based on prior criminal
convictions, drug use amongst friends, income, age and
gender (i.e., where the dependent variable is "drug use",
measured on a dichotomous scale – "yes" or "no" – and you
have five independent variables: "prior criminal
convictions", "drug use amongst friends", "income", "age"
and "gender").
Assumptions
1. Your dependent variable should be measured on a dichotomous scale. Examples of dichotomous variables include gender (two groups: male and female) and presence of heart disease (two groups: yes or no).
• However, if your dependent variable was not measured on a dichotomous scale, but on a continuous scale instead, you will need to carry out multiple regression.
Assumptions
2. You have one or more independent variables, which can be either continuous (i.e., an interval or ratio variable) or categorical (i.e., an ordinal or nominal variable); see the coding sketch below.
• Examples of continuous variables include revision time (measured in hours), intelligence (measured using IQ score), exam performance (measured from 0 to 100), weight (measured in kg), and so forth.
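
A small sketch, assuming a hypothetical nominal predictor called "education", of how a categorical independent variable is typically dummy-coded before it enters the model alongside continuous predictors:

```python
import pandas as pd

# Hypothetical nominal predictor with three categories.
education = pd.Series(["school", "college", "university", "college"], name="education")

# One indicator column per non-reference category ("college" becomes the reference here).
dummies = pd.get_dummies(education, prefix="education", drop_first=True)
print(dummies)
```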
Assumptions

3. Examples of continuous variables include revision time (measured in hours), intelligence (measured using IQ score), exam performance (measured from 0 to 100), weight (measured in kg), and so forth.
Assumptions

4. There needs to be a linear relationship between any continuous independent variables and the logit transformation of the dependent variable.
• Multicollinearity corresponds to a situation where the data contain highly correlated independent variables. This is a problem because it reduces the precision of the estimated coefficients, which weakens the statistical power of the logistic regression model (a screening sketch follows below).
• There should be an adequate number of observations for each independent variable in the dataset to avoid creating an overfit model.
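
One common way to screen for multicollinearity before fitting is to compute variance inflation factors (VIF). The sketch below uses statsmodels on made-up data; the variable names and the VIF cut-off are only illustrative:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Made-up predictors; the third column is built from the other two, so it is
# deliberately highly correlated with them.
rng = np.random.default_rng(0)
X = pd.DataFrame({
    "age":    rng.normal(30, 8, 200),
    "income": rng.normal(30000, 9000, 200),
})
X["income_per_year_of_age"] = X["income"] / X["age"]

exog = sm.add_constant(X)                 # VIFs are computed with an intercept term included
for i, name in enumerate(exog.columns):
    if name != "const":
        print(name, variance_inflation_factor(exog.values, i))
# Rule of thumb: a VIF well above ~5-10 suggests problematic multicollinearity.
```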
Differences
• Logistic regression does not require a linear relationship between the dependent and independent variables. However, it still needs the independent variables to be linearly related to the log-odds of the outcome (illustrated in the sketch below).
• Homoscedasticity (constant variance) is required in linear regression but not in logistic regression.
• The error terms (residuals) must be normally distributed for linear regression, but this is not required in logistic regression.
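
The sketch below illustrates the first point: for a fitted logistic regression, the log-odds of the predicted probability is an exact linear function of the predictors, even though the probability itself is not. The simulated data are purely illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulate one continuous predictor and a binary outcome.
rng = np.random.default_rng(1)
x = rng.normal(size=(500, 1))
true_p = 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * x[:, 0])))
y = (rng.random(500) < true_p).astype(int)

model = LogisticRegression().fit(x, y)
p = model.predict_proba(x)[:, 1]
log_odds = np.log(p / (1.0 - p))

# The predicted log-odds equal intercept + coefficient * x (up to floating-point error),
# i.e. they are linear in x even though p itself follows an S-shaped curve.
print(np.allclose(log_odds, model.intercept_[0] + model.coef_[0, 0] * x[:, 0]))
```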
Similarities
• Absence of multicollinearity
• Observations are independent of each other