Logistic Regression_ Gradient Descent_ Example
Assume
• We have one data point, with feature 𝑥 = 0.5
• Target label 𝑦 = 1
• Initial weight 𝑤 = 0.2
• Initial bias 𝑏 = 0.1
• Learning rate 𝛼 = 0.1
---------------------------------------------------------------------------------------------------------------------
Step 1: Forward Pass
1 Calculate the linear combination 𝑧:
𝑧 = 𝑤 ⋅ 𝑥 + 𝑏 = 0.2 ⋅ 0.5 + 0.1 = 0.2
2 Apply the sigmoid function 𝜎(𝑧) to get the prediction 𝑦ˆ:
𝑦ˆ = 𝜎(𝑧) = 1/(1 + 𝑒^(−𝑧)) = 1/(1 + 𝑒^(−0.2)) ≈ 0.5498
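The forward pass above, followed by one gradient-descent update, can be sketched in Python. The gradient expressions (𝑦ˆ − 𝑦)·𝑥 and (𝑦ˆ − 𝑦) are the standard BCE gradients for logistic regression; the update step itself is not worked out in the text above, so the updated values here are illustrative.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Given values from the example
x, y = 0.5, 1
w, b = 0.2, 0.1
alpha = 0.1

# Step 1: forward pass
z = w * x + b          # 0.2 * 0.5 + 0.1 = 0.2
y_hat = sigmoid(z)     # 1 / (1 + e^(-0.2)) ≈ 0.5498

# Standard BCE gradients for logistic regression (not shown in the text above)
dw = (y_hat - y) * x
db = (y_hat - y)

# One gradient-descent update with learning rate alpha
w -= alpha * dw
b -= alpha * db

print(round(z, 4), round(y_hat, 4), round(w, 4), round(b, 4))
```

Because 𝑦 = 1 and 𝑦ˆ < 1, the error (𝑦ˆ − 𝑦) is negative, so the update pushes 𝑤 and 𝑏 upward, moving the prediction toward the target.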
Initial Conditions:
• Initial weight 𝑤 = 0.2
• Initial bias 𝑏 = 0.1
• Learning rate 𝛼 = 0.1
Goal:
We'll update the weights for each sample and go through one epoch of training.
---------------------------------------------------------------------------------------------------------------------
Sample 3:
1 Calculate 𝑧 :
𝑧 = 𝑤 ⋅ 𝑥 + 𝑏 = 0.2 ⋅ 2.0 + 0.1 = 0.5
2 Apply the sigmoid to get 𝑦ˆ:
𝑦ˆ = 𝜎(𝑧) = 1/(1 + 𝑒^(−0.5)) ≈ 0.6225
3 Compute the BCE Cost: With 𝑦 = 1 and 𝑦ˆ ≈ 0.6225 :
BCE ≈ −log(0.6225) ≈ 0.4741
Sample 4:
1 Calculate 𝑧 :
𝑧 = 𝑤 ⋅ 𝑥 + 𝑏 = 0.2 ⋅ 3.0 + 0.1 = 0.7
2 Apply the sigmoid to get 𝑦ˆ:
𝑦ˆ = 𝜎(𝑧) = 1/(1 + 𝑒^(−0.7)) ≈ 0.6682
3 Compute the BCE Cost: With 𝑦 = 0 and 𝑦ˆ ≈ 0.6682:
BCE ≈ −log(1 − 0.6682) ≈ 1.1032
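The two per-sample computations above can be reproduced in a short loop. The feature values 𝑥 = 2.0 and 𝑥 = 3.0 and labels 𝑦 = 1 and 𝑦 = 0 are taken from the worked steps for Samples 3 and 4; note the weights stay at their initial values 𝑤 = 0.2, 𝑏 = 0.1 in both forward passes shown.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def bce(y, y_hat):
    # Binary cross-entropy for a single sample:
    # -[y*log(y_hat) + (1-y)*log(1-y_hat)]
    return -(y * math.log(y_hat) + (1 - y) * math.log(1 - y_hat))

w, b = 0.2, 0.1

# (feature, label) pairs for Samples 3 and 4 above
samples = [(2.0, 1), (3.0, 0)]

for x, y in samples:
    z = w * x + b          # linear combination
    y_hat = sigmoid(z)     # prediction
    print(x, round(z, 4), round(y_hat, 4), round(bce(y, y_hat), 4))
```

For Sample 3 this yields 𝑧 = 0.5, 𝑦ˆ ≈ 0.6225, BCE ≈ 0.4741, matching the steps above; for Sample 4, 𝑧 = 0.7 and 𝑦ˆ ≈ 0.6682.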