Statistics For Management and Economics, Sixth Edition: Formulas
Formulas
Population mean
$\mu = \frac{\sum_{i=1}^{N} x_i}{N}$
Sample mean
$\bar{x} = \frac{\sum_{i=1}^{n} x_i}{n}$
Range
Range = Largest observation − Smallest observation
Population variance
$\sigma^2 = \frac{\sum_{i=1}^{N} (x_i - \mu)^2}{N}$
Sample variance
$s^2 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})^2}{n-1}$
Population standard deviation
$\sigma = \sqrt{\sigma^2}$
Sample standard deviation
$s = \sqrt{s^2}$
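The sample mean, variance, and standard deviation formulas above can be checked with a short Python sketch; the data values below are illustrative, and the manual computation is verified against the standard library's `statistics` module.

```python
# Sample mean, sample variance (n - 1 divisor), and sample standard
# deviation, computed directly from the formulas; data are made up.
import statistics

data = [4, 7, 5, 9, 10]

n = len(data)
x_bar = sum(data) / n                                # sample mean
s2 = sum((x - x_bar) ** 2 for x in data) / (n - 1)   # sample variance
s = s2 ** 0.5                                        # sample standard deviation

# the statistics module agrees with the manual formulas
assert abs(x_bar - statistics.mean(data)) < 1e-12
assert abs(s2 - statistics.variance(data)) < 1e-12
```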
Population covariance
$\mathrm{COV}(X,Y) = \frac{\sum_{i=1}^{N} (x_i - \mu_x)(y_i - \mu_y)}{N}$
Sample covariance
$\mathrm{cov}(x,y) = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{n-1}$
Population coefficient of correlation
$\rho = \frac{\mathrm{COV}(X,Y)}{\sigma_x \sigma_y}$
Sample coefficient of correlation
$r = \frac{\mathrm{cov}(x,y)}{s_x s_y}$
Least squares line
$b_1 = \frac{\mathrm{cov}(x,y)}{s_x^2}$
$b_0 = \bar{y} - b_1 \bar{x}$
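A minimal sketch of the sample covariance, correlation, and least squares line formulas above; the x and y values are invented for illustration.

```python
# Sample covariance, correlation coefficient, and least squares
# slope/intercept computed from the formulas; data are made up.
import statistics

x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]

n = len(x)
x_bar, y_bar = statistics.mean(x), statistics.mean(y)

cov_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / (n - 1)
s2_x = statistics.variance(x)

b1 = cov_xy / s2_x          # sample slope
b0 = y_bar - b1 * x_bar     # sample y-intercept
r = cov_xy / (statistics.stdev(x) * statistics.stdev(y))  # correlation
```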
Probability
Conditional probability
$P(A \mid B) = \frac{P(A \text{ and } B)}{P(B)}$
Complement rule
$P(A^C) = 1 - P(A)$
Multiplication rule
$P(A \text{ and } B) = P(A \mid B)\,P(B)$
Addition rule
$P(A \text{ or } B) = P(A) + P(B) - P(A \text{ and } B)$
Expected value
$E(X) = \mu = \sum_{\text{all } x} x\,p(x)$
Variance
$V(X) = \sigma^2 = \sum_{\text{all } x} (x - \mu)^2\,p(x)$
Standard deviation
$\sigma = \sqrt{\sigma^2}$
Covariance
$\mathrm{COV}(X,Y) = \sum (x - \mu_x)(y - \mu_y)\,p(x,y)$
Coefficient of correlation
$\rho = \frac{\mathrm{COV}(X,Y)}{\sigma_x \sigma_y}$
Laws of expected value
1. $E(c) = c$
2. $E(X + c) = E(X) + c$
3. $E(cX) = cE(X)$
Laws of variance
1. $V(c) = 0$
2. $V(X + c) = V(X)$
3. $V(cX) = c^2 V(X)$
Laws of expected value and variance of the sum of variables
1. $E\left(\sum_{i=1}^{k} X_i\right) = \sum_{i=1}^{k} E(X_i)$
2. $V\left(\sum_{i=1}^{k} X_i\right) = \sum_{i=1}^{k} V(X_i)$ if the variables are independent
Variance of a portfolio of two stocks
$V(R_p) = w_1^2 \sigma_1^2 + w_2^2 \sigma_2^2 + 2 w_1 w_2 \rho \sigma_1 \sigma_2$
Mean and variance of a portfolio of k stocks
$E(R_p) = \sum_{i=1}^{k} w_i E(R_i)$
$V(R_p) = \sum_{i=1}^{k} w_i^2 \sigma_i^2 + 2 \sum_{i=1}^{k} \sum_{j=i+1}^{k} w_i w_j\,\mathrm{COV}(R_i, R_j)$
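The k-stock portfolio formulas above can be sketched in a few lines of Python; the weights, expected returns, and covariance matrix below are invented values, not data from the text.

```python
# Portfolio mean E(R_p) and variance V(R_p) for k = 3 stocks, using the
# formulas above; all numbers are illustrative.
w = [0.5, 0.3, 0.2]                 # portfolio weights (sum to 1)
er = [0.08, 0.10, 0.12]            # expected returns E(R_i)
cov = [[0.04, 0.01, 0.00],         # COV(R_i, R_j); diagonal = variances
       [0.01, 0.09, 0.02],
       [0.00, 0.02, 0.16]]

k = len(w)
e_rp = sum(w[i] * er[i] for i in range(k))  # E(R_p)

# V(R_p) = sum of w_i^2 sigma_i^2 plus twice the weighted covariances
v_rp = sum(w[i] ** 2 * cov[i][i] for i in range(k))
v_rp += 2 * sum(w[i] * w[j] * cov[i][j]
                for i in range(k) for j in range(i + 1, k))
```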
Binomial probability
$P(X = x) = \frac{n!}{x!(n-x)!}\,p^x (1-p)^{n-x}$
$\mu = np$
$\sigma^2 = np(1-p)$
$\sigma = \sqrt{np(1-p)}$
Poisson probability
$P(X = x) = \frac{e^{-\mu} \mu^x}{x!}$
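Both probability mass functions above translate directly into Python's standard library; the parameter values below are illustrative.

```python
# Binomial and Poisson probabilities from the formulas above.
import math

def binomial_pmf(x, n, p):
    # n! / (x!(n - x)!) * p^x * (1 - p)^(n - x); math.comb is the
    # binomial coefficient n! / (x!(n - x)!)
    return math.comb(n, x) * p ** x * (1 - p) ** (n - x)

def poisson_pmf(x, mu):
    # e^(-mu) * mu^x / x!
    return math.exp(-mu) * mu ** x / math.factorial(x)

p_binom = binomial_pmf(3, 10, 0.5)   # P(X = 3) for Binomial(n=10, p=0.5)
p_pois = poisson_pmf(2, 1.5)         # P(X = 2) for Poisson(mu = 1.5)
```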
Continuous Probability Distributions
Sampling distribution of the sample mean
$E(\bar{X}) = \mu_{\bar{x}} = \mu$
$V(\bar{X}) = \sigma_{\bar{x}}^2 = \frac{\sigma^2}{n}$
$\sigma_{\bar{x}} = \frac{\sigma}{\sqrt{n}}$
$Z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}}$
Sampling distribution of the sample proportion
$E(\hat{P}) = \mu_{\hat{p}} = p$
$V(\hat{P}) = \sigma_{\hat{p}}^2 = \frac{p(1-p)}{n}$
$\sigma_{\hat{p}} = \sqrt{\frac{p(1-p)}{n}}$
$Z = \frac{\hat{P} - p}{\sqrt{p(1-p)/n}}$
Sampling distribution of the difference between two means
$E(\bar{X}_1 - \bar{X}_2) = \mu_{\bar{x}_1 - \bar{x}_2} = \mu_1 - \mu_2$
$V(\bar{X}_1 - \bar{X}_2) = \sigma_{\bar{x}_1 - \bar{x}_2}^2 = \frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}$
Standard error of the difference between two means
$\sigma_{\bar{x}_1 - \bar{x}_2} = \sqrt{\frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}}$
$Z = \frac{(\bar{X}_1 - \bar{X}_2) - (\mu_1 - \mu_2)}{\sqrt{\frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}}}$
Introduction to Estimation
Confidence interval estimator of μ (σ known)
$\bar{x} \pm z_{\alpha/2}\,\frac{\sigma}{\sqrt{n}}$
Sample size to estimate μ
$n = \left(\frac{z_{\alpha/2}\,\sigma}{W}\right)^2$
Test statistic for μ (σ known)
$z = \frac{\bar{x} - \mu}{\sigma/\sqrt{n}}$
t-test of μ
$t = \frac{\bar{x} - \mu}{s/\sqrt{n}}$
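As a worked instance of the interval estimator of μ with known σ, the sketch below uses invented sample values and 1.96 as the familiar $z_{\alpha/2}$ for 95% confidence.

```python
# 95% confidence interval for mu with sigma known:
# x_bar +/- z_{alpha/2} * sigma / sqrt(n); all inputs are illustrative.
x_bar = 100.0    # sample mean
sigma = 15.0     # known population standard deviation
n = 36
z_half = 1.96    # z_{alpha/2} for alpha = 0.05

margin = z_half * sigma / n ** 0.5
lcl, ucl = x_bar - margin, x_bar + margin
```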
Interval estimator of μ
$\bar{x} \pm t_{\alpha/2}\,\frac{s}{\sqrt{n}}$
Test statistic for σ²
$\chi^2 = \frac{(n-1)s^2}{\sigma^2}$
Interval estimator of σ²
$LCL = \frac{(n-1)s^2}{\chi^2_{\alpha/2}}$
$UCL = \frac{(n-1)s^2}{\chi^2_{1-\alpha/2}}$
z-test of p
$z = \frac{\hat{p} - p}{\sqrt{p(1-p)/n}}$
Interval estimator of p
$\hat{p} \pm z_{\alpha/2}\sqrt{\hat{p}(1-\hat{p})/n}$
Sample size to estimate p
$n = \left(\frac{z_{\alpha/2}\sqrt{\hat{p}(1-\hat{p})}}{W}\right)^2$
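The sample-size formula for estimating p is easy to evaluate; in the sketch below the planning value $\hat{p} = 0.5$ (the conservative choice) and the half-width W are illustrative inputs.

```python
# Sample size needed to estimate p within W at 95% confidence,
# rounded up to the next integer; inputs are illustrative.
import math

p_hat = 0.5      # planning value (0.5 maximizes p_hat * (1 - p_hat))
W = 0.03         # desired bound on the error of estimation
z_half = 1.96    # z_{alpha/2} for 95% confidence

n = math.ceil((z_half * math.sqrt(p_hat * (1 - p_hat)) / W) ** 2)
```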
Equal-variances t-test of μ1 − μ2
$t = \frac{(\bar{x}_1 - \bar{x}_2) - (\mu_1 - \mu_2)}{\sqrt{s_p^2\left(\frac{1}{n_1} + \frac{1}{n_2}\right)}} \qquad \nu = n_1 + n_2 - 2$
Equal-variances interval estimator of μ1 − μ2
$(\bar{x}_1 - \bar{x}_2) \pm t_{\alpha/2}\sqrt{s_p^2\left(\frac{1}{n_1} + \frac{1}{n_2}\right)} \qquad \nu = n_1 + n_2 - 2$
where $s_p^2 = \frac{(n_1-1)s_1^2 + (n_2-1)s_2^2}{n_1+n_2-2}$ is the pooled variance estimator
Unequal-variances t-test of μ1 − μ2
$t = \frac{(\bar{x}_1 - \bar{x}_2) - (\mu_1 - \mu_2)}{\sqrt{\frac{s_1^2}{n_1} + \frac{s_2^2}{n_2}}} \qquad \nu = \frac{(s_1^2/n_1 + s_2^2/n_2)^2}{\frac{(s_1^2/n_1)^2}{n_1-1} + \frac{(s_2^2/n_2)^2}{n_2-1}}$
Unequal-variances interval estimator of μ1 − μ2
$(\bar{x}_1 - \bar{x}_2) \pm t_{\alpha/2}\sqrt{\frac{s_1^2}{n_1} + \frac{s_2^2}{n_2}}$ with the same $\nu$
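The equal-variances t statistic can be computed step by step; the two samples below are invented, and the test is of $H_0: \mu_1 - \mu_2 = 0$.

```python
# Equal-variances t-test of mu1 - mu2: pooled variance, test statistic,
# and degrees of freedom nu; sample data are made up.
import statistics

sample1 = [12, 15, 11, 14, 13]
sample2 = [10, 9, 12, 11]

n1, n2 = len(sample1), len(sample2)
x1, x2 = statistics.mean(sample1), statistics.mean(sample2)
s2_1, s2_2 = statistics.variance(sample1), statistics.variance(sample2)

# pooled variance estimator
sp2 = ((n1 - 1) * s2_1 + (n2 - 1) * s2_2) / (n1 + n2 - 2)

# test statistic under H0: mu1 - mu2 = 0
t = (x1 - x2) / (sp2 * (1 / n1 + 1 / n2)) ** 0.5
nu = n1 + n2 - 2
```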
t-Test of μD
$t = \frac{\bar{x}_D - \mu_D}{s_D/\sqrt{n_D}} \qquad \nu = n_D - 1$
t-Estimator of μD
$\bar{x}_D \pm t_{\alpha/2}\,\frac{s_D}{\sqrt{n_D}} \qquad \nu = n_D - 1$
F-test of σ1²/σ2²
$F = \frac{s_1^2}{s_2^2} \qquad \nu_1 = n_1 - 1 \text{ and } \nu_2 = n_2 - 1$
F-Estimator of σ1²/σ2²
$LCL = \frac{s_1^2}{s_2^2}\,\frac{1}{F_{\alpha/2,\nu_1,\nu_2}}$
$UCL = \frac{s_1^2}{s_2^2}\,F_{\alpha/2,\nu_2,\nu_1}$
z-test of p1 − p2
Case 1: $z = \frac{\hat{p}_1 - \hat{p}_2}{\sqrt{\hat{p}(1-\hat{p})\left(\frac{1}{n_1} + \frac{1}{n_2}\right)}}$
Case 2: $z = \frac{(\hat{p}_1 - \hat{p}_2) - (p_1 - p_2)}{\sqrt{\frac{\hat{p}_1(1-\hat{p}_1)}{n_1} + \frac{\hat{p}_2(1-\hat{p}_2)}{n_2}}}$
z-Interval estimator of p1 − p2
$(\hat{p}_1 - \hat{p}_2) \pm z_{\alpha/2}\sqrt{\frac{\hat{p}_1(1-\hat{p}_1)}{n_1} + \frac{\hat{p}_2(1-\hat{p}_2)}{n_2}}$
Analysis of Variance
One-way analysis of variance
$SST = \sum_{j=1}^{k} n_j(\bar{x}_j - \bar{\bar{x}})^2$
$SSE = \sum_{j=1}^{k}\sum_{i=1}^{n_j} (x_{ij} - \bar{x}_j)^2$
$MST = \frac{SST}{k-1}$
$MSE = \frac{SSE}{n-k}$
$F = \frac{MST}{MSE}$
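The one-way ANOVA formulas above can be sketched directly; the three groups below are invented data.

```python
# One-way ANOVA: SST, SSE, mean squares, and the F statistic,
# following the formulas above; group data are made up.
import statistics

groups = [[5, 7, 6], [8, 9, 10], [4, 3, 5]]

k = len(groups)
n = sum(len(g) for g in groups)
grand = sum(sum(g) for g in groups) / n     # grand mean

sst = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
sse = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)

mst = sst / (k - 1)
mse = sse / (n - k)
f = mst / mse
```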
Two-way analysis of variance (randomized block design)
$SS(\text{Total}) = \sum_{j=1}^{k}\sum_{i=1}^{b} (x_{ij} - \bar{\bar{x}})^2$
$SST = \sum_{j=1}^{k} b(\bar{x}[T]_j - \bar{\bar{x}})^2$
$SSB = \sum_{i=1}^{b} k(\bar{x}[B]_i - \bar{\bar{x}})^2$
$SSE = \sum_{j=1}^{k}\sum_{i=1}^{b} (x_{ij} - \bar{x}[T]_j - \bar{x}[B]_i + \bar{\bar{x}})^2$
$MST = \frac{SST}{k-1}$
$MSB = \frac{SSB}{b-1}$
$MSE = \frac{SSE}{n-k-b+1}$
$F = \frac{MST}{MSE}$
$F = \frac{MSB}{MSE}$
Two-factor experiment
$SS(\text{Total}) = \sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{k=1}^{r} (x_{ijk} - \bar{\bar{x}})^2$
$SS(A) = rb\sum_{i=1}^{a} (\bar{x}[A]_i - \bar{\bar{x}})^2$
$SS(B) = ra\sum_{j=1}^{b} (\bar{x}[B]_j - \bar{\bar{x}})^2$
$SS(AB) = r\sum_{i=1}^{a}\sum_{j=1}^{b} (\bar{x}[AB]_{ij} - \bar{x}[A]_i - \bar{x}[B]_j + \bar{\bar{x}})^2$
$SSE = \sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{k=1}^{r} (x_{ijk} - \bar{x}[AB]_{ij})^2$
$F = \frac{MS(A)}{MSE}$
$F = \frac{MS(B)}{MSE}$
$F = \frac{MS(AB)}{MSE}$
Least Significant Difference comparison method
$LSD = t_{\alpha/2}\sqrt{MSE\left(\frac{1}{n_i} + \frac{1}{n_j}\right)}$
Tukey's multiple comparison method
$\omega = q_{\alpha}(k,\nu)\sqrt{\frac{MSE}{n_g}}$
Chi-Squared Tests
$\chi^2 = \sum_{i=1}^{k} \frac{(f_i - e_i)^2}{e_i}$
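The chi-squared statistic above sums the squared gaps between observed frequencies $f_i$ and expected frequencies $e_i$; the counts in the sketch below are invented.

```python
# Chi-squared test statistic from observed and expected frequencies;
# the counts are illustrative.
observed = [18, 22, 30, 30]
expected = [25, 25, 25, 25]   # expected frequencies under H0

chi2 = sum((f - e) ** 2 / e for f, e in zip(observed, expected))
```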
Nonparametric Statistics
Wilcoxon rank sum test
$T = T_1$
$E(T) = \frac{n_1(n_1 + n_2 + 1)}{2}$
$\sigma_T = \sqrt{\frac{n_1 n_2 (n_1 + n_2 + 1)}{12}}$
$z = \frac{T - E(T)}{\sigma_T}$
Sign test
$z = \frac{x - .5n}{.5\sqrt{n}}$
Wilcoxon signed rank sum test
$T = T^{+}$
$E(T) = \frac{n(n+1)}{4}$
$\sigma_T = \sqrt{\frac{n(n+1)(2n+1)}{24}}$
$z = \frac{T - E(T)}{\sigma_T}$
Kruskal-Wallis Test
$H = \frac{12}{n(n+1)}\sum_{j=1}^{k}\frac{T_j^2}{n_j} - 3(n+1)$
Friedman Test
$F_r = \frac{12}{b\,k(k+1)}\sum_{j=1}^{k} T_j^2 - 3b(k+1)$
Simple Linear Regression
Sample slope
$b_1 = \frac{\mathrm{cov}(x,y)}{s_x^2}$
Sample y-intercept
$b_0 = \bar{y} - b_1\bar{x}$
Sum of squares for error
$SSE = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2$
Standard error of estimate
$s_e = \sqrt{\frac{SSE}{n-2}}$
Test statistic for the slope
$t = \frac{b_1 - \beta_1}{s_{b_1}}$
Standard error of b1
$s_{b_1} = \frac{s_e}{\sqrt{(n-1)s_x^2}}$
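The regression formulas above chain together naturally: slope and intercept, then SSE, the standard error of estimate, and the t statistic for $H_0: \beta_1 = 0$. The data in this sketch are invented.

```python
# Simple linear regression: fit, SSE, standard errors, and the
# t statistic for the slope; x and y are made-up data.
import statistics

x = [1, 2, 3, 4, 5, 6]
y = [2, 3, 5, 6, 8, 9]

n = len(x)
x_bar, y_bar = statistics.mean(x), statistics.mean(y)
cov_xy = sum((a - x_bar) * (b - y_bar) for a, b in zip(x, y)) / (n - 1)
s2_x = statistics.variance(x)

b1 = cov_xy / s2_x            # sample slope
b0 = y_bar - b1 * x_bar       # sample y-intercept

sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
se = (sse / (n - 2)) ** 0.5               # standard error of estimate
s_b1 = se / ((n - 1) * s2_x) ** 0.5       # standard error of b1
t = (b1 - 0) / s_b1                       # test of H0: beta1 = 0
```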
Coefficient of determination
$R^2 = \frac{[\mathrm{cov}(x,y)]^2}{s_x^2 s_y^2} = 1 - \frac{SSE}{\sum (y_i - \bar{y})^2}$
Prediction interval
$\hat{y} \pm t_{\alpha/2,n-2}\,s_e\sqrt{1 + \frac{1}{n} + \frac{(x_g - \bar{x})^2}{(n-1)s_x^2}}$
Confidence interval estimator of the expected value of y
$\hat{y} \pm t_{\alpha/2,n-2}\,s_e\sqrt{\frac{1}{n} + \frac{(x_g - \bar{x})^2}{(n-1)s_x^2}}$
Sample coefficient of correlation
$r = \frac{\mathrm{cov}(x,y)}{s_x s_y}$
t-test of correlation
$t = r\sqrt{\frac{n-2}{1-r^2}}$
Spearman rank correlation coefficient
$r_S = \frac{\mathrm{cov}(a,b)}{s_a s_b}$
$z = \frac{r_S - 0}{1/\sqrt{n-1}} = r_S\sqrt{n-1}$
Multiple Regression
Standard error of estimate
$s_e = \sqrt{\frac{SSE}{n-k-1}}$
Test statistic for βi
$t = \frac{b_i - \beta_i}{s_{b_i}}$
Coefficient of determination
$R^2 = 1 - \frac{SSE}{\sum (y_i - \bar{y})^2}$
Adjusted coefficient of determination
Adjusted $R^2 = 1 - \frac{SSE/(n-k-1)}{\sum (y_i - \bar{y})^2/(n-1)}$
$MSR = SSR/k$
$MSE = SSE/(n-k-1)$
F-statistic
$F = MSR/MSE$
Durbin-Watson statistic
$d = \frac{\sum_{i=2}^{n} (e_i - e_{i-1})^2}{\sum_{i=1}^{n} e_i^2}$
Exponential smoothing
$S_t = w y_t + (1 - w)S_{t-1}$
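The exponential smoothing recursion above is a one-line loop; the series, the smoothing constant w, and the convention $S_1 = y_1$ in this sketch are illustrative.

```python
# Exponential smoothing S_t = w * y_t + (1 - w) * S_{t-1},
# starting from S_1 = y_1; series and w are made up.
y = [10.0, 12.0, 11.0, 15.0]
w = 0.4

s = [y[0]]                     # S_1 = y_1 as the starting value
for t in range(1, len(y)):
    s.append(w * y[t] + (1 - w) * s[-1])
```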
Statistical Process Control
x̄ chart
Centerline = $\bar{\bar{x}}$
Lower control limit = $\bar{\bar{x}} - 3\frac{S}{\sqrt{n}}$
Upper control limit = $\bar{\bar{x}} + 3\frac{S}{\sqrt{n}}$
p chart
Centerline = $\bar{p}$
Lower control limit = $\bar{p} - 3\sqrt{\frac{\bar{p}(1-\bar{p})}{n}}$
Upper control limit = $\bar{p} + 3\sqrt{\frac{\bar{p}(1-\bar{p})}{n}}$
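The p chart limits above are a direct computation; in the sketch below the average sample proportion and sample size are invented, and a negative lower limit is clamped to zero since a proportion cannot be negative.

```python
# p chart control limits: p_bar +/- 3 * sqrt(p_bar(1 - p_bar)/n);
# p_bar and n are illustrative values.
p_bar = 0.08   # average sample proportion of defectives
n = 100        # sample size per subgroup

sigma_p = (p_bar * (1 - p_bar) / n) ** 0.5
lcl = max(0.0, p_bar - 3 * sigma_p)   # proportions cannot go below 0
ucl = p_bar + 3 * sigma_p
```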
Decision Analysis