4.4 Comparison of Two or More Poisson Means
Suppose $Y_1, Y_2, \ldots, Y_k$ are independent Poisson random variables with means $\lambda_1, \ldots, \lambda_k$, where $\log \lambda_j = \beta_0 + \beta_1 x_j$. The hypothesis

$$H_0: \lambda_1 = \lambda_2 = \cdots = \lambda_k = \exp(\beta_0)$$

is equivalent to $H_0: \beta_1 = 0$, tested against

$$H_a: \beta_1 > 0, \quad \text{or} \quad H_a: \beta_1 < 0, \quad \text{or} \quad H_a: \beta_1 \neq 0.$$

The test statistic is $T(y) = \sum_{j=1}^k x_j y_j$. For $H_a: \beta_1 > 0$, the test is

$$T(y) > C_0(m) \;\Rightarrow\; \text{reject } H_0,$$

given $\sum_{j=1}^k Y_j = m$, where the critical value $C_0(m)$ and the randomization weight $w(m)$ can be obtained by solving

$$P\Big(T(Y) > C_0(m) \,\Big|\, \textstyle\sum_{j=1}^k Y_j = m, H_0 \text{ is true}\Big) + w(m)\, P\Big(T(Y) = C_0(m) \,\Big|\, \textstyle\sum_{j=1}^k Y_j = m, H_0 \text{ is true}\Big) = \alpha.$$
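When $m$ and $k$ are small, $C_0(m)$ and $w(m)$ can be computed exactly by enumerating the conditional null distribution of $T(Y)$, which (as shown below) is free of $\beta_0$. A minimal sketch; the function name and return convention are mine, not from the notes:

```python
from itertools import product
from math import factorial

def conditional_test(x, m, alpha=0.05):
    """Exact conditional test of H0: beta1 = 0 vs Ha: beta1 > 0.
    Under H0, (Y_1,...,Y_k) | sum Y_j = m ~ Multinomial(m, (1/k,...,1/k)),
    so the null distribution of T(Y) = sum x_j Y_j does not involve beta0.
    Returns C0(m), the randomization weight w(m), and the pmf of T."""
    k = len(x)
    pmf = {}
    for y in product(range(m + 1), repeat=k):
        if sum(y) != m:
            continue
        coef = factorial(m)
        for yj in y:
            coef //= factorial(yj)
        p = coef / k ** m                     # m!/(prod y_j!) * (1/k)^m
        t = sum(xj * yj for xj, yj in zip(x, y))
        pmf[t] = pmf.get(t, 0.0) + p
    for c in sorted(pmf):                     # smallest C0 with P(T > C0) <= alpha
        tail = sum(p for t, p in pmf.items() if t > c)
        if tail <= alpha:
            w = (alpha - tail) / pmf[c]       # weight on {T = C0} attaining size alpha
            return c, w, pmf
```

The returned pair satisfies $P(T > C_0) + w \, P(T = C_0) = \alpha$ exactly, so the randomized test has size $\alpha$ despite the discreteness of $T$.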
Note that under the null hypothesis, given $\sum_{j=1}^k Y_j = m$, we regard the data as having a multinomial distribution with index $m$ and parameter vector $\left(\frac{1}{k}, \frac{1}{k}, \ldots, \frac{1}{k}\right)$, independent of $\beta_0$, i.e.,

$$P\Big(Y_1 = y_1, \ldots, Y_k = y_k \,\Big|\, \textstyle\sum_{j=1}^k Y_j = m, H_0 \text{ is true}\Big) = \frac{m!}{\prod_{j=1}^k y_j!} \prod_{j=1}^k \left(\frac{1}{k}\right)^{y_j} = \frac{m!}{\prod_{j=1}^k y_j!} \left(\frac{1}{k}\right)^m.$$
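This conditional pmf can be verified numerically: dividing the joint pmf of i.i.d. Poisson($\lambda$) counts by the Poisson($k\lambda$) pmf of their total reproduces the multinomial formula for any $\lambda$, which illustrates the independence of $\beta_0$. A small sketch; the helper names are mine:

```python
from math import exp, factorial

def poisson_pmf(y, lam):
    return lam ** y * exp(-lam) / factorial(y)

def conditional_pmf(y, lam):
    """P(Y1=y1,...,Yk=yk | sum Yj = m) for i.i.d. Poisson(lam):
    joint pmf divided by the pmf of the total, which is Poisson(k*lam)."""
    k, m = len(y), sum(y)
    joint = 1.0
    for yj in y:
        joint *= poisson_pmf(yj, lam)
    return joint / poisson_pmf(m, k * lam)

def multinomial_pmf(y):
    """m!/(prod y_j!) * (1/k)^m, the equal-probability multinomial pmf."""
    k, m = len(y), sum(y)
    coef = factorial(m)
    for yj in y:
        coef //= factorial(yj)
    return coef / k ** m
```

Running `conditional_pmf` with two different values of `lam` gives the same answer, matching `multinomial_pmf`.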
Note:
Let $Y = (Y_1, Y_2, \ldots, Y_K)$ have the probability density function or probability mass function

$$f(y) = f(y_1, y_2, \ldots, y_k) = C(\theta, \phi)\, \exp\Big[\theta\, T(y) + \sum_{j=1}^r \phi_j M_j(y)\Big] h(y).$$

Then a similar conditional test of $H_0: \theta = 0$ against $H_a: \theta > 0$ is

$$T(y) > C_0(m) \;\Rightarrow\; \text{reject } H_0,$$

where $C_0(m)$ and $w(m)$ depending on $m$ can be obtained by solving

$$P\big(T(Y) > C_0(m) \,\big|\, M(Y) = m, H_0 \text{ is true}\big) + w(m)\, P\big(T(Y) = C_0(m) \,\big|\, M(Y) = m, H_0 \text{ is true}\big) = \alpha,$$

with $M(Y) = (M_1(Y), \ldots, M_r(Y))$; conditioning on $M(Y) = m$ fixes the sufficient statistics for the nuisance parameters $\phi$.

Therefore, for independent Poisson random variables $Y_1, Y_2, \ldots, Y_k$ with means $\lambda_1, \lambda_2, \ldots, \lambda_k$, $\log \lambda_j = \beta_0 + \beta_1 x_j$, the probability mass function is

$$f(y) = \prod_{j=1}^k \frac{\lambda_j^{y_j} e^{-\lambda_j}}{y_j!} = \exp\Big(-\sum_{j=1}^k \lambda_j\Big) \exp\Big(\sum_{j=1}^k y_j(\beta_0 + \beta_1 x_j)\Big) \prod_{j=1}^k \frac{1}{y_j!}$$
$$= \exp\Big(-\sum_{j=1}^k \lambda_j\Big) \exp\Big(\beta_1 \sum_{j=1}^k x_j y_j + \beta_0 \sum_{j=1}^k y_j\Big) \prod_{j=1}^k \frac{1}{y_j!} = C(\beta_1, \beta_0)\, \exp\big[\beta_1 T(y) + \beta_0 M(y)\big]\, h(y),$$

where

$$C(\beta_1, \beta_0) = \exp\Big(-\sum_{j=1}^k \lambda_j\Big), \quad T(y) = \sum_{j=1}^k x_j y_j, \quad M(y) = \sum_{j=1}^k y_j, \quad h(y) = \prod_{j=1}^k \frac{1}{y_j!}.$$
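The exponential-family form can be checked numerically against the direct product of Poisson pmfs. A short sketch; the function names are illustrative:

```python
from math import exp, factorial

def poisson_joint(y, x, b0, b1):
    """Product of Poisson pmfs with lam_j = exp(b0 + b1*x_j)."""
    p = 1.0
    for yj, xj in zip(y, x):
        lam = exp(b0 + b1 * xj)
        p *= lam ** yj * exp(-lam) / factorial(yj)
    return p

def expfam_form(y, x, b0, b1):
    """C(b1,b0) * exp(b1*T(y) + b0*M(y)) * h(y), with
    C = exp(-sum lam_j), T = sum x_j y_j, M = sum y_j, h = prod 1/y_j!."""
    lam = [exp(b0 + b1 * xj) for xj in x]
    C = exp(-sum(lam))
    T = sum(xj * yj for xj, yj in zip(x, y))
    M = sum(y)
    h = 1.0
    for yj in y:
        h /= factorial(yj)
    return C * exp(b1 * T + b0 * M) * h
```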
Note:
Under $H_0$,

$$E[T(Y)] = \sum_{j=1}^k x_j E(Y_j) = \sum_{j=1}^k x_j \exp(\beta_0) = \exp(\beta_0) \sum_{j=1}^k x_j,$$

since $E(Y_j) = \lambda_j = \exp(\beta_0)$ under $H_0$. Also,

$$\bar{y} = \frac{1}{k} \sum_{j=1}^k y_j$$

is the estimate of $\exp(\beta_0)$. Thus the unconditional mean of $T(Y)$ depends on $\beta_0$. On the other hand, let

$$Y_\bullet = \sum_{j=1}^k Y_j, \quad y_\bullet = \sum_{j=1}^k y_j = m.$$
Then,

$$E[T(Y) \mid Y_\bullet = y_\bullet] = E\Big[\sum_{j=1}^k x_j Y_j \,\Big|\, Y_\bullet = y_\bullet\Big] = \sum_{j=1}^k x_j E(Y_j \mid Y_\bullet = y_\bullet) = \sum_{j=1}^k x_j \frac{y_\bullet}{k} = \frac{y_\bullet}{k} \sum_{j=1}^k x_j,$$

since $Y_j \mid Y_\bullet = y_\bullet \sim B\big(y_\bullet, \frac{1}{k}\big)$ and hence $E(Y_j \mid Y_\bullet = y_\bullet) = \frac{y_\bullet}{k}$, which does not depend on $\beta_0$.
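The conditional mean formula can be verified by direct enumeration over the equal-probability multinomial distribution of $(Y_1, \ldots, Y_k)$ given $Y_\bullet = m$ under $H_0$. A sketch; the helper name is mine:

```python
from itertools import product
from math import factorial

def cond_mean_T(x, m):
    """E[T(Y) | sum Y_j = m] under H0, by enumerating the
    Multinomial(m, (1/k,...,1/k)) distribution."""
    k = len(x)
    mean = 0.0
    for y in product(range(m + 1), repeat=k):
        if sum(y) != m:
            continue
        coef = factorial(m)
        for yj in y:
            coef //= factorial(yj)
        p = coef / k ** m
        mean += p * sum(xj * yj for xj, yj in zip(x, y))
    return mean
```

For example, with $x = (1, 2, 5)$, $m = 4$, $k = 3$, the enumeration returns $\frac{4}{3}(1+2+5) = \frac{32}{3}$.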
The unconditional variance of $T(Y)$ also depends on $\beta_0$, unlike the conditional variance. The conditional variance is

$$\operatorname{Var}[T(Y) \mid Y_\bullet = y_\bullet] = \sum_{j=1}^k x_j^2\, y_\bullet \frac{1}{k}\Big(1 - \frac{1}{k}\Big) + 2 \sum_{j < i} x_i x_j \Big(-y_\bullet \frac{1}{k} \cdot \frac{1}{k}\Big),$$

since

$$\operatorname{Cov}(Y_j, Y_i \mid Y_\bullet = y_\bullet) = -y_\bullet \frac{1}{k} \cdot \frac{1}{k}, \quad j \neq i.$$

The conditional variance is unaffected by the addition of a constant to each component of the $x_j$'s.
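Both the closed-form conditional variance and its invariance to shifting every $x_j$ by a constant can be checked the same way, by enumeration; the function names below are mine:

```python
from itertools import product
from math import factorial

def cond_var_T(x, m):
    """Var[T(Y) | sum Y_j = m] under H0, by enumeration over the
    Multinomial(m, (1/k,...,1/k)) distribution."""
    k = len(x)
    ex = ex2 = 0.0
    for y in product(range(m + 1), repeat=k):
        if sum(y) != m:
            continue
        coef = factorial(m)
        for yj in y:
            coef //= factorial(yj)
        p = coef / k ** m
        t = sum(xj * yj for xj, yj in zip(x, y))
        ex += p * t
        ex2 += p * t * t
    return ex2 - ex ** 2

def var_formula(x, m):
    """sum_j x_j^2 * m*(1/k)*(1-1/k) + 2*sum_{j<i} x_i x_j * (-m/k^2)."""
    k = len(x)
    v = sum(xj ** 2 for xj in x) * m * (1 / k) * (1 - 1 / k)
    cross = sum(x[i] * x[j] for i in range(k) for j in range(i))
    return v + 2 * cross * (-m / k ** 2)
```

Shifting $x = (0, 1, 3)$ to $x = (2, 3, 5)$ leaves the conditional variance unchanged, since given $Y_\bullet = m$ the shift adds only the constant $2m$ to $T(y)$.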
The log-likelihood based on $Y$ is

$$l_Y(\beta_0, \beta_1) = \sum_{j=1}^k \big(y_j \log(\lambda_j) - \lambda_j\big) = \sum_{j=1}^k \big[y_j(\beta_0 + \beta_1 x_j) - \exp(\beta_0 + \beta_1 x_j)\big] = \beta_0 \sum_{j=1}^k y_j + \beta_1 \sum_{j=1}^k x_j y_j - \sum_{j=1}^k \exp(\beta_0 + \beta_1 x_j),$$

up to an additive constant not involving the parameters. Denote

$$\mu = \sum_{j=1}^k \exp(\beta_0 + \beta_1 x_j) = \sum_{j=1}^k \lambda_j.$$

Then

$$l_Y(\mu, \beta_1) = \beta_0 y_\bullet + \beta_1 \sum_{j=1}^k x_j y_j - \mu = y_\bullet \log(\mu) - \mu + \beta_1 \sum_{j=1}^k x_j y_j - \big(y_\bullet \log(\mu) - \beta_0 y_\bullet\big) = l_{Y_\bullet}(\mu) + l_{Y|Y_\bullet}(\beta_1),$$

where

$$l_{Y_\bullet}(\mu) = y_\bullet \log(\mu) - \mu$$

is the Poisson log-likelihood for $\mu$ based on $Y_\bullet \sim \text{Poisson}(\mu)$, and, noting that $y_\bullet \log(\mu) - \beta_0 y_\bullet = y_\bullet \log\big(\sum_{j=1}^k \exp(\beta_1 x_j)\big)$,

$$l_{Y|Y_\bullet}(\beta_1) = \beta_1 \sum_{j=1}^k x_j y_j - y_\bullet \log\Big(\sum_{j=1}^k \exp(\beta_1 x_j)\Big)$$

is the multinomial log-likelihood for $\beta_1$ based on the conditional distribution, $Y_1, \ldots, Y_k \mid Y_\bullet = m \sim M(m, \pi_1, \ldots, \pi_k)$, where $M(m, \pi_1, \ldots, \pi_k)$ is a multinomial distribution with index $m$ and parameters

$$\pi_j = \frac{\exp(\beta_1 x_j)}{\sum_{j'=1}^k \exp(\beta_1 x_{j'})}.$$
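This likelihood factorization can be confirmed numerically; note that the conditional piece takes no $\beta_0$ argument at all. A sketch, with function names of my own choosing:

```python
from math import exp, log

def l_Y(y, x, b0, b1):
    """Poisson log-likelihood (constant -sum log y_j! dropped)."""
    return sum(yj * (b0 + b1 * xj) - exp(b0 + b1 * xj)
               for yj, xj in zip(y, x))

def l_total(y, x, b0, b1):
    """l_{Y.}(mu) = y. * log(mu) - mu, with mu = sum_j exp(b0 + b1*x_j)."""
    mu = sum(exp(b0 + b1 * xj) for xj in x)
    return sum(y) * log(mu) - mu

def l_cond(y, x, b1):
    """Multinomial conditional log-likelihood:
    b1 * sum x_j y_j - y. * log(sum_j exp(b1*x_j)); free of b0."""
    return (b1 * sum(xj * yj for xj, yj in zip(x, y))
            - sum(y) * log(sum(exp(b1 * xj) for xj in x)))
```

For any $(\beta_0, \beta_1)$, `l_Y` equals `l_total + l_cond`, and the difference `l_Y - l_total` does not change as $\beta_0$ varies, confirming that the conditional likelihood depends only on $\beta_1$.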
Note:
The Poisson marginal likelihood based on $Y_\bullet$ depends only on $\mu$, while the multinomial conditional likelihood based on $Y_1, \ldots, Y_k \mid Y_\bullet$ depends only on $\beta_1$.
Note:
The Fisher information for $(\mu, \beta_1)$ is