Some Properties of Entropy For The Exponentiated Pareto Distribution (EPD) Based On Order Statistics
J. Behboodian
Islamic Azad University-Shiraz Branch
S. Tahmasebi
Shiraz University
1. Introduction
The exponentiated Pareto distribution (EPD) has the distribution function

F_X(x) = [1 − (x + 1)^{−λ}]^θ,   x > 0,

and the density function

f_X(x) = θλ (x + 1)^{−(λ+1)} [1 − (x + 1)^{−λ}]^{θ−1},   x > 0,   (2)

where θ and λ are two shape parameters. When θ = 1, the above distribution reduces to the standard Pareto distribution of the second kind.
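As a quick numerical illustration (not part of the original text), the following Python sketch encodes the distribution and density functions above and checks that the density integrates to one; the parameter values θ = 2 and λ = 3 are arbitrary choices made only for this check.

```python
import numpy as np
from scipy.integrate import quad

def epd_pdf(x, theta, lam):
    """Density of the exponentiated Pareto distribution (EPD), x > 0."""
    return theta * lam * (x + 1.0) ** (-(lam + 1.0)) * (1.0 - (x + 1.0) ** (-lam)) ** (theta - 1.0)

def epd_cdf(x, theta, lam):
    """Distribution function of the EPD, x > 0."""
    return (1.0 - (x + 1.0) ** (-lam)) ** theta

theta, lam = 2.0, 3.0                                   # arbitrary shape parameters
total, _ = quad(epd_pdf, 0.0, np.inf, args=(theta, lam))
print(total)                                            # should be close to 1
print(epd_cdf(1.0, theta, lam))                         # F_X(1) for the same parameters
```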
The entropy of various continuous distributions is discussed by Cover and Thomas [3] and by Lazo and Rathie [8], among others. The entropy of order statistics has been studied by a few authors; among them, Wong and Chen [15], Park [10], and Ebrahimi et al. [5] provided several results and properties. In this paper, we derive the exact form of the entropy for the exponentiated Pareto distribution (EPD), obtain the entropy of the jth order statistic of the EPD together with bounds for it, and discuss mutual information for the EPD.
From (2),

log f_X(x) = log(θλ) + (θ − 1) log(1 − (x + 1)^{−λ}) − (λ + 1) log(x + 1),   (4)

and hence the entropy of X is

H(X) = −E[log f_X(x)] = −log(θλ) − (θ − 1) E[log(1 − (x + 1)^{−λ})] + (λ + 1) E[log(x + 1)].   (5)

So, we need to find E[log(1 − (x + 1)^{−λ})] and E[log(x + 1)] to obtain (5). First, consider
k(r) = E[(x + 1)^r] = ∫_0^∞ θλ (x + 1)^{r−(λ+1)} [1 − (x + 1)^{−λ}]^{θ−1} dx.   (6)
By the change of variable 1 − (x + 1)^{−λ} = t, 0 < t < 1, we obtain

k(r) = θ ∫_0^1 t^{θ−1} (1 − t)^{−r/λ} dt = θ B(θ, 1 − r/λ),   r < λ,

and, since E[log(x + 1)] = dk(r)/dr |_{r=0},

E[log(x + 1)] = (1/λ) [ψ(θ + 1) − ψ(1)],   (9)

where ψ is the digamma function defined by ψ(θ) = (d/dθ) ln Γ(θ).
Now we calculate
t(r) = E[(1 − (x + 1)^{−λ})^r] = θλ ∫_0^∞ (1 − (x + 1)^{−λ})^{r+θ−1} (x + 1)^{−(λ+1)} dx = θ/(θ + r),   (10)
so that

dt(r)/dr |_{r=0} = E[log(1 − (x + 1)^{−λ})] = −1/θ.   (11)
Putting (9) and (11) in relation (5) we have:
H(X) = −log(λθ) + ((λ + 1)/λ) [ψ(θ + 1) − ψ(1)] + (θ − 1)/θ.   (12)
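As a further illustration (again not part of the original derivation), the closed form (12) can be compared with the entropy obtained by numerical integration of −f_X log f_X; the parameter values below are arbitrary.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import digamma

theta, lam = 2.5, 1.5   # arbitrary shape parameters

def epd_pdf(x):
    return theta * lam * (x + 1.0) ** (-(lam + 1.0)) * (1.0 - (x + 1.0) ** (-lam)) ** (theta - 1.0)

# Entropy by numerical integration of -f log f over (0, infinity)
h_num, _ = quad(lambda x: -epd_pdf(x) * np.log(epd_pdf(x)), 0.0, np.inf)

# Closed form (12)
h_closed = (-np.log(lam * theta)
            + (lam + 1.0) / lam * (digamma(theta + 1.0) - digamma(1.0))
            + (theta - 1.0) / theta)
print(h_num, h_closed)   # the two values should agree
```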
We now consider the entropy of order statistics. Let X_1, X_2, ..., X_n be a random sample from a continuous distribution F_X(x) with density f_X(x) > 0. The order statistics of this sample are defined by Y_1 ≤ Y_2 ≤ ... ≤ Y_n, and the density of the jth order statistic Y_j is

f_{Y_j}(y) = [n! / ((j − 1)!(n − j)!)] f_X(y) [F_X(y)]^{j−1} [1 − F_X(y)]^{n−j}.   (13)
Now, let U_1, U_2, ..., U_n be a random sample from U(0, 1) with the order statistics W_1 ≤ W_2 ≤ ... ≤ W_n. Then W_j has the Beta(j, n − j + 1) density

f_{W_j}(w) = [1 / B(j, n − j + 1)] w^{j−1} (1 − w)^{n−j},   0 < w < 1.   (14)
The entropy of W_j is

H_n(W_j) = log B(j, n − j + 1) − (j − 1)[ψ(j) − ψ(n + 1)] − (n − j)[ψ(n − j + 1) − ψ(n + 1)],

where ψ(t) = d log Γ(t)/dt and ψ(n + 1) = ψ(n) + 1/n.
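As a sanity check (an illustration with arbitrary n and j, not from the paper), the expression for H_n(W_j) above can be compared with the Beta entropy computed by scipy:

```python
from scipy.special import betaln, digamma
from scipy.stats import beta

def H_beta_order_stat(j, n):
    """Entropy of W_j ~ Beta(j, n - j + 1), the j-th uniform order statistic."""
    a, b = j, n - j + 1
    return (betaln(a, b)
            - (a - 1) * (digamma(a) - digamma(a + b))
            - (b - 1) * (digamma(b) - digamma(a + b)))

n, j = 10, 3   # arbitrary sample size and order
print(H_beta_order_stat(j, n), beta(j, n - j + 1).entropy())   # should agree
```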
Using the fact that W_j = F_X(Y_j) and Y_j = F_X^{−1}(W_j), j = 1, 2, ..., n, the entropy of the jth order statistic can be computed by
H(Y_j) = H_n(W_j) − E_{g_j}[log f_X(F_X^{−1}(W_j))]   (15)
       = H_n(W_j) − ∫ f_j(y) log f_X(y) dy,   (16)

where g_j is the density of W_j and f_j is the density of Y_j given in (13).
Now, we can give an application of (16) for the EPD. Let X be a random variable having the EPD with parameters θ and λ. Then

F_X^{−1}(W_j) = [1 − (W_j)^{1/θ}]^{−1/λ} − 1.
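The quantile function above also gives an inverse-transform sampler for the EPD. The following sketch (illustrative only, with arbitrary parameter values) draws a large sample this way and compares the empirical mean of log(X + 1) with the closed form in (9).

```python
import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(0)
theta, lam = 2.0, 3.0   # arbitrary shape parameters

def epd_quantile(w):
    """F_X^{-1}(w) = (1 - w^(1/theta))^(-1/lambda) - 1 for the EPD."""
    return (1.0 - w ** (1.0 / theta)) ** (-1.0 / lam) - 1.0

u = rng.uniform(size=200_000)
x = epd_quantile(u)                                   # inverse-transform sample from the EPD
print(np.log(x + 1.0).mean())                         # Monte Carlo estimate of E[log(X+1)]
print((digamma(theta + 1.0) - digamma(1.0)) / lam)    # closed form from (9)
```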
Substituting F_X^{−1}(W_j) into (4) gives

E_{g_j}[log f_X(F_X^{−1}(W_j))] = E_{g_j}[log(θλ) + ((λ + 1)/λ) log(1 − (W_j)^{1/θ}) + (θ − 1) log((W_j)^{1/θ})]
 = log(θλ) + ((λ + 1)/λ) E_{g_j}[log(1 − (W_j)^{1/θ})] + ((θ − 1)/θ) E_{g_j}[log(W_j)]
 = log(θλ) + ((λ + 1)/λ) [n!/(j − 1)!] Σ_{k=0}^{n−j} (−1)^k (ψ(1) − ψ(θk + jθ + 1)) / [k!(n − j − k)!(k + j)]
   + ((θ − 1)/θ) (ψ(j) − ψ(n + 1)).   (17)
Therefore, by (15),

H(Y_j) = H_n(W_j) − log(θλ) + ((λ + 1)/λ) [n!/(j − 1)!] Σ_{k=0}^{n−j} (−1)^k (−ψ(1) + ψ(θk + jθ + 1)) / [k!(n − j − k)!(k + j)] + ((1 − θ)/θ) (ψ(j) − ψ(n + 1)).   (18)
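The closed form (18) can be checked against the integral representation (16). The sketch below is an illustration only; the sample size, order, and parameter values are arbitrary, and the density of Y_j is taken from (13).

```python
import numpy as np
from math import factorial
from scipy.integrate import quad
from scipy.special import betaln, digamma

theta, lam, n, j = 2.0, 1.5, 8, 3   # arbitrary parameters, sample size, and order

def epd_pdf(x):
    return theta * lam * (x + 1.0) ** (-(lam + 1.0)) * (1.0 - (x + 1.0) ** (-lam)) ** (theta - 1.0)

def epd_cdf(x):
    return (1.0 - (x + 1.0) ** (-lam)) ** theta

def H_beta(a, b):
    """Entropy of a Beta(a, b) random variable (used for H_n(W_j))."""
    return (betaln(a, b) - (a - 1) * (digamma(a) - digamma(a + b))
            - (b - 1) * (digamma(b) - digamma(a + b)))

# H(Y_j) via (16): H_n(W_j) minus the integral of f_j(y) log f_X(y)
def f_j(y):
    c = factorial(n) / (factorial(j - 1) * factorial(n - j))
    return c * epd_pdf(y) * epd_cdf(y) ** (j - 1) * (1.0 - epd_cdf(y)) ** (n - j)

integral, _ = quad(lambda y: f_j(y) * np.log(epd_pdf(y)), 0.0, np.inf)
h_via_16 = H_beta(j, n - j + 1) - integral

# H(Y_j) via the closed form (18)
s = sum((-1.0) ** k * (-digamma(1.0) + digamma(theta * k + j * theta + 1.0))
        / (factorial(k) * factorial(n - j - k) * (k + j))
        for k in range(n - j + 1))
h_via_18 = (H_beta(j, n - j + 1) - np.log(theta * lam)
            + (lam + 1.0) / lam * factorial(n) / factorial(j - 1) * s
            + (1.0 - theta) / theta * (digamma(j) - digamma(n + 1.0)))

print(h_via_16, h_via_18)   # the two values should agree
```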
For the sample minimum (j = 1), H_n(W_1) = 1 − log n − 1/n and

H(Y_1) = 1 − log n − 1/n − log(θλ) + ((λ + 1)/λ) Σ_{u=1}^{n} (−1)^{u−1} [n! / (u!(n − u)!)] (ψ(θu + 1) + γ) + ((θ − 1)/θ) (ψ(n) + 1/n + γ),   (19)

where γ = −ψ(1) is Euler's constant.
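As a consistency check (illustrative values only), (19) should agree with (18) evaluated at j = 1, using H_n(W_1) = 1 − log n − 1/n:

```python
import numpy as np
from math import comb, factorial
from scipy.special import digamma

theta, lam, n = 2.0, 1.5, 8   # arbitrary shape parameters and sample size
gamma_e = -digamma(1.0)        # Euler's constant

# H(Y_1) from (19)
s19 = sum((-1.0) ** (u - 1) * comb(n, u) * (digamma(theta * u + 1.0) + gamma_e)
          for u in range(1, n + 1))
h19 = (1.0 - np.log(n) - 1.0 / n - np.log(theta * lam)
       + (lam + 1.0) / lam * s19
       + (theta - 1.0) / theta * (digamma(n) + 1.0 / n + gamma_e))

# H(Y_1) from (18) with j = 1
j = 1
s18 = sum((-1.0) ** k * (-digamma(1.0) + digamma(theta * k + j * theta + 1.0))
          / (factorial(k) * factorial(n - j - k) * (k + j))
          for k in range(n - j + 1))
h18 = (1.0 - np.log(n) - 1.0 / n - np.log(theta * lam)
       + (lam + 1.0) / lam * factorial(n) / factorial(j - 1) * s18
       + (1.0 - theta) / theta * (digamma(j) - digamma(n + 1.0)))

print(h19, h18)   # the two values should match
```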
For any random variable X with H(X) < ∞, Ebrahimi et al. [5] provided lower and upper bounds for the entropies of the sample minimum and maximum. We now obtain these bounds for the EPD with parameters λ and θ. For θ > 1, the EPD density attains its maximum at M = ((λθ + 1)/(λ + 1))^{1/λ} − 1, so that f_X(M) = sup_x f_X(x). So,
1 − log n − 1/n − log f_X(M) ≤ H(Y_1),   (23)

and

H(Y_n) ≤ 1 − log n − 1/n + (n − 1) log f_X(M) + n [−log(λθ) + ((λ + 1)/λ)(ψ(θ + 1) − ψ(1)) + (θ − 1)/θ].   (24)
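As a numerical illustration of (23) (not part of the original text), the lower bound can be compared with the exact value of H(Y_1) from (19); the choices θ = 2, λ = 1.5, n = 8 are arbitrary.

```python
import numpy as np
from math import comb
from scipy.special import digamma

theta, lam, n = 2.0, 1.5, 8   # arbitrary shape parameters and sample size
gamma_e = -digamma(1.0)        # Euler's constant

def epd_pdf(x):
    return theta * lam * (x + 1.0) ** (-(lam + 1.0)) * (1.0 - (x + 1.0) ** (-lam)) ** (theta - 1.0)

# Exact H(Y_1) from (19)
s = sum((-1.0) ** (u - 1) * comb(n, u) * (digamma(theta * u + 1.0) + gamma_e)
        for u in range(1, n + 1))
h_y1 = (1.0 - np.log(n) - 1.0 / n - np.log(theta * lam)
        + (lam + 1.0) / lam * s
        + (theta - 1.0) / theta * (digamma(n) + 1.0 / n + gamma_e))

# Mode M of the EPD density and the lower bound in (23)
M = ((lam * theta + 1.0) / (lam + 1.0)) ** (1.0 / lam) - 1.0
lower_bound = 1.0 - np.log(n) - 1.0 / n - np.log(epd_pdf(M))
print(lower_bound, h_y1)   # the lower bound should not exceed H(Y_1)
```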
The lower and upper bounds in (23) and (24) for the entropies of the sample minimum and maximum depend only on the sample size n and the parameters θ and λ. Finally, consider a random vector (X, Y) with joint density function f(x, y) and marginal densities f_X(x) and f_Y(y). The mutual information of X and Y is defined by

I(X, Y) = ∫∫_S f(x, y) log [f(x, y) / (f_X(x) f_Y(y))] dx dy = H(X) + H(Y) − H(X, Y),

where S is the region f(x, y) > 0 and H(X, Y) is the entropy of (X, Y).
Conclusion
We have derived the exact form of the Shannon entropy for the exponentiated Pareto distribution (EPD) and for its order statistics, and we have obtained bounds for the entropies of the sample minimum and maximum.
Acknowledgements
We would like to thank the Executive Editor and two referees for their helpful comments and suggestions.
References
[1] B. C. Arnold, N. Balakrishnan, and H. N. Nagaraja, A First Course in Order Statistics, Wiley, New York, 1991.

[9] S. Nadarajah and K. Zografos, Formulas for Rényi information and related measures for univariate distributions, Information Sciences, 155 (2003), 119-138.

[15] K. M. Wong and S. Chen, The entropy of ordered sequences and order statistics, IEEE Transactions on Information Theory, 36 (1990), 276-284.
Javad Behboodian
Department of Mathematics
Islamic Azad University - Shiraz Branch
Shiraz, Iran.
E-mail: behboodian@susc.ac.ir
Saeid Tahmasebi
Department of Mathematics and Statistics
Shiraz University
Shiraz 71454, Iran
E-mail: Stahmasby@yahoo.com