
ITC NOTES UNIT-1

Moments and Variance

Let X be a random variable and define another random variable Y as a function of X, so that Y = g(X). Suppose we wish to compute the expected value of Y, i.e., E[Y]. In order to apply (1.58), the expected value of g(X) may be expressed as:

E[Y] = E[g(X)] =
    \sum_i g(x_i)\,P(x_i)                         if X is discrete
    \int_{-\infty}^{\infty} g(x)\,f_X(x)\,dx      if X is continuous    ...(1.59)

provided that the sum or integral on the R.H.S. is absolutely convergent. A special case of interest is the power function g(X) = X^n for n = 1, 2, 3, .... E[X^n] is known as the nth moment of the random variable X, i.e., with g(x) = x^n,

E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx    ...(1.60)

Putting n = 1, the above equation becomes

E[X] = \int_{-\infty}^{\infty} x\,f_X(x)\,dx    ...(1.61)

Hence, the first moment of the random variable X is the same as its expected or mean value. If n = 2, equation (1.60) becomes

E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx    ...(1.62)

where E[X^2] is known as the mean-square value of the random variable X. Similarly, the central moments are the moments of the difference between the random variable X and its expected value E[X]. Therefore, the nth central moment may be given as:

E[(X - E[X])^n] = \int_{-\infty}^{\infty} (x - E[X])^n f_X(x)\,dx    ...(1.63)

The second central moment (n = 2) is known as the variance of the random variable X, i.e.,

Var[X] = E[(X - E[X])^2] = \int_{-\infty}^{\infty} (x - E[X])^2 f_X(x)\,dx    ...(1.64)

Variance is generally denoted by \sigma_x^2, i.e.,

\sigma_x^2 = Var[X] = \int_{-\infty}^{\infty} (x - E[X])^2 f_X(x)\,dx    ...(1.65)

The square root of the variance is known as the standard deviation of the random variable X. The standard deviation measures the spread of the values of X about its expected value. Therefore, standard deviation = \sqrt{Var[X]} = \sqrt{\sigma_x^2} = \sigma_x.
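To make these definitions concrete, here is a minimal NumPy sketch (not part of the original notes; the exponential distribution, its parameter, and the seed are illustrative assumptions). It estimates the first two moments, the variance, and the standard deviation from samples:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100_000)  # samples of X (mean 2 assumed)

mean = x.mean()                    # first moment E[X]
mean_square = (x**2).mean()        # second moment E[X^2]
variance = mean_square - mean**2   # Var[X] = E[X^2] - (E[X])^2
print(mean, mean_square, variance, np.sqrt(variance))  # ~2.0, ~8.0, ~4.0, ~2.0
```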
1.11.3 Covariance and Correlation Coefficient

The (k, n)th moment of a two-dimensional random variable (X, Y) is defined as:

E[X^k Y^n] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^k y^n f_{XY}(x, y)\,dx\,dy    ...(1.66)

The (1, 1)th moment of (X, Y),

E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,y\,f_{XY}(x, y)\,dx\,dy    ...(1.67)

is called the correlation of X and Y. If E[XY] = 0, then we say that X and Y are orthogonal. The covariance of X and Y, denoted by Cov[X, Y] or \sigma_{XY}, is defined as

Cov[X, Y] = \sigma_{XY} = E[(X - E[X])(Y - E[Y])]    ...(1.68)

Expanding equation (1.68), we obtain

Cov[X, Y] = E[XY] - E[X]\,E[Y]    ...(1.69)

If Cov[X, Y] = 0, then we say that X and Y are uncorrelated. From equation (1.69), we see that X and Y are uncorrelated if

E[XY] = E[X]\,E[Y]    ...(1.70)

Note that if X and Y are independent, then it can be shown that they are uncorrelated, but the converse is not true. The correlation coefficient, \rho[X, Y] or \rho_{XY}, is defined as:

\rho[X, Y] = \rho_{XY} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y} = \frac{Cov[X, Y]}{\sqrt{Var[X]\cdot Var[Y]}}    ...(1.71)
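A short numerical sketch of equations (1.69) and (1.71) (again an illustrative assumption, not the notes' own example: Y is constructed from X so that the true correlation coefficient is 0.6):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=50_000)
y = 0.6 * x + 0.8 * rng.normal(size=50_000)  # Var[Y] = 0.36 + 0.64 = 1

cov_xy = np.mean(x * y) - x.mean() * y.mean()  # Cov = E[XY] - E[X]E[Y]
rho = cov_xy / (x.std() * y.std())             # correlation coefficient
print(cov_xy, rho)          # both close to 0.6
print(np.corrcoef(x, y))    # NumPy's built-in estimate agrees
```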

1.13 CHARACTERISTIC (TRANSFORMATION) FUNCTIONS OF RANDOM VARIABLES

In many probability problems, the form of the density function may be so complex as to make computations difficult, if not impossible. Transform methods are useful in problems involving sums of independent random variables, and in solving difference equations (discrete case) and differential equations (continuous case). We will introduce the characteristic transformation functions as (i) the z-transform, (ii) the Laplace transform and (iii) the Fourier transform. We will first define the moment generating function and then derive the above three transforms as special cases.

For a random variable X, e^{X\theta} is another random variable. The expectation or mean of e^{X\theta}, i.e., E[e^{X\theta}], will be a function of \theta. The moment generating function (MGF) M_X(\theta) of the random variable X may be defined as:

M_X(\theta) = E[e^{X\theta}]    ...(1.89)

i.e.,

M_X(\theta) =
    \sum_i e^{x_i\theta}\,P(x_i)                       if X is discrete
    \int_{-\infty}^{\infty} e^{x\theta} f_X(x)\,dx     if X is continuous    ...(1.90)

M_X(\theta) may not exist for all real numbers \theta, but there will be an interval of \theta values within which M_X(\theta) does exist. The closely related characteristic function of the random variable X is given by:

\Phi_X(\omega) = \Phi(\omega) = M_X(j\omega)    ...(1.91)

Here j denotes \sqrt{-1}. \Phi(\omega) is known as the Fourier transform. The advantage of the Fourier transform is that, for any X, its characteristic function \Phi_X(\omega) is always defined for all values of \omega. If X is a non-negative continuous random variable, then we define the (one-sided) Laplace transform:

L_X(s) = L(s) = M_X(-s) = \int_0^{\infty} e^{-sx} f(x)\,dx    ...(1.92)

Finally, if X is a non-negative integer-valued discrete random variable, then we define its z-transform:

G_X(z) = G(z) = E[z^X] = M_X(\ln z)

or

G_X(z) = \sum_{i=0}^{\infty} P_X(i)\,z^i    ...(1.93)

1.13.1 Properties of the Moment Generating Function

The usefulness of transform methods is explained by the following properties of transforms.

1. Linearity: Let Y = aX + b. Then

M_Y(\theta) = E[e^{Y\theta}] = E[e^{(aX + b)\theta}] = E[e^{b\theta}\,e^{aX\theta}] = e^{b\theta}\,E[e^{aX\theta}] = e^{b\theta}\,M_X(a\theta)    ...(1.94)

by the linearity property of expectation.

2. Let X_1, X_2, ..., X_n be mutually independent random variables on a given probability space, and let

Y = \sum_{i=1}^{n} X_i

If M_{X_i}(\theta) exists for all i, then M_Y(\theta) exists, and

M_Y(\theta) = M_{X_1}(\theta)\cdot M_{X_2}(\theta)\cdots M_{X_n}(\theta)    ...(1.95)

Thus the moment generating function of a sum of independent random variables is the product of the individual moment generating functions.

3. If M_{X_1}(\theta) = M_{X_2}(\theta) for all \theta, then F_{X_1}(x) = F_{X_2}(x) for all x. In other words, if two random variables have the same transform, then they have the same distribution function.

1.13.2 Determination of Statistical Averages using the MGF

To study the moment generating property of the MGF, e^{X\theta} can be expanded into a power series:

e^{X\theta} = 1 + \frac{X\theta}{1!} + \frac{X^2\theta^2}{2!} + \cdots + \frac{X^n\theta^n}{n!} + \cdots    ...(1.96)

Taking expectations on both sides (assuming all the expectations exist):

M(\theta) = E[e^{X\theta}] = 1 + E[X]\theta + \frac{E[X^2]\theta^2}{2!} + \cdots = \sum_{n=0}^{\infty} \frac{E[X^n]\theta^n}{n!}    ...(1.97)

Therefore, the coefficient of \theta^n/n! in the power series expansion of the MGF yields the nth moment E[X^n] of the random variable X. Alternatively,

E[X^n] = \left.\frac{d^n M(\theta)}{d\theta^n}\right|_{\theta = 0},  n = 1, 2, ...    ...(1.98)

Note that E[X^0] = M(0) = 1.

Illustrative Example 1. Find the moment generating function of the exponential distribution:

f_X(x) = \frac{1}{\lambda}\,e^{-x/\lambda},  0 < x < \infty,  \lambda > 0

Hence, find its mean and variance.

Solution. The MGF of the exponential distribution is given as:

M_X(\theta) = \frac{1}{\lambda} \int_0^{\infty} e^{x\theta}\,e^{-x/\lambda}\,dx

or

M_X(\theta) = \frac{1}{\lambda} \int_0^{\infty} e^{(\theta - 1/\lambda)x}\,dx = \frac{1}{\lambda}\cdot\frac{1}{1/\lambda - \theta},  \theta < \frac{1}{\lambda}

or

M_X(\theta) = (1 - \lambda\theta)^{-1} = 1 + \lambda\theta + \lambda^2\theta^2 + \lambda^3\theta^3 + \cdots

Now,

E[X] = \left.\frac{d}{d\theta} M_X(\theta)\right|_{\theta=0} = (\lambda + 2\lambda^2\theta + 3\lambda^3\theta^2 + \cdots)_{\theta=0} = \lambda

and

E[X^2] = \left.\frac{d^2}{d\theta^2} M_X(\theta)\right|_{\theta=0} = 2\lambda^2

so that

Var[X] = E[X^2] - (E[X])^2 = 2\lambda^2 - \lambda^2 = \lambda^2

Hence, the mean is \lambda and the variance is \lambda^2.
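This derivative-of-the-MGF calculation can be checked symbolically. A minimal SymPy sketch (not from the notes; the symbol names are arbitrary):

```python
import sympy as sp

theta, lam = sp.symbols('theta lamda', positive=True)
M = 1 / (1 - lam * theta)                   # MGF of the exponential distribution

m1 = sp.diff(M, theta, 1).subs(theta, 0)    # E[X]   -> lamda
m2 = sp.diff(M, theta, 2).subs(theta, 0)    # E[X^2] -> 2*lamda**2
print(m1, m2, sp.simplify(m2 - m1**2))      # lamda, 2*lamda**2, lamda**2
```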

LESSON-3

1.11 CONVERGENCE OF A SEQUENCE OF RANDOM VARIABLES

Let X_1, X_2, X_3, ..., X_n be n mutually independent, identically distributed random variables. An n-tuple of values (x_1, x_2, ..., x_n), where x_i is a specific value of X_i, may be thought of as n independent measurements of some quantity that is distributed according to their common distribution. In this sense, we sometimes speak of the n-tuple (X_1, X_2, ..., X_n) as a random sample of size n from this distribution. It is intuitively clear that the running average of this sequence, i.e.,

Y_n = \frac{1}{n}\sum_{i=1}^{n} X_i    ...(1.99)

should converge to the average of the random variables. Two limit theorems, the law of large numbers (LLN) and the central limit theorem (CLT), state how this running average behaves as n becomes large.

1.11.1 Law of Large Numbers

The law of large numbers states that if {X_i, i = 1, 2, ..., n} is a sequence of mutually independent, identically distributed random variables with E[X_1] < \infty, then

\frac{1}{n}\sum_{i=1}^{n} X_i \to E[X_1]    ...(1.100)

where the type of convergence is convergence almost everywhere (convergence almost surely), meaning that the set of points in the probability space for which the left-hand side does not converge to the right-hand side has zero probability.

1.11.2 Central Limit Theorem

The central limit theorem states that if {X_i, i = 1, 2, ..., n} is a sequence of mutually independent, identically distributed random variables with mean m = E[X_i] < \infty and \sigma^2 = Var[X_i] < \infty, and we form the normalized sum

Z_n = \frac{\sum_{i=1}^{n} (X_i - E[X_i])}{\sigma\sqrt{n}}    ...(1.101)

such that E[Z_n] = 0 and Var[Z_n] = 1, then under certain regularity conditions the limiting distribution of Z_n is the standard normal distribution:

Z_n \to N(0, 1)    ...(1.102)

where N(0, 1) denotes the normal distribution having zero mean and unity variance. The type of convergence in the CLT is convergence in distribution, meaning that the CDF of the left-hand side converges to the CDF of N(0, 1) as n increases.
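Both limit theorems are easy to watch numerically. A short illustrative simulation (not from the notes; the exponential distribution with m = \sigma = 1 and the seed are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 10_000, 5_000
x = rng.exponential(scale=1.0, size=(trials, n))  # iid X_i with m = 1, sigma = 1

# LLN: the running average of one long sequence settles near E[X] = 1
avg = x[0].cumsum() / np.arange(1, n + 1)
print(avg[[99, 999, 9999]])

# CLT: the normalized sums Z_n across many trials look standard normal
z = (x.sum(axis=1) - n * 1.0) / (1.0 * np.sqrt(n))
print(z.mean(), z.std())           # ~0 and ~1
print(np.mean(np.abs(z) < 1.96))   # ~0.95, as for N(0, 1)
```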

SO$VE" E4AM%$ES

Example 1.1 Two dice are thrown simultaneously. Find the probability of getting a sum of 6.

Solution. Each die has six faces, so the total number of possible outcomes is 6 × 6 = 36. A sum of 6 can be obtained on throwing two dice simultaneously as (1,5), (2,4), (3,3), (4,2), (5,1), i.e., in 5 ways. So the probability of getting a sum of 6 is

P(A) = (number of outcomes giving sum 6) / (total number of possible outcomes) = 5/36
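The count can be verified by brute-force enumeration; a tiny sketch (illustrative, not part of the notes):

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))            # all 36 rolls of two dice
favourable = [roll for roll in outcomes if sum(roll) == 6]
print(len(favourable), len(outcomes))                      # 5, 36
print(len(favourable) / len(outcomes))                     # 0.1388... = 5/36
```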

Example 1.2 Three students A, B and C are given a numerical problem in their physics class. The probabilities of their solving it are 1/2, 2/3 and 3/4 respectively. Determine the probability that the problem is solved if all of them try it.

Solution. The probability that student A will not be able to solve the problem is

1 - 1/2 = 1/2

Similarly, the probabilities that B and C will not be able to solve it are 1 - 2/3 = 1/3 and 1 - 3/4 = 1/4 respectively.

Therefore, the probability that no one solves the problem is

P(no one solves) = (1/2) × (1/3) × (1/4) = 1/24

Hence, the probability that the problem will be solved is

P(solved) = 1 - 1/24 = 23/24

Example 1.3 A can hit a target 3 times in 5 shots, B can hit 2 times in 5 shots and C can hit 3 times in 4 shots. They fire a volley. What is the probability that (i) exactly two shots hit, (ii) at least two shots hit?

Solution.

Probability of A hitting the target = 3/5
Probability of B hitting the target = 2/5
Probability of C hitting the target = 3/4

(i) In order that exactly two shots hit the target, the following cases must be considered:

P_1 = chance that A and B hit and C fails = (3/5) × (2/5) × (1 - 3/4) = 6/100

P_2 = chance that B and C hit and A fails = (2/5) × (3/4) × (1 - 3/5) = 12/100

P_3 = chance that C and A hit and B fails = (3/4) × (3/5) × (1 - 2/5) = 27/100

Since these are mutually exclusive events, the probability that exactly two shots hit is

P_1 + P_2 + P_3 = 6/100 + 12/100 + 27/100 = 45/100 = 0.45

(ii) In order that at least two shots hit the target, we must also consider the case of A, B and C all hitting the target, for which

P_4 = chance that A, B and C all hit = (3/5) × (2/5) × (3/4) = 18/100

Since all these are mutually exclusive events, the probability of at least two shots hitting is

P_1 + P_2 + P_3 + P_4 = 6/100 + 12/100 + 27/100 + 18/100 = 63/100 = 0.63
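The same answers follow by summing over all hit/miss patterns of the volley; an illustrative enumeration sketch (not from the notes):

```python
from itertools import product

p = {'A': 3/5, 'B': 2/5, 'C': 3/4}      # hit probabilities from Example 1.3

exactly_two = at_least_two = 0.0
for hits in product([0, 1], repeat=3):  # every hit/miss pattern of the volley
    prob = 1.0
    for shooter, h in zip(p, hits):
        prob *= p[shooter] if h else 1 - p[shooter]
    if sum(hits) == 2:
        exactly_two += prob
    if sum(hits) >= 2:
        at_least_two += prob

print(exactly_two, at_least_two)        # 0.45, 0.63
```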

2.2 RANDOM PROCESSES

Consider a random experiment specified by the outcomes s from some sample space S. That is, the outcome s will be one of the sample points in the sample space S every time the experiment is conducted. If the outcome s is associated with the time parameter t, then the function of the outcome s and time t, i.e., X(s, t), is known as a random process. A random process X(s, t) is therefore a function of two parameters, the outcome s and time t. For a specific outcome s, say s_i, we have a single time function X(s_i, t) = X_i(t). This time function is called a sample function or a realization of the process. For a specific time t_j, X_{t_j}(s) = X(t_j, s) is a random variable, denoted by X(t_j), as s varies over the sample space S. When both s and t are varied, we have the family of random variables constituting a random process. The schematic diagram representing the random process is given in Fig. 2.1, which illustrates the sample space of the random experiment, the outcomes of the experiment, the associated sample functions, and the random variables obtained by sampling the sample functions at two time instants t_1 and t_2.

[Fig. 2.1 Random process: sample space S with outcomes s_1, s_2, ..., s_n, their sample functions X_1(t), X_2(t), ..., X_n(t), and the random variables X_i(t_1), X_i(t_2) observed at times t_1 and t_2]

Thus, a random process is a family of random variables {X(t), t \in T}, defined on a given probability space, indexed by the parameter t, where t varies over an index set T. From the above figure, we note that for a fixed time t_j inside the observation interval, the set of numbers

{X_1(t_j), X_2(t_j), ..., X_n(t_j)} = {X(s_1, t_j), X(s_2, t_j), ..., X(s_n, t_j)}    ...(2.1)

constitutes a random variable. Thus, we have an indexed ensemble of random variables X(s, t), which is called a random process.

However, we may distinguish between a random variable and a random process as follows: for a random variable, the outcome of a random experiment is mapped into a number, whereas for a random process, the outcome of a random experiment is mapped into a waveform that is a function of time. For a fixed time t = t_1, X(t_1) is a single random variable that describes the state of the process at time t_1. For a fixed number x_1, the probability of the event {X(t_1) \le x_1} gives the CDF of the random variable X(t_1), denoted by

F(x_1; t_1) = F_{X(t_1)}(x_1) = P{X(t_1) \le x_1}    ...(2.2)

where F(x_1; t_1) is known as the first-order distribution of the process {X(t)}. Given two time instants t_1 and t_2, X(t_1) and X(t_2) are two random variables on the same probability space. Their joint distribution is known as the second-order distribution of the process and is given by:

F(x_1, x_2; t_1, t_2) = P{X(t_1) \le x_1, X(t_2) \le x_2}    ...(2.3)

In general, we define the nth-order joint distribution of the random process {X(t), t \in T} by

F(x; t) = F(x_1, x_2, ..., x_n; t_1, t_2, ..., t_n) = P{X(t_1) \le x_1, X(t_2) \le x_2, ..., X(t_n) \le x_n}    ...(2.4)

for all x = (x_1, x_2, ..., x_n) \in R^n and t = (t_1, t_2, ..., t_n) \in T^n such that t_1 < t_2 < ... < t_n. The corresponding nth-order density function is

f_X(x_1, x_2, ..., x_n; t_1, t_2, ..., t_n) = \frac{\partial^n F_X(x_1, x_2, ..., x_n; t_1, t_2, ..., t_n)}{\partial x_1\,\partial x_2\cdots\partial x_n}    ...(2.5)

2.3 STATISTICAL AVERAGES OF A RANDOM PROCESS

We know that random variables may be specified with the help of different probability distributions. Random processes, however, cannot in practice be completely specified with the help of PDFs. Statistical averages, in general ensemble and time statistics, are used to specify random processes. Most commonly, statistical averages such as the mean m_x(t) and the autocorrelation function R_{XX}(t_1, t_2) are used to describe a random process.

2.3.1 Ensemble Averages

In ensemble averages, the average is taken over the ensemble of waveforms, keeping the time fixed. As an example, if we desire the ensemble mean value of the random process of Fig. 2.1 at time t = t_1, we take the mean of x_1(t_1), x_2(t_1), ..., x_n(t_1). In the same way, we can take the ensemble mean at time t = t_2 as well. The ensemble means at t = t_1 and t = t_2 need not be the same. Similarly, ensemble means at other values of t may also be evaluated. Thus, the ensemble mean or expectation is a function of time t, denoted by m_x(t) or E[X(t)]. We know that the mean value or expectation of a continuous random variable is given by

m_x = E[X] = \int_{-\infty}^{\infty} x\,f_X(x)\,dx    ...(2.6)

Using this expression, the ensemble mean or expectation may be defined as:

m_x(t) = E[X(t)] = \int_{-\infty}^{\infty} x\,f_X(x, t)\,dx    ...(2.7)

In this equation, the time t is treated as constant and the integration is done with respect to x. The next ensemble average is the autocorrelation function, which may be expressed as:

R_{XX}(t_1, t_2) = E[X(t_1)\,X(t_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\,f_X(x_1, x_2; t_1, t_2)\,dx_1\,dx_2    ...(2.8)

The value of R_{XX}(t_1, t_2) represents the similarity of amplitudes at times t_1 and t_2; it is obtained by multiplying the amplitudes of the sample functions at t_1 and t_2 and taking the mean of this product over the ensemble.

2.3.2 Time Averages

When the statistical averages are taken along time, they are known as time averages. We may define the mean value of a sample function x(t) as:

\langle x(t) \rangle = \lim_{T\to\infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,dt    ...(2.9)

Equation (2.9) is the standard equation for the average or mean value of a function. The autocorrelation function may be expressed using time averages as:

\langle x(t)\,x(t+\tau) \rangle = \lim_{T\to\infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,x(t+\tau)\,dt    ...(2.10)
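The distinction between the two kinds of average is easy to see numerically. A minimal sketch (an illustrative assumption, not the notes' example: a random-phase cosine ensemble, which happens to be ergodic in the mean):

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_t = 2_000, 500
t = np.linspace(0, 10, n_t)
theta = rng.uniform(0, 2 * np.pi, size=(n_paths, 1))  # random phase per sample function
x = np.cos(2 * np.pi * t + theta)                      # ensemble of sample functions

ensemble_mean = x.mean(axis=0)  # average across the ensemble at each fixed t
time_mean = x.mean(axis=1)      # average along time for each sample function
print(ensemble_mean[:3])        # ~0 at every t
print(time_mean[:3])            # ~0 for every path as well
```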

2.4 STATIONARY RANDOM PROCESSES

As already explained, the nth-order joint distribution of a random process {X(t), t \in T} is given by

F_X(x; t) = P{X(t_1) \le x_1, X(t_2) \le x_2, ..., X(t_n) \le x_n} = F_X(x_1, x_2, ..., x_n; t_1, t_2, ..., t_n)

and the nth-order joint density function is given by

f_X(x; t) = f_X(x_1, x_2, ..., x_n; t_1, t_2, ..., t_n)

A complete statistical description of a random process requires knowledge of all-order distribution functions. The above discussion of random processes and their distribution and density functions leads to two important classes of random processes, discussed below.

2.4.1 Strictly Stationary Process

A random process X(t) is said to be stationary if its statistical properties do not change with time. More precisely, a process X(t) is said to be a strict-sense stationary (SSS) process if its nth-order joint distribution function is invariant under shifts of the time origin, i.e., for n \ge 1, its nth-order joint CDF satisfies the condition

F(x; t) = F(x_1, x_2, ..., x_n; t_1, t_2, ..., t_n) = F(x_1, x_2, ..., x_n; t_1 + \tau, t_2 + \tau, ..., t_n + \tau)    ...(2.11)

for all vectors x \in R^n, for every order n, and all scalars \tau such that t_j + \tau \in T. Here, the notation t + \tau implies that the scalar \tau is added to all components of the vector t:

F(x; t) = F(x; t + \tau)    ...(2.12)

The expected value of a general random process X(t) is defined as

m_x(t) = E[X(t)] = \int_{-\infty}^{\infty} x\,f_X(x, t)\,dx    ...(2.13)

In general, it is a time-varying quantity. The expected value m_x(t) is often called a first-order statistic, since it depends on the first-order density function. If we apply the definition of a strictly stationary process to the first-order CDF, then we get F(x; t) = F(x; t + \tau), i.e., F_{X(t)} = F_{X(t + \tau)} for all \tau. Then the mean

m_x = E[X(t)] = \int_{-\infty}^{\infty} x\,f_X(x)\,dx    ...(2.14)

is constant and time independent, i.e., m_x(t) = m_x for all t \in T.

2.4.2 Wide-Sense Stationary Process

Before defining the wide-sense stationary (WSS) process, we discuss the autocorrelation function R. The autocorrelation function provides a measure of dependence among the random variables of a random process. Mathematically, the autocorrelation function of X(t) is defined as:

R_{XX}(t_1, t_2) = E[X(t_1)\,X(t_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\,f(x_1, x_2; t_1, t_2)\,dx_1\,dx_2    ...(2.15)

In general, R depends upon the two time variables t_1 and t_2. Also, R is an example of a second-order statistic, since it depends on the second-order density function. If we apply the definition of an SSS process, then the autocorrelation function of X(t) becomes

R_{XX}(\tau) = E[X(t)\,X(t + \tau)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\,f(x_1, x_2; \tau)\,dx_1\,dx_2    ...(2.16)

It depends only on the time difference \tau = t_2 - t_1 (not upon absolute time). Now, a random process is said to be wide-sense stationary (WSS) if:

1. The mean m_X(t) = E[X(t)] is a constant m_X, i.e., independent of t.    ...(2.17)
2. The autocorrelation R_{XX}(\tau) = E[X(t)\,X(t + \tau)] depends only on the time difference. It may also be written as R_{XX}(t_1, t_2) = R_{XX}(0, t_2 - t_1) = R_{XX}(\tau), with \tau = t_2 - t_1.    ...(2.18)
3. It has a finite second-order moment, i.e., R_{XX}(0) = E[X^2(t)] < \infty.    ...(2.19)

Note that a strict-sense stationary process is wide-sense stationary. However, the converse is not true, i.e., WSS does not imply strict-sense stationarity.

2.5 ERGODIC PROCESS

A random process X(t) is said to be ergodic if its time averages are the same as its ensemble averages for all orders of statistical averages. A stationary process X(t) is called ergodic in the mean if

\bar{X} = \langle x(t) \rangle = E[X(t)] = m_x    ...(2.20)

Similarly, a stationary process X(t) is called ergodic in the autocorrelation if

R_{XX}(\tau) = \langle x(t)\,x(t + \tau) \rangle = E[X(t)\,X(t + \tau)]    ...(2.21)

Thus an ergodic process is always stationary, but the converse is not true. The ergodicity of a random process may be determined by evaluating the statistical averages of a single sample function. This means that a single sample function represents the entire random process.

2.5.1 Properties of an Ergodic Random Process

Testing the ergodicity of a random process is usually very difficult. A reasonable assumption in the analysis of most communication signals is that the random waveforms are ergodic in the mean and in the autocorrelation. The dc level, root-mean-square (rms) value and average power can then be related to the moments of an ergodic random process. The properties of an ergodic process are summarized as follows:

1. \bar{X} = \langle x(t) \rangle is equal to the dc level of the signal.
2. \bar{X}^2 = \langle x(t) \rangle^2 is equal to the normalized power in the dc component.
3. R_{XX}(0) = \langle x^2(t) \rangle is equal to the total average normalized power.
4. \sigma_X^2 = \langle x^2(t) \rangle - \langle x(t) \rangle^2 is equal to the average normalized power in the time-varying or ac component of the signal.
5. \sigma_X is equal to the rms value of the ac component of the signal.
Illustrative Example 1. Consider a random process X(t) given by

X(t) = A\cos(\omega t + \theta)

where \omega and \theta are constants and A is a random variable. Determine whether X(t) is a wide-sense stationary process or not.

Solution. A random process X(t) is WSS only if its mean is constant. Here

m_x(t) = E[X(t)] = E[A\cos(\omega t + \theta)] = \cos(\omega t + \theta)\,E[A]

which indicates that the mean of X(t) is not constant unless E[A] = 0.

The autocorrelation of X(t) is

R_{XX}(t, t + \tau) = E[X(t)\,X(t + \tau)]
= E[A^2 \cos(\omega t + \theta)\cos(\omega(t + \tau) + \theta)]
= \cos(\omega t + \theta)\cos(\omega(t + \tau) + \theta)\,E[A^2]
= \frac{1}{2}\,[\cos(\omega\tau) + \cos(2\omega t + \omega\tau + 2\theta)]\,E[A^2]

Thus, we see that the autocorrelation of X(t) is not a function of the time difference \tau only. Therefore, the process X(t) is not WSS.

Illustrative Example 2. Consider a random process X(t) given by

X(t) = A\cos\omega t + B\sin\omega t

where \omega is a constant, and A and B are random variables. (i) Show that the condition E[A] = E[B] = 0 is necessary for X(t) to be stationary.

Solution (sketch of part (i)). The mean is m_x(t) = E[A]\cos\omega t + E[B]\sin\omega t. For m_x(t) to be independent of t, we require E[A] = E[B] = 0; hence the condition is necessary for X(t) to be stationary.
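The conclusion of Example 1 can also be seen by simulation. An illustrative sketch (not from the notes; \theta = 0 and a normal amplitude with E[A] = 1 are assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
A = rng.normal(loc=1.0, scale=0.5, size=n)  # random amplitude with E[A] = 1 != 0
w = 2 * np.pi

for t in (0.0, 0.1, 0.25):
    # ensemble mean of X(t) = A*cos(w*t) (theta = 0 assumed for simplicity)
    print(t, np.mean(A * np.cos(w * t)))    # varies with t => mean not constant, not WSS
```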

LESSON-2

2.6 CORRELATION FUNCTION

The correlation function provides a measure of similarity between a given signal (or process) and a replica of the same signal or another signal (or process) shifted by a variable amount.

2.6.1 Auto-correlation Function

The auto-correlation function may be defined as a measure of the similarity between a signal or process and a replica of itself shifted by a variable amount. The auto-correlation function of a stationary process x(t) may be defined as

R_{xx}(t_j - t_i) = E[x(t_j)\,x(t_i)]  for any t_i and t_j    ...(2.22)

where x(t_i) and x(t_j) are the random variables obtained by observing the process x(t) at times t_i and t_j respectively. The auto-correlation function depends only on the time difference (t_j - t_i). Using \tau = t_j - t_i in equation (2.22), we get

R_{xx}(\tau) = E[x(t)\,x(t - \tau)]    ...(2.23)

In this expression, x(t) and x(t - \tau) are viewed as random variables. The variable \tau is known as the time-lag or time-delay parameter. The auto-correlation function of a stationary random process is independent of a shift of the time origin.

Properties of the Auto-correlation Function of a Random Process

The auto-correlation function R_{xx}(\tau) of a wide-sense stationary (WSS) process has the following important properties.

(i) The auto-correlation function of a WSS process x(t) is always an even function of the time-lag, i.e.,

R_{xx}(\tau) = R_{xx}(-\tau)    ...(2.24)

Proof: According to the definition of the auto-correlation function of a WSS random process, we have

R_{xx}(\tau) = E[x(t)\,x(t - \tau)]

Substituting -\tau for \tau, we have

R_{xx}(-\tau) = E[x(t)\,x(t + \tau)] = E[x(t + \tau)\,x(t)]    ...(2.25)

For a WSS process, E[x(t)\,x(t - \tau)] = E[x(t + \tau)\,x(t)], where \tau is the time-lag parameter. From the above equations, we conclude that

R_{xx}(\tau) = R_{xx}(-\tau)

(ii) The mean-square value of a WSS process is equal to the auto-correlation function of the random process for zero time-lag, i.e., \tau = 0:

R_{xx}(0) = R_{xx}(\tau)\big|_{\tau = 0}    ...(2.26)

Proof:

R_{xx}(0) = E[x(t)\,x(t - \tau)]\big|_{\tau = 0} = E[x(t)\,x(t - 0)] = E[x^2(t)]

or

R_{xx}(0) = E[x^2(t)]    ...(2.27)

(iii) The auto-correlation function of a WSS random process has its maximum magnitude at zero time-lag (\tau = 0), i.e.,

|R_{xx}(\tau)|_{max} = R_{xx}(0)    ...(2.28)

Proof: We know that the mean-square value of the difference between x(t) and x(t - \tau) is always non-negative, i.e.,

E[\{x(t) - x(t - \tau)\}^2] \ge 0

or

E[x^2(t) - 2x(t)\,x(t - \tau) + x^2(t - \tau)] \ge 0    ...(2.29)

Since the expectation operation is a linear operation, we can write the above equation as

E[x^2(t)] - 2E[x(t)\,x(t - \tau)] + E[x^2(t - \tau)] \ge 0    ...(2.30)

Also, for a WSS random process, we have

E[x^2(t)] = E[x^2(t - \tau)] = R_{xx}(0)    ...(2.31(a))
E[x(t)\,x(t - \tau)] = R_{xx}(\tau)    ...(2.31(b))

Substituting equations (2.31) in equation (2.30), we get

|R_{xx}(\tau)| \le R_{xx}(0)    ...(2.32)

Thus, the auto-correlation function R_{xx}(\tau) provides a means of describing the interdependence of two random variables obtained by observing the random process x(t) at times \tau seconds apart. The auto-correlation function is a measure of the rate of fluctuation of the random process.
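Properties (i) and (iii) are easy to check on an estimated autocorrelation. A minimal sketch (illustrative assumptions: a colored-noise path made by smoothing white noise, and a simple time-average estimator):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000
x = np.convolve(rng.normal(size=n), np.ones(20) / 20, mode='same')  # colored noise

def R_hat(k):
    """Time-average estimate of R_xx(k) for integer lag k (can be negative)."""
    a, b = max(0, -k), min(n, n - k)   # overlap so both indices stay in range
    return np.mean(x[a:b] * x[a + k:b + k])

print(R_hat(10), R_hat(-10))          # nearly equal: R is even in the lag
print(R_hat(0) >= abs(R_hat(10)))     # maximum magnitude at zero lag
```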

2.6.2 Cross-correlation Function

The auto-correlation function is determined for a single random process, whereas the cross-correlation function is determined for two random processes. Consider two random processes x(t) and y(t); the cross-correlation between the two random processes is defined as

R_{xy}(\tau) = E[x(t)\,y(t + \tau)]    ...(2.33)

Properties of the Cross-correlation Function of Random Processes

The cross-correlation function R_{xy}(\tau) of WSS processes has the following important properties.

1. The cross-correlation function is not an even function of \tau like the auto-correlation function, but it has the following symmetry relations:

R_{xy}(\tau) = R_{yx}(-\tau)    ...(2.33(a))
R_{xy}(-\tau) = R_{yx}(\tau)    ...(2.33(b))

Proof: Using equation (2.33), we have

R_{xy}(-\tau) = E[x(t)\,y(t - \tau)]

Setting t - \tau = t', we obtain

R_{xy}(-\tau) = E[x(t' + \tau)\,y(t')] = E[y(t')\,x(t' + \tau)] = R_{yx}(\tau)    ...(2.34)

2. The cross-correlation function does not necessarily have a maximum at the origin like the auto-correlation function. Two random processes x(t) and y(t) are said to be incoherent or orthogonal if the cross-correlation function of x(t) and y(t) is zero, i.e.,

R_{xy}(\tau) = 0    ...(2.35)

3. |R_{xy}(\tau)| \le \sqrt{R_{xx}(0)\,R_{yy}(0)}    ...(2.36)

2.6.3 Auto-covariance Function

The auto-covariance function of a random process x(t) may be defined as

C_{xx}(\tau) = E[\{x(t) - E[x(t)]\}\{x(t + \tau) - E[x(t + \tau)]\}]    ...(2.37)

On expanding equation (2.37), we have

C_{xx}(\tau) = R_{xx}(\tau) - m_x^2    ...(2.38)

where R_{xx}(\tau) is the auto-correlation function of the process x(t) and m_x^2 is the square of the mean of the process x(t).

2.6.4 Cross-covariance Function

The cross-covariance function of random processes x(t) and y(t) may be defined as

C_{xy}(\tau) = E[\{x(t) - E[x(t)]\}\{y(t + \tau) - E[y(t + \tau)]\}]    ...(2.39)

or

C_{xy}(\tau) = R_{xy}(\tau) - m_x m_y    ...(2.40)

where R_{xy}(\tau) is the cross-correlation of the random processes x(t) and y(t), and m_x and m_y are the (time-averaged) means of the sample functions of x(t) and y(t) respectively. The following points are worth noting:

(i) Two processes x(t) and y(t) are called (mutually) orthogonal if

R_{xy}(\tau) = 0    ...(2.41)

(ii) Two processes x(t) and y(t) are called uncorrelated if

C_{xy}(\tau) = 0    ...(2.42)

Illustrative Example 3. Two random processes x(t) and y(t) are given by

x(t) = A\cos(\omega t + \theta)
y(t) = A\sin(\omega t + \theta)

where A and \omega are constants and \theta is a uniform random variable over [0, 2\pi]. Find the cross-correlation of x(t) and y(t).

Solution. The cross-correlation of x(t) and y(t) is

R_{xy}(t, t + \tau) = E[x(t)\,y(t + \tau)]
= E[A^2 \cos(\omega t + \theta)\sin(\omega(t + \tau) + \theta)]
= \frac{A^2}{2}\,E[\sin(2\omega t + \omega\tau + 2\theta) + \sin(\omega\tau)]
= \frac{A^2}{2}\sin(\omega\tau) = R_{xy}(\tau)

Similarly,

R_{yx}(t, t + \tau) = E[y(t)\,x(t + \tau)]
= E[A^2 \sin(\omega t + \theta)\cos(\omega(t + \tau) + \theta)]
= \frac{A^2}{2}\,E[\sin(2\omega t + \omega\tau + 2\theta) - \sin(\omega\tau)]
= -\frac{A^2}{2}\sin(\omega\tau) = R_{yx}(\tau)

Also,

R_{xy}(-\tau) = \frac{A^2}{2}\sin(-\omega\tau) = -\frac{A^2}{2}\sin(\omega\tau) = R_{yx}(\tau)

which verifies the symmetry property of the cross-correlation function.

2.7 SPECTRAL DENSITIES

Spectral densities are used for the representation of random processes in the frequency domain. So far, we have considered the characterization of stationary random processes in the time domain; the characterization of random processes in the frequency domain is as follows.

2.7.1 Power Spectral Density

Let R_{xx}(\tau) be the auto-correlation function of x(t). Then the power spectral density (or power spectrum) of a wide-sense stationary (WSS) random process x(t) is defined as the Fourier transform of R_{xx}(\tau):

S_{xx}(\omega) = \int_{-\infty}^{\infty} R_{xx}(\tau)\,e^{-j\omega\tau}\,d\tau    ...(2.43)

Thus, we can say that the Fourier transform of the auto-correlation function of a random process is called the power spectral density (PSD) of that random process. The PSD measures how the power of the process is distributed over frequency.

Properties of the Power Spectral Density

The power spectral density S_{xx}(\omega) and the auto-correlation function R_{xx}(\tau) of a stationary process x(t) form a Fourier-transform pair with \tau and \omega as the variables of interest, as shown by the pair of relations

S_{xx}(\omega) = \int_{-\infty}^{\infty} R_{xx}(\tau)\,e^{-j\omega\tau}\,d\tau    ...(2.44(a))

R_{xx}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{xx}(\omega)\,e^{j\omega\tau}\,d\omega    ...(2.44(b))

Equations (2.44) are basic relations in the theory of spectral analysis of random processes; together they are usually called the Einstein–Wiener–Khintchine relations. We now use this pair of relations to derive some general properties of the power spectral density of a WSS process, as follows.

1. The zero-frequency value of the power spectral density of a stationary process equals the total area under the graph of the auto-correlation function; that is,

S_{xx}(0) = \int_{-\infty}^{\infty} R_{xx}(\tau)\,d\tau    ...(2.45)

Proof: This property follows directly from equation (2.44(a)). Putting \omega = 0, we have

S_{xx}(0) = \int_{-\infty}^{\infty} R_{xx}(\tau)\,e^{-j(0)\tau}\,d\tau = \int_{-\infty}^{\infty} R_{xx}(\tau)\,d\tau
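The transform pair (2.44) can be checked numerically for a known case. An illustrative sketch (the pair R(\tau) = e^{-a|\tau|} \leftrightarrow S(\omega) = 2a/(a^2 + \omega^2) is a standard result used again in Example 6 below; the parameter values are assumptions):

```python
import numpy as np

a = 1.5
tau = np.linspace(-40, 40, 20001)
dtau = tau[1] - tau[0]
R = np.exp(-a * np.abs(tau))                  # autocorrelation e^{-a|tau|}

for w in (0.0, 1.0, 3.0):
    # the Fourier transform of an even R reduces to a cosine integral
    S_num = np.sum(R * np.cos(w * tau)) * dtau
    S_exact = 2 * a / (a**2 + w**2)           # known transform pair
    print(w, S_num, S_exact)                  # numerical and exact values agree
```

Note that the w = 0 line is exactly property (2.45): S(0) equals the area under R(\tau).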

2.7.2 Cross Power Spectral Density

The power spectral density provides a measure of the frequency distribution of a single random process; the cross power spectral density provides a measure of the frequency interrelationship between two random processes. Let X(t) and Y(t) be two jointly stationary random processes with cross-correlation functions R_{XY}(\tau) and R_{YX}(\tau). Then we define the cross power spectral densities S_{XY}(\omega) and S_{YX}(\omega) of this pair of random processes as the Fourier transforms of their respective cross-correlation functions:

S_{XY}(\omega) = \int_{-\infty}^{\infty} R_{XY}(\tau)\,e^{-j\omega\tau}\,d\tau    ...(2.51(a))

S_{YX}(\omega) = \int_{-\infty}^{\infty} R_{YX}(\tau)\,e^{-j\omega\tau}\,d\tau    ...(2.51(b))

The cross-correlation functions and cross power spectral densities thus form Fourier-transform pairs, and we have

R_{XY}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{XY}(\omega)\,e^{j\omega\tau}\,d\omega    ...(2.52(a))

R_{YX}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{YX}(\omega)\,e^{j\omega\tau}\,d\omega    ...(2.52(b))

The cross power spectral densities S_{XY}(\omega) and S_{YX}(\omega) are not necessarily real functions of the frequency \omega. However, substituting the relationship R_{XY}(\tau) = R_{YX}(-\tau) into equations (2.51(a)) and (2.51(b)), we find that S_{XY}(\omega) and S_{YX}(\omega) are related by

S_{XY}(\omega) = S_{YX}(-\omega) = S^*_{YX}(\omega)    ...(2.53)

2.7.3 Energy Spectral Density

The energy spectral density \Psi_X(\omega) may be defined as a measure of the density of the energy contained in the random process X(t), in joules per hertz. It may be noted that since the amplitude spectrum of a real-valued random process X(t) is an even function of \omega, the energy spectral density of such a signal is symmetrical about the vertical axis passing through the origin. The total energy of the random process X(t) is given by

E = \frac{1}{2\pi}\int_{-\infty}^{\infty} \Psi_X(\omega)\,d\omega    ...(2.54)

Illustrative Example 4. A modulated random signal Y(t) is defined by

Y(t) = A\,X(t)\cos(\omega_c t + \Theta)

where X(t) is the random message signal and A\cos(\omega_c t + \Theta) is the carrier. The random message signal X(t) is a zero-mean stationary random process with auto-correlation R_{XX}(\tau) and power spectrum S_{XX}(\omega). The carrier amplitude A and frequency \omega_c are constants, and the phase \Theta is a random variable uniformly distributed over [0, 2\pi]. Assuming that X(t) and \Theta are independent, find (a) the mean, (b) the auto-correlation, and (c) the power spectrum of Y(t).

Solution. (a) The mean of Y(t) is given by

m_Y(t) = E[Y(t)] = E[A\,X(t)\cos(\omega_c t + \Theta)]

Since X(t) and \Theta are independent,

m_Y(t) = A\,E[X(t)]\,E[\cos(\omega_c t + \Theta)]

As the mean of the random message signal X(t) is zero, i.e., E[X(t)] = 0, therefore m_Y(t) = 0.

(b) The auto-correlation function of Y(t) is

R_{YY}(t, t + \tau) = E[Y(t)\,Y(t + \tau)]
= E[A^2\,X(t)\,X(t + \tau)\cos(\omega_c t + \Theta)\cos(\omega_c(t + \tau) + \Theta)]
= \frac{A^2}{2}\,E[X(t)\,X(t + \tau)]\,E[\cos(\omega_c\tau) + \cos(2\omega_c t + \omega_c\tau + 2\Theta)]
= \frac{A^2}{2}\,R_{XX}(\tau)\cos(\omega_c\tau) = R_{YY}(\tau)

(c) Since the mean of Y(t) is a constant and the auto-correlation function of Y(t) depends only on the time difference \tau, Y(t) is a wide-sense stationary process. Thus

S_{YY}(\omega) = F[R_{YY}(\tau)] = \frac{A^2}{2}\,F[R_{XX}(\tau)\cos(\omega_c\tau)]

We know that F[R_{XX}(\tau)] = S_{XX}(\omega) and F[\cos(\omega_c\tau)] = \pi\delta(\omega - \omega_c) + \pi\delta(\omega + \omega_c). Using the frequency-convolution theorem, we have

S_{YY}(\omega) = \frac{A^2}{2}\cdot\frac{1}{2\pi}\,S_{XX}(\omega) * [\pi\delta(\omega - \omega_c) + \pi\delta(\omega + \omega_c)]

or

S_{YY}(\omega) = \frac{A^2}{4}\,[S_{XX}(\omega - \omega_c) + S_{XX}(\omega + \omega_c)]

Illustrative Example 5. Consider two WSS processes X(t) and Y(t) with zero means. Their sum is another random process, denoted as Z(t) = X(t) + Y(t). Determine the auto-correlation function and power spectral density of the random process Z(t) in terms of those of X(t) and Y(t).

Solution. The auto-correlation function of Z(t) may be determined as:

R_{ZZ}(\tau) = E[Z(t)\,Z(t + \tau)]
= E[\{X(t) + Y(t)\}\{X(t + \tau) + Y(t + \tau)\}]
= E[X(t)\,X(t + \tau)] + E[X(t)\,Y(t + \tau)] + E[Y(t)\,X(t + \tau)] + E[Y(t)\,Y(t + \tau)]

or

R_{ZZ}(\tau) = R_{xx}(\tau) + R_{xy}(\tau) + R_{yx}(\tau) + R_{yy}(\tau)

The power spectral density (PSD) of the random process Z(t) may be obtained by taking the Fourier transform of both sides of the auto-correlation function given above. Hence we have

S_{ZZ}(\omega) = S_{xx}(\omega) + S_{xy}(\omega) + S_{yx}(\omega) + S_{yy}(\omega)

The power spectral density of the sum of two WSS random processes is equal to the sum of the individual power spectral densities plus the sum of the cross power spectral densities S_{xy}(\omega) and S_{yx}(\omega). If the random processes X(t) and Y(t) are uncorrelated, then the cross power spectral densities S_{xy}(\omega) and S_{yx}(\omega) are zero. Now the PSD of Z(t) may be rewritten as

S_{ZZ}(\omega) = S_{xx}(\omega) + S_{yy}(\omega)

Hence, the power spectral density of the sum of two uncorrelated zero-mean WSS random processes is equal to the sum of their individual power spectral densities.

2.8 RESPONSE OF LINEAR SYSTEMS TO RANDOM PROCESS INPUTS

Suppose that a random process X(t) is applied as input to a linear time-invariant filter of impulse response h(t), producing a new random process Y(t) at the filter output, as in Fig. 2.2. Here, we shall determine the mean and the auto-correlation function of the output random process Y(t) in terms of the input random process X(t). We assume that X(t) is a wide-sense stationary (WSS) process.

"nput random +rocess X (t)

"mpulse 1esponse h (t)

Putput random +rocess Y (t)

!i(. ,.,

Transmission of a random &rocess t/ro)(/ a liner s*stem es&onse

<he transmission of a random process through a linear s#stem is governed b# the convolution integral, we ma# e'press the output random process Y ( t) in terms of input random process X (t) as( , Y (t) =

- h (N) X (t0 N) d N
0, ;here N is the integration variable.

..(5.%%)

Lean and auto0correlation of the output <he mean of the output random process is e'pressed as( , m# (t) =E Y (t) = E

- h (N) X (t0 N) d N
0,

, or m# (t) =

- h (N) E X (t0 N) d N
0,

, or m# (t) =

- h (N) m
0,

'

(t0 N) d N

..(5.%8)

;hen the input random process X(t) is a wide0 sense stationar# process, the mean m' (t) is a constant m'. <herefore, e:uation (5.%8) ma# be rewritten as( , m# = ,
'

- h (N) m
0,

dN = m'

0,

h (N) dN

or

m#

= m' 2 (9)

..(5.%D)

;here 2 (9) = Oero fre:uenc# (SE) 1esponse of the s#stem E:uation (5.%D) states that the mean of the random process Y( t) produced at the output of the linear s#stem in response of X (t)Z acting as the input process is e:ual to the mean of X ( t) multiplied b# the SE response of the s#stem. ;e shall now determine the auto0correlation function of the output random process Y ( t). Let t and be two values of time at which output process is observed. 3uto0correlation function of output random process is defined as 1 YY (t, ) = E Y (t) Y ( ) ...(5.%&)

We shall now determine the auto-correlation function of the output random process Y(t). Let t and u be two values of time at which the output process is observed. The auto-correlation function of the output random process is defined as

R_{YY}(t, u) = E[Y(t)\,Y(u)]    ...(2.58)

By using the convolution integral, equation (2.58) can be written as

R_{YY}(t, u) = E\left[\int_{-\infty}^{\infty} h(\tau_1)\,X(t - \tau_1)\,d\tau_1 \int_{-\infty}^{\infty} h(\tau_2)\,X(u - \tau_2)\,d\tau_2\right]    ...(2.59)

Here again, provided that the mean-square value E[X^2(t)] is finite for all t and the system is stable, we may interchange the order of expectation and the integrations with respect to \tau_1 and \tau_2 in equation (2.59):

R_{YY}(t, u) = \int_{-\infty}^{\infty} d\tau_1\,h(\tau_1) \int_{-\infty}^{\infty} d\tau_2\,h(\tau_2)\,E[X(t - \tau_1)\,X(u - \tau_2)]

or

R_{YY}(t, u) = \int_{-\infty}^{\infty} d\tau_1\,h(\tau_1) \int_{-\infty}^{\infty} d\tau_2\,h(\tau_2)\,R_{xx}(t - \tau_1, u - \tau_2)    ...(2.60)

When the input X(t) is a wide-sense stationary process, the auto-correlation function of X(t) is only a function of the difference between the observation times t - \tau_1 and u - \tau_2. Substituting \tau = t - u in equation (2.60), we have

R_{YY}(\tau) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\,h(\tau_2)\,R_{xx}(\tau - \tau_1 + \tau_2)\,d\tau_1\,d\tau_2    ...(2.61)

R_{YY}(\tau) is a function of the time lag \tau only. On combining the result of equation (2.61) with equation (2.57), we see that if the input to a stable linear time-invariant system is a WSS process, then the output of the linear system is also a WSS process.

Since R_{YY}(0) = E[Y^2(t)], it follows that the mean-square value of the output random process Y(t) is obtained by putting \tau = 0 in equation (2.61). We thus get the result

E[Y^2(t)] = R_{YY}(0) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\,h(\tau_2)\,R_{xx}(\tau_2 - \tau_1)\,d\tau_1\,d\tau_2    ...(2.62)

which is a constant.

Power Spectral Density of the Output

Taking the Fourier transform of both sides of equation (2.61), we obtain the power spectral density of the output:

S_{YY}(\omega) = \int_{-\infty}^{\infty} R_{YY}(\tau)\,e^{-j\omega\tau}\,d\tau
= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\,h(\tau_2)\,R_{xx}(\tau - \tau_1 + \tau_2)\,e^{-j\omega\tau}\,d\tau\,d\tau_1\,d\tau_2

or

S_{YY}(\omega) = |H(\omega)|^2\,S_{XX}(\omega)    ...(2.63)

Thus, we obtain the important result that the power spectral density of the output is the product of the power spectral density of the input and the magnitude squared of the frequency response of the system. When the auto-correlation of the output R_{YY}(\tau) is desired, it is easier to determine the power spectral density S_{YY}(\omega) first and then to evaluate the inverse Fourier transform. Thus

R_{YY}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{YY}(\omega)\,e^{j\omega\tau}\,d\omega

or

R_{YY}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} |H(\omega)|^2\,S_{XX}(\omega)\,e^{j\omega\tau}\,d\omega    ...(2.64)

Illustrative Example 6. A WSS random process X(t) has the auto-correlation function

R_{XX}(\tau) = A\,e^{-a|\tau|}

where A and a are real positive constants. It is passed through an LTI system with impulse response

h(t) = e^{-bt}\,u(t)

where b is a real positive constant. Find the auto-correlation function of the output Y(t) of the system.

Solution. The power spectral density of the output is given as

S_{YY}(\omega) = |H(\omega)|^2\,S_{XX}(\omega)

where S_{XX}(\omega) is the power spectral density of the input, expressed as

S_{XX}(\omega) = \int_{-\infty}^{\infty} R_{XX}(\tau)\,e^{-j\omega\tau}\,d\tau = \int_{-\infty}^{\infty} A\,e^{-a|\tau|}\,e^{-j\omega\tau}\,d\tau

= A\int_{-\infty}^{0} e^{(a - j\omega)\tau}\,d\tau + A\int_{0}^{\infty} e^{-(a + j\omega)\tau}\,d\tau

= A\left[\frac{1}{a - j\omega} + \frac{1}{a + j\omega}\right] = \frac{2aA}{a^2 + \omega^2}

and the frequency response H(\omega) of the system is

H(\omega) = \int_{-\infty}^{\infty} h(\tau)\,e^{-j\omega\tau}\,d\tau = \int_{0}^{\infty} e^{-(b + j\omega)\tau}\,d\tau = \frac{1}{b + j\omega}

So,

|H(\omega)|^2 = \left|\frac{1}{b + j\omega}\right|^2 = \frac{1}{b^2 + \omega^2}

Therefore, the PSD of the output Y(t) is

S_{YY}(\omega) = |H(\omega)|^2\,S_{XX}(\omega) = \frac{1}{b^2 + \omega^2}\cdot\frac{2aA}{a^2 + \omega^2}

or, by partial fractions,

S_{YY}(\omega) = \frac{2aA}{a^2 - b^2}\cdot\frac{1}{\omega^2 + b^2} - \frac{2aA}{a^2 - b^2}\cdot\frac{1}{\omega^2 + a^2}

Taking the inverse Fourier transform on both sides of the above equation (using the pair e^{-c|\tau|} \leftrightarrow 2c/(c^2 + \omega^2)), we obtain the auto-correlation function of the output Y(t) of the system:

R_{YY}(\tau) = \frac{aA}{b(a^2 - b^2)}\,e^{-b|\tau|} - \frac{A}{a^2 - b^2}\,e^{-a|\tau|}
2.9 SPECIAL CLASSES OF RANDOM PROCESSES

In this section, we consider an important random process that is commonly encountered in the study of communication systems.

2.9.1 Gaussian Process

Let us suppose that we observe a random process X(t) for an interval that starts at time t = 0 and lasts until t = T. Suppose also that we weight the random process X(t) by some function g(t) over this observation interval (0 < t < T). Thus, we obtain a random variable Y defined as

Y = \int_{0}^{T} g(t)\,X(t)\,dt    ...(2.65)

where Y is a linear functional of X(t). The value of the random variable Y depends on the course of the argument function g(t)X(t) over the entire observation interval from 0 to T. Therefore, a functional is a quantity that depends on the entire course of one or more functions, rather than on a number of discrete variables. If, in equation (2.65), the weighting function g(t) is such that the mean-square value of the random variable Y, i.e.,

E[Y^2] = E\left[\int_{0}^{T}\int_{0}^{T} g(t)\,g(u)\,X(t)\,X(u)\,dt\,du\right]    ...(2.66)

is finite, and if the random variable Y is a Gaussian-distributed random variable for every such function g(t), then the process X(t) is known as a Gaussian process. In other words, the process X(t) is a Gaussian process if every linear functional of X(t) is a Gaussian random variable.
