
PROBABILITY AND STOCHASTIC PROCESSES

CHAPTER 4: INTRODUCTION TO STOCHASTIC PROCESSES

RECITATION 4.
PROPOSED BY PROF. L. DEBBI

In the exercises below, we work in a given probability space (Ω, F, P).


Exercise 0.1 ([6]). A discrete-time random process is defined by Xn = s^n, n ≥ 0, where
s ∼ U((1, 2)).

(1) Sketch some sample paths of the process (a short simulation sketch follows this exercise),


(2) Find the cumulative distribution function of Xn,
(3) Find the mean and the autocovariance functions of Xn,
(4) Find the joint cumulative distribution function of Xn and Xn+1.
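A minimal Python sketch of item (1), assuming NumPy and Matplotlib are available; the range of n, the number of paths, and the random seed are illustrative choices only.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = np.arange(0, 11)                # time indices n = 0, ..., 10 (illustrative)
for _ in range(3):                  # three sample paths
    s = rng.uniform(1.0, 2.0)       # one draw of s per realization
    plt.step(n, s ** n, where="post", label=f"s = {s:.2f}")
plt.xlabel("n")
plt.ylabel("X_n = s^n")
plt.legend()
plt.show()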
Exercise 0.2 ([6]). Let Y(t) = g(t − T), where T ∼ Exp(α) and g is the 1-periodic
triangular waveform, i.e. g(t) = t for t ∈ [0, 1[, extended periodically with period 1.

• Sketch the graph of g,


• Sketch some trajectories of the process (a simulation sketch follows this exercise),
• Find the probability density of Y (t),
• Find the joint probability density function of Y(t), Y(t + d) (consider two cases: d > 1 and
0 < d < 1),
• Find the mean mY(t) and the covariance CY(t, t + d) (consider two cases: d > 1
and 0 < d < 1).
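A minimal Python sketch of the first two items, assuming NumPy and Matplotlib; the rate α = 1 and the plotting window are illustrative choices, not part of the exercise.

import numpy as np
import matplotlib.pyplot as plt

def g(t):
    return np.mod(t, 1.0)              # g(t) = t on [0, 1), extended with period 1

rng = np.random.default_rng(1)
alpha = 1.0                            # illustrative rate for T ~ Exp(alpha)
t = np.linspace(0.0, 4.0, 801)

plt.plot(t, g(t), "k--", label="g(t)")
for _ in range(3):
    T = rng.exponential(1.0 / alpha)   # NumPy parametrizes Exp by the scale 1/alpha
    plt.plot(t, g(t - T), label=f"T = {T:.2f}")
plt.xlabel("t")
plt.legend()
plt.show()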
Exercise 0.3 ([5]). Let X := {X(t) = Z sin(t + Θ), t ∈ R}, where Z and Θ are independent
random variables such that Z ∼ U((−1, +1)) and P(Θ = ±π/4) = 1/2.

• Sketch some trajectories of the process (a simulation sketch follows this exercise),


• Show that X is wide-sense stationary but not even first-order stationary.
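A minimal Python sketch of the first item, assuming NumPy and Matplotlib; the time window, number of trajectories, and seed are arbitrary.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
t = np.linspace(0.0, 4.0 * np.pi, 400)
for _ in range(3):
    Z = rng.uniform(-1.0, 1.0)
    Theta = rng.choice([-np.pi / 4.0, np.pi / 4.0])   # each value with probability 1/2
    plt.plot(t, Z * np.sin(t + Theta), label=f"Z = {Z:.2f}, Theta = {Theta:+.2f}")
plt.xlabel("t")
plt.ylabel("X(t)")
plt.legend()
plt.show()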
Exercise 0.4 ([5]). Let X1 := {X1(t), t ∈ R} and X2 := {X2(t), t ∈ R} be two independent
wide-sense stationary random processes. We define the processes Y := {Y(t), t ∈ R} and
Z := {Z(t), t ∈ R} by:
Y(t) := aX1(t) + bX2(t),    Z(t) := X1(t) · X2(t).
Prove that Y(t) and Z(t) are wide-sense stationary and find their correlation functions. Is the
independence assumption necessary?
Exercise 0.5 ([3]). Let X := {X(t), t ≥ 0} be a process with stationary independent increments
such that X(0) = 0. Show that

• E(X(t)) = tE(X(1)),
• Cov(X(t), X(s)) = min(t, s) Var(X(1)).
Exercise 0.6 ([6]). Noise impulses occur in a radio transmission according to a Poisson
process with rate λ.

• Find the probability that no impulses occur during the transmission of a message that
is t seconds long.
• Suppose that the message is encoded so that errors caused by up to 2 impulses can
be corrected. What is the probability that a t-second message cannot be corrected?
Exercise 0.7. • Let X := (X(t), t ∈ R) be a Gaussian process. Prove that X is
stationary iff X is second-order stationary.
• Prove that a Wiener process is Gaussian.
Exercise 0.8 (Partially from [6]). A modem transmits a binary i.i.d. equiprobable data sequence as
follows:

• To transmit a binary 1, the modem transmits a rectangular pulse of duration T seconds
and amplitude 1,
• To transmit a binary 0, the modem transmits a rectangular pulse of duration T seconds
and amplitude −1.

Let X(t) be the resulting process.

(1) Sketch 3 different sample paths of the process X(t) and specify the corresponding
data sequence for each trajectory (a simulation sketch follows this exercise),
(2) Show that X(t) can be represented as the sum of amplitude-modulated time shifted
rectangular pulses:
X(t) = ∑_{n=−∞}^{+∞} An p(t − nT),

where p is the rectangular pulse of duration T and (An)n is a sequence of independent
symmetric Bernoulli random variables taking values in {−1, 1}.
(3) Compute E(X(t)) and the autocovariance function C(t1, t2),
(4) Prove that ∀ m ∈ N0, C(t1 + mT, t2 + mT) = C(t1, t2), and sketch a geometric
representation of the function C (such a process is called wide-sense cyclostationary),
(5) Is the process X wide-sense stationary?
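A minimal Python sketch of item (1), assuming NumPy and Matplotlib; the pulse duration T = 1, the number of bits, and the vertical offsets used to separate the traces are illustrative choices.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
T = 1.0                                    # illustrative pulse duration
n_bits = 8                                 # illustrative number of transmitted bits
t = np.linspace(0.0, n_bits * T, 800, endpoint=False)
for k in range(3):
    A = rng.choice([-1, 1], size=n_bits)   # data sequence: +1 encodes "1", -1 encodes "0"
    X = A[(t // T).astype(int)]            # piecewise-constant amplitude on each slot
    plt.step(t, X + 3 * k, where="post")   # vertical offset 3k only separates the traces
plt.xlabel("t")
plt.yticks([])
plt.show()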
Exercise 0.9 (Statistics application). An experiment repeated independently n times gives the
samples X1, X2, . . . , Xn. We know that the experiment is modeled by a random variable
X ∼ N(m, σ). The aim is to estimate m and σ. The following quantity has been
proposed to approximate m:
m̂ = (1/n) ∑_{k=1}^{n} Xk.

• Check numerically (using a computer) that m̂ gives an estimate of m (take
n = 100; a numerical sketch follows this exercise),
• Justify the proposal analytically,
• Compute the expectation of m̂,
• What could you propose to obtain an estimate of σ?
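A minimal numerical sketch of the first item in Python (assuming NumPy); the true values m = 2 and σ = 0.5 are arbitrary test inputs, not part of the exercise.

import numpy as np

rng = np.random.default_rng(4)
m, sigma, n = 2.0, 0.5, 100                # arbitrary test values, n = 100 as requested
X = rng.normal(loc=m, scale=sigma, size=n)
m_hat = X.mean()                           # sample mean (1/n) * sum_k X_k
print(f"m = {m},  m_hat = {m_hat:.4f}")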

Exercise 0.10 (Partially from [6]). Consider a linear combination of two sinusoids:

X(t) = A1 cos(αt + Θ1) + A2 cos(2αt + Θ2),
where A1, A2 are jointly Gaussian random variables and Θ1, Θ2 are independent uniform
random variables on the interval (0, 2π). Assume that the amplitudes are independent of
the phase random variables.

• Sketch three different samples of X(t) (a simulation sketch follows this exercise),


• Find the mean and the autocorrelation function of X(t),
• Is X(t) wide-sense stationary?
• Find the joint probability density of X(t1) and X(t2). What can you deduce?
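A minimal Python sketch of the first item, assuming NumPy and Matplotlib; the amplitudes are drawn here as independent standard normals purely for illustration (the exercise only requires them to be jointly Gaussian and independent of the phases), and α = 1 is an arbitrary choice.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(5)
alpha = 1.0                                  # illustrative frequency
t = np.linspace(0.0, 4.0 * np.pi, 600)
for _ in range(3):
    A1, A2 = rng.normal(size=2)              # illustrative jointly Gaussian amplitudes
    Th1, Th2 = rng.uniform(0.0, 2.0 * np.pi, size=2)
    plt.plot(t, A1 * np.cos(alpha * t + Th1) + A2 * np.cos(2.0 * alpha * t + Th2))
plt.xlabel("t")
plt.ylabel("X(t)")
plt.show()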

Exercise 0.11. Let (ti)i be a finite uniform partition of the interval [0, 2] with mesh
equal to 0.1.

(1) Generate three simulations (realizations) of the Wiener process (W(t), t ∈ [0, 2])
using the partition (ti)i (a simulation sketch follows this exercise),
(2) Generate, using the uniform law, a random partition of 20 different points in [0, 2]
and use them to create a piecewise-affine simulation of a process (X(t), t ∈ [0, 2]) which has
independent stationary increments following the standard Gaussian law,
(3) Use (2) to generate two other simulations,
(4) Do you have any deductions, comments, or remarks?
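A minimal Python sketch of items (1) and (2), assuming NumPy and Matplotlib; the increments are drawn as N(0, Δt), which is one reading of "stationary independent Gaussian increments" consistent with the Wiener scaling, and the seed is arbitrary.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(6)

# (1) Three Wiener realizations on t_i = 0, 0.1, ..., 2: increments ~ N(0, t_{i+1} - t_i).
t = np.arange(0.0, 2.0 + 1e-9, 0.1)
for _ in range(3):
    dW = rng.normal(0.0, np.sqrt(np.diff(t)))
    W = np.concatenate(([0.0], np.cumsum(dW)))     # W(0) = 0
    plt.plot(t, W)

# (2) One path on a random partition of 20 uniform points, joined by straight lines.
s = np.sort(np.concatenate(([0.0], rng.uniform(0.0, 2.0, size=20))))
dX = rng.normal(0.0, np.sqrt(np.diff(s)))          # Gaussian stationary independent increments
X = np.concatenate(([0.0], np.cumsum(dX)))
plt.plot(s, X, "k--")                              # plt.plot already draws the affine interpolation
plt.xlabel("t")
plt.show()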

Exercise 0.12 (Partially from [6]). Let Xn and Yn be independent random processes (sequences).
A multiplexer combines these two sequences into a combined sequence Uk, that
is,
U2n = Xn, U2n+1 = Yn.

(1) Write out the sequence Uk more explicitly (a small interleaving sketch follows this exercise),


(2) Assume that Xn and Yn are independent Bernoulli random processes.
• Compute the main first- and second-order characteristics of Uk,
• Under what conditions is Uk a stationary process?
(3) Repeat (2) if Xn and Yn are independent stationary random processes,
(4) Assume that Xn and Yn are wide-sense stationary processes. Is Un a wide-sense
stationary process? Find the mean and the autocovariance functions of Un.
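A minimal Python illustration of the interleaving in item (1), assuming NumPy; the Bernoulli parameter 1/2 and the sequence length are arbitrary.

import numpy as np

rng = np.random.default_rng(7)
N = 10
X = rng.integers(0, 2, size=N)        # X_n ~ Bernoulli(1/2)
Y = rng.integers(0, 2, size=N)        # Y_n ~ Bernoulli(1/2)
U = np.empty(2 * N, dtype=int)
U[0::2] = X                           # U_{2n}   = X_n
U[1::2] = Y                           # U_{2n+1} = Y_n
print(U)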

Exercise 0.13 ([6]). We consider a linear system described by the following differential
equation:
X′(t) + αX(t) = Z(t), t > 0, X(0) = 0.

The solution X(t) may represent the voltage across the capacitor of an RC circuit with
current input Z(t). One way to make this system random is to consider a random input,
so let us assume that Z(t) ∼ N(mZ(t), σZ(t)) (a small numerical sketch follows this exercise).

• Deduce that X(t) is Gaussian,


• Find the corresponding differential equations and the initial conditions for mX (t) :=
E(X(t)), RZX (t1 , t2 ) := E(Z(t1 )X(t2 )) (with respect to t2 ), RX (t1 , t2 ) := E(X(t1 )X(t2 ))
(with respect to t1 ).
• Find the mean and the autocorrelation functions of X(t).
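A minimal Python sketch of a possible numerical companion to this exercise: an Euler discretization of the system with an illustrative Gaussian input (mZ = 0, σZ = 1); the exercise itself is analytical, so the step size and these parameter values are assumptions.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(8)
alpha, dt, n_steps = 1.0, 0.01, 500
t = np.arange(n_steps + 1) * dt
X = np.zeros(n_steps + 1)                         # X(0) = 0
Z = rng.normal(0.0, 1.0, size=n_steps)            # one Gaussian input value per time step
for k in range(n_steps):
    X[k + 1] = X[k] + dt * (Z[k] - alpha * X[k])  # Euler step for X' = Z - alpha * X
plt.plot(t, X)
plt.xlabel("t")
plt.ylabel("X(t)")
plt.show()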
Exercise 0.14. Let W := {Wt, t ≥ 0} be a Wiener process.

• Recall the definition of W ,


• Compute the mean mW and the autocorrelation function RW ,
• Prove that RW has no partial derivatives,
• Use the Dirac measure to give a meaning to these partial derivatives,
• We denote by Ẇ(t) the Gaussian process (in the generalized sense) that has zero mean and
autocorrelation function RẆ(t1, t2) := ∂²RW(t1, t2)/(∂t1∂t2). We call Ẇ(t) the Gaussian
white noise. Find the integral of the autocorrelation function of the white noise
Ẇ(t),
• Deduce at least one feature of Ẇ(t) that makes it a good candidate for certain modeling purposes.

References
[1] Hogg R. V., McKean J. W., and Craig A. T. Introduction to Mathematical Statistics. Pearson, 2019.
[2] Grimmett G. and Stirzaker D. Probability and Random Processes. Fourth Edition, Oxford University
Press, 2020.
[3] Hsu H. Probability, Random Variables, and Random Processes. Schaum's Outlines, 2011.
[4] Krishnan V. Nonlinear Filtering and Smoothing. John Wiley & Sons, 1984.
[5] Larson H. J. and Shubert B. O. Probabilistic Models in Engineering Sciences, Vols. 1 and 2. John Wiley &
Sons, 1979.
[6] Leon-Garcia A. Probability, Statistics, and Random Processes for Electrical Engineering. 3rd Edition,
Pearson, 2008.
[7] Montgomery D. C. and Runger G. C. Applied Statistics and Probability for Engineers. Wiley, 2014.
