Generate Two Correlated Noise Signals
Lab II
Since E[X] = \mu_X is a constant vector,
\[
C_X = R_X - E[X]\mu_X^T - \mu_X E[X^T] + \mu_X \mu_X^T = R_X - \mu_X \mu_X^T.
\]
Example: Find the expected value E[X], the correlation matrix R_X, and the covariance matrix C_X of the 2-dimensional random vector X with PDF
\[
f_X(x) = \begin{cases} 2, & 0 \le x_1 \le x_2 \le 1, \\ 0, & \text{otherwise.} \end{cases}
\]
The components of the expected value are
\[
E[X_i] = \int_0^1 \int_0^{x_2} 2\, x_i \, dx_1 \, dx_2, \qquad i = 1, 2.
\]
The integrals are E[X_1] = 1/3 and E[X_2] = 2/3, so that \mu_X = E[X] = [1/3 \;\; 2/3]^T. The elements of the correlation matrix are
\[
E[X_1^2] = \int_0^1 \int_0^{x_2} 2\, x_1^2 \, dx_1 \, dx_2, \quad
E[X_2^2] = \int_0^1 \int_0^{x_2} 2\, x_2^2 \, dx_1 \, dx_2, \quad
E[X_1 X_2] = \int_0^1 \int_0^{x_2} 2\, x_1 x_2 \, dx_1 \, dx_2.
\]
These integrals are E[X_1^2] = 1/6, E[X_2^2] = 1/2, and E[X_1 X_2] = 1/4. Therefore,
\[
R_X = \begin{bmatrix} 1/6 & 1/4 \\ 1/4 & 1/2 \end{bmatrix}.
\]
Then, the covariance matrix can be calculated as
\[
C_X = R_X - \mu_X \mu_X^T
= \begin{bmatrix} 1/6 & 1/4 \\ 1/4 & 1/2 \end{bmatrix}
- \begin{bmatrix} 1/9 & 2/9 \\ 2/9 & 4/9 \end{bmatrix}
= \begin{bmatrix} 1/18 & 1/36 \\ 1/36 & 1/18 \end{bmatrix}.
\]
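As a quick numerical sanity check (not part of the original handout), one can also sample from this PDF in MATLAB by sorting two independent Uniform(0,1) variables, since the ordered pair is uniform on the triangle 0 <= x1 <= x2 <= 1 with density 2. The variable names below are illustrative.

% Monte Carlo check of the example above (a sketch, not from the original lab text)
N = 1e6;
U = rand(2,N);                 % two independent Uniform(0,1) samples per column
X = [min(U); max(U)];          % sorted pair is uniform on 0 <= x1 <= x2 <= 1 (density 2)
mu_hat = mean(X,2)             % should be close to [1/3; 2/3]
R_hat  = (X*X')/N              % should be close to [1/6 1/4; 1/4 1/2]
C_hat  = cov(X')               % should be close to [1/18 1/36; 1/36 1/18]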
MATLAB: Generate Gaussian random vectors X = [X_1 X_2 X_3]^T with mean zero and variances \sigma^2 = 1, 4, and 9 for X_1, X_2, X_3, respectively, and estimate the correlation matrix.

N=100000;
sigma1=1; sigma2=2; sigma3=3;          % standard deviations, so the variances are 1, 4, 9
X1=sigma1*randn(1,N);
X2=sigma2*randn(1,N);
X3=sigma3*randn(1,N);
R(1,1)=mean(X1.^2);  R(1,2)=mean(X1.*X2); R(1,3)=mean(X1.*X3);
R(2,1)=mean(X2.*X1); R(2,2)=mean(X2.^2);  R(2,3)=mean(X2.*X3);
R(3,1)=mean(X3.*X1); R(3,2)=mean(X3.*X2); R(3,3)=mean(X3.^2);
R

R =
    1.0073    0.0026   -0.0070
    0.0026    4.0102    0.0210
   -0.0070    0.0210    8.9729
As expected, since X_i, i = 1, 2, 3, are independent of each other, the cross-correlations between the components of X are near zero, while the diagonal entries are close to the variances of X_1, X_2, X_3. For the calculation of the covariance matrix, you may simply use the built-in function cov:

X=[X1' X2' X3'];   % N-by-3 data matrix, one observation per row
R=cov(X)

Since the components have zero mean, this estimate coincides (up to sampling error) with the correlation matrix above; its first row is again 1.0073, 0.0026, -0.0070.
Now, the question is how we can generate correlated random vectors. This requires a little background in linear algebra, described below.

Theorem. Given an n-dimensional standard normal random vector Z, an invertible n x n matrix A, and an n-dimensional vector b, X = AZ + b is an n-dimensional Gaussian random vector with expected value \mu_X = b and covariance matrix C_X = AA^T.

Theorem. For a Gaussian vector X with covariance C_X, there always exists a matrix A such that C_X = AA^T.

Proof: To verify this fact, we connect some simple facts:
- C_X is a positive semidefinite covariance matrix (i.e., every eigenvalue of C_X is nonnegative).
- The definition of the Gaussian vector PDF requires the existence of C_X^{-1}. Hence, for the Gaussian vector X, all eigenvalues of C_X are nonzero and, by the point above, positive.
- Since C_X is a real symmetric matrix, it has a singular value decomposition (equivalently, an eigendecomposition) C_X = UDU^T, where D = diag[d_1, ..., d_n] is the diagonal matrix of eigenvalues of C_X.
- Since each d_i is positive, we can define D^{1/2} = diag[\sqrt{d_1}, ..., \sqrt{d_n}], and we can write C_X = U D^{1/2} D^{1/2} U^T = (U D^{1/2})(U D^{1/2})^T.

We see that A = U D^{1/2}.

MATLAB: Generate Gaussian random vectors having a specified mean value mx and covariance Cx.

mx=[0 0];               % desired mean
Cx=[1 1/2; 1/2 1];      % desired covariance
[U D]=eig(Cx);          % U*D*U' reproduces Cx
A=U*sqrt(D);            % so that A*A' = Cx
Z=randn(2,N);           % Z=[z1; z2], where z1 and z2 are standard Gaussian random variables
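The listing above stops after drawing Z; the remaining steps (forming X = AZ + m_x and estimating the covariance of the result) are not shown in the text that survives here. A minimal sketch, reusing N from the earlier listing and with the name Cx_est being mine, could be:

X=A*Z + mx'*ones(1,N);   % each column is one correlated Gaussian vector with mean mx
Cx_est=cov(X')           % sample covariance of the N generated vectors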
As expected, the estimated covariance matrix is very close to the covariance matrix we specified.
Autocorrelation
In the previous section we generated correlated random vectors and estimated their covariance. In this section we generate a sample function of a stationary random process and estimate its autocorrelation and power spectrum. Generate a discrete-time sequence of N = 1000 i.i.d. uniformly distributed random numbers in the interval (-1/2, 1/2) and compute the estimate of the autocorrelation of the sequence {X_n}:
\[
\hat{\phi}_x(m) =
\begin{cases}
\dfrac{1}{N-m} \displaystyle\sum_{n=1}^{N-m} X_n X_{n+m}, & m = 0, 1, \ldots, M, \\[2ex]
\dfrac{1}{N-|m|} \displaystyle\sum_{n=|m|+1}^{N} X_n X_{n+m}, & m = -1, -2, \ldots, -M.
\end{cases}
\]
echo on
N=1000; M=50;
phi_av=zeros(1,M+1);        % accumulator for the ensemble average
for j=1:10,                 % take the ensemble average over ten realizations
    X=rand(1,N)-1/2;        % N i.i.d. uniform random variables between -1/2 and 1/2
    phi=phi_est(X,M);       % autocorrelation estimate of this realization
    phi_av=phi_av+phi;      % sum of the autocorrelation estimates
    echo off;
end;
echo on;
phi_av=phi_av/10;           % ensemble-average autocorrelation
plot([0:1:M],phi_av);
xlabel('m')
ylabel('\phi(m)')
title('Autocorrelation function')
function [phi]=phi_est(X,M)
% [phi]=phi_est(X,M)
% phi_est estimates the autocorrelation of the sequence of
% random variables given in X. Only phi(0), phi(1),...,phi(M)
% are computed.
% Note that phi(m) actually means phi(m-1).
N=length(X);
phi=zeros(1,M+1);
for m=1:M+1,
    for n=1:N-m+1,
        phi(m)=phi(m)+X(n)*X(n+m-1);
    end;
    phi(m)=phi(m)/(N-m+1);
end;
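The text above also calls for a power spectrum estimate, but no listing for it appears here. A minimal sketch, assuming the ensemble-average autocorrelation phi_av from the listing above (the names phi_sym, Nfft, Sx, and f below are mine), is:

% Power spectrum estimate from the ensemble-average autocorrelation (sketch)
phi_sym=[fliplr(phi_av(2:end)) phi_av];   % extend to lags -M..M using phi(-m)=phi(m)
Nfft=256;
Sx=abs(fft(phi_sym,Nfft));                % DFT magnitude as a power spectrum estimate
f=(0:Nfft-1)/Nfft;                        % normalized frequency
plot(f,Sx); xlabel('f'); ylabel('S_x(f)'); title('Power spectrum estimate')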
Lab Homework 2
1. Consider the Gaussian random vector X = [X_1, X_2, X_3]^T with mean E[X] = [0 1 2]^T and variances [1, exp(-\alpha), exp(-2\alpha)], where \alpha is a parameter set to 0.5 for this problem. The covariance matrix of X is given as
\[
C_X = \begin{bmatrix} 1 & 0.6 & 0.2 \\ 0.6 & \exp(-\alpha) & 0.2 \\ 0.2 & 0.2 & \exp(-2\alpha) \end{bmatrix}. \tag{1}
\]
Generate 100000 samples of X, estimate the covariance matrix, and compare your estimate with (1).