
JSS MAHAVIDYAPEETHA

JSS ACADEMY OF TECHNICAL EDUCATION, NOIDA


Approved by All India Council for Technical Education (AICTE), New Delhi.
UG programs are Accredited by National Board of Accreditation (NBA), New Delhi.
Affiliated to Dr APJ Abdul Kalam Technical University, Uttar Pradesh, Lucknow.
www.jssaten.ac.in

ELECTRONICS & COMMUNICATION ENGINEERING


ASSIGNMENT – 5.1 (Unit-5)

Faculty : Dr. A.K. Ahuja            Semester : I
Subject : Digital Communications    Subject Code : KEC-601
AY : 2022-23
C316.1 CO1 Explain basic statistical analysis of random signals and processes in communication theory.
C316.2 CO2 Illustrate data formatting and demonstrate the concepts involved in digital communication.
C316.3 CO3 Explain various digital carrier systems.
C316.4 CO4 Analyse and compare the performance of the matched filter with different modulation schemes and explain the different spectrum sharing techniques.
C316.5 CO5 Assess information theory and different channel error-correcting and detecting methods.

Q. No.   QUESTIONS   BL
1. A high-resolution black & white picture consists of about 2×10^6 picture elements and 16 different brightness levels. Pictures are repeated at the rate of 32 per second. All picture elements are assumed to be independent, and all levels are equally likely. Calculate the average rate of information conveyed by this TV picture source. L3
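A quick numerical sketch of this calculation (not part of the original sheet; variable names are mine): each of the 16 equiprobable levels carries log2(16) = 4 bits per element.

```python
import math

elements = 2 * 10**6       # picture elements per frame
levels = 16                # equiprobable brightness levels
frames_per_sec = 32

bits_per_element = math.log2(levels)           # 4 bits
bits_per_frame = elements * bits_per_element   # 8e6 bits
rate = frames_per_sec * bits_per_frame         # 2.56e8 bits/s
print(rate)
```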
2. A zero-memory source emits six messages with probabilities 0.3, 0.25, 0.15, 0.12, 0.10, 0.08. Find the 4-ary (quaternary) Huffman code. Determine its average word length, efficiency and redundancy. L3
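A solution sketch (mine, not from the sheet) for the r-ary Huffman construction: dummy zero-probability symbols are padded in so every merge combines exactly r nodes, then code-word lengths fall out of the merge history.

```python
import heapq
import itertools
import math

def huffman_lengths(probs, r=4):
    """Code-word lengths of an r-ary Huffman code.

    Dummy zero-probability symbols are added so that every merge
    combines exactly r nodes, as the r-ary construction requires."""
    n = len(probs)
    pad = (-(n - 1)) % (r - 1)
    counter = itertools.count()          # tie-breaker for equal probabilities
    heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
    heap += [(0.0, next(counter), []) for _ in range(pad)]
    heapq.heapify(heap)
    lengths = [0] * n
    while len(heap) > 1:
        total, members = 0.0, []
        for _ in range(r):
            p, _, ids = heapq.heappop(heap)
            total += p
            members += ids
        for i in members:                # every merge adds one digit
            lengths[i] += 1
        heapq.heappush(heap, (total, next(counter), members))
    return lengths

probs = [0.3, 0.25, 0.15, 0.12, 0.10, 0.08]
lengths = huffman_lengths(probs)                          # [1, 1, 1, 2, 2, 2]
avg_len = sum(p * l for p, l in zip(probs, lengths))      # 1.3 quaternary digits
H4 = -sum(p * math.log(p, 4) for p in probs)              # entropy in 4-ary units
efficiency = H4 / avg_len                                 # ≈ 0.93
print(lengths, avg_len, efficiency)
```

Redundancy follows as 1 − efficiency.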

3. A zero-memory source emits six messages with probabilities 0.3, 0.25, 0.15, 0.12, 0.10, 0.08. Find the 4-ary (quaternary) Shannon-Fano code. Determine its average word length, efficiency and redundancy. L3

4. An information source produces eight symbols with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/128, 1/256 respectively. These symbols are encoded as 000, 001, 010, 011, 100, 110 and 111 respectively. L3

(i) What is the amount of information per symbol?
(ii) What is the probability of occurrence of 0 and 1?
(iii) What is the efficiency of the obtained code?
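A sketch for parts (i) and (iii) (mine; part (ii) depends on the exact symbol-to-codeword assignment, which the sheet lists only partially): the information per symbol is the source entropy, and the efficiency compares it with the fixed 3-bit code length.

```python
import math

probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/128, 1/256]

# (i) average information per symbol (source entropy)
H = sum(p * math.log2(1 / p) for p in probs)   # ≈ 1.961 bits/symbol

# (iii) efficiency of a fixed-length 3-bit code
code_len = 3
efficiency = H / code_len                      # ≈ 0.654
print(H, efficiency)
```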

5. What is mutual information, and how is it related to channel capacity? What is Shannon's channel capacity for a noisy channel? For a standard voice-band communication channel, the signal-to-noise ratio is 30 dB and the transmission bandwidth is 3 kHz. What will be the Shannon limit for information in bits/sec? L3
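A numerical sketch of the capacity calculation (mine), using C = B log2(1 + S/N) with the SNR converted from dB to a linear ratio:

```python
import math

snr_db = 30
snr = 10 ** (snr_db / 10)            # 30 dB -> 1000 (linear power ratio)
bandwidth = 3000                     # 3 kHz in Hz

C = bandwidth * math.log2(1 + snr)   # ≈ 29 902 bits/s
print(C)
```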
6. For the following generator matrix, i) find all possible (6,3) block code words, ii) draw the syndrome table, iii) show that the code can detect and correct a single error. L3
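The generator matrix figure did not survive the scan, so the sketch below uses a hypothetical systematic (6,3) matrix G = [I | P] purely to illustrate the method: enumerate all 2^3 codewords and check the minimum distance (for a linear code, the minimum non-zero codeword weight).

```python
import itertools

# Hypothetical generator matrix (the original figure is missing):
G = [[1, 0, 0, 1, 0, 1],
     [0, 1, 0, 0, 1, 1],
     [0, 0, 1, 1, 1, 0]]

def encode(msg, G):
    """Multiply a message row-vector by G over GF(2)."""
    return tuple(sum(m * row[j] for m, row in zip(msg, G)) % 2
                 for j in range(len(G[0])))

codewords = [encode(m, G) for m in itertools.product((0, 1), repeat=3)]
d_min = min(sum(c) for c in codewords if any(c))   # minimum Hamming weight
t = (d_min - 1) // 2                               # correctable errors
print(len(codewords), d_min, t)
```

With this G the minimum distance is 3, so the code detects two errors and corrects one.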

7. A rate-1/3 convolutional encoder has generator sequences g1 = 100, g2 = 111, g3 = 101. L3
(i) Sketch the encoder.
(ii) Draw the code tree and trellis.
(iii) If the input is 10110, determine the output.
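Part (iii) can be checked mechanically with a shift-register simulation (a sketch of mine; the register is not flushed with tail zeros here, which some texts additionally require):

```python
def conv_encode(bits, gens=((1, 0, 0), (1, 1, 1), (1, 0, 1))):
    """Rate-1/3 convolutional encoder, constraint length 3.

    The register holds [u_k, u_{k-1}, u_{k-2}]; each generator taps
    the register and the three output bits are interleaved."""
    state = [0, 0]                     # the two previous input bits
    out = []
    for u in bits:
        reg = [u] + state
        for g in gens:
            out.append(sum(a * b for a, b in zip(g, reg)) % 2)
        state = [u, state[0]]
    return out

encoded = conv_encode([1, 0, 1, 1, 0])
print(''.join(map(str, encoded)))      # 111 010 100 101 001
```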

8. Consider a (7,4) cyclic code with generator polynomial g(x) = x^3 + x^2 + 1. i) Find the code vector for d = 1101. ii) If e = 0100000, show that the code can detect and correct the single error. L3
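A sketch of the systematic encoding and the syndrome check (mine), representing polynomials as integer bit vectors:

```python
def gf2_mod(dividend, divisor):
    """Remainder of GF(2) polynomial division (ints as bit vectors)."""
    dlen = divisor.bit_length()
    while dividend.bit_length() >= dlen:
        dividend ^= divisor << (dividend.bit_length() - dlen)
    return dividend

g = 0b1101                     # g(x) = x^3 + x^2 + 1
d = 0b1101                     # message d = 1101, i.e. d(x) = x^3 + x^2 + 1

# systematic encoding: c(x) = x^3 d(x) + [x^3 d(x) mod g(x)]
parity = gf2_mod(d << 3, g)
codeword = (d << 3) | parity   # here d(x) = g(x), so the parity is 000

# every single-bit error pattern yields a distinct non-zero syndrome,
# so any single error can be both detected and located (hence corrected)
syndromes = {gf2_mod(1 << i, g) for i in range(7)}
print(bin(codeword), sorted(syndromes))
```

In particular e = 0100000 (e(x) = x^5) gives the syndrome x + 1 = 011.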

9. What is the Hamming bound? Show that using a (7,4) cyclic code, only 1 error can be corrected. L2
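The Hamming bound 2^(n−k) ≥ Σᵢ₌₀ᵗ C(n,i) can be checked directly (a sketch of mine): for (7,4) there are only 8 syndrome patterns, enough for t = 1 (1 + 7 = 8) but not t = 2 (1 + 7 + 21 = 29).

```python
from math import comb

n, k = 7, 4
syndrome_count = 2 ** (n - k)          # 8 distinct syndromes

def hamming_bound_holds(t):
    """True if the code could correct all error patterns of weight <= t."""
    return syndrome_count >= sum(comb(n, i) for i in range(t + 1))

print(hamming_bound_holds(1), hamming_bound_holds(2))   # True False
```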

10. Consider the channel shown in Fig. Find: L3
(i) the channel matrix,
(ii) all possible entropies: H(X), H(Y), H(X,Y), H(X/Y), H(Y/X).
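The channel figure is missing from the scan, so this sketch (mine) uses a hypothetical joint probability matrix purely to illustrate how the requested quantities follow from the joint distribution and the chain rule:

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Hypothetical joint matrix P(x, y); rows index X, columns index Y.
P = [[0.3, 0.1],
     [0.1, 0.5]]

px = [sum(row) for row in P]                       # marginal of X
py = [sum(col) for col in zip(*P)]                 # marginal of Y
channel = [[p / sum(row) for p in row] for row in P]   # channel matrix P(y|x)

Hx, Hy = entropy(px), entropy(py)
Hxy = entropy([p for row in P for p in row])       # joint entropy H(X,Y)
Hx_given_y = Hxy - Hy                              # chain rule
Hy_given_x = Hxy - Hx
print(Hx, Hy, Hxy, Hx_given_y, Hy_given_x)
```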
