ITC Class 1
UNIT-1
1. Introduction
2. Elements of Digital Communication
3. Measure of information
4. Average Information Content (Entropy) in long independent sequences
INTRODUCTION:
The block diagram of an information system can be drawn as shown in the figure. The meaning of the word "information" in information theory is "message" or "intelligence". This message may be an electrical message such as voltage, current or power, a speech message, a picture message such as television, or a music message. A source which produces these messages is called an "information source".
Information is a measure of the unpredictability (surprise) and complexity of a message.
ELEMENTS OF DIGITAL COMMUNICATION
The main elements of a digital communication system are: information source, source encoder, channel encoder, modulator, channel, demodulator, channel decoder, source decoder and destination.
FATHER OF INFORMATION THEORY
Claude E. Shannon is regarded as the father of information theory.
MEASURE OF INFORMATION:
In order to know and compare the "information content" of various messages produced by an information source, a measure is necessary to quantify that information content. For this, let us consider an information source producing an independent sequence of symbols from the source alphabet S = {s1, s2, …, sq} with probabilities P = {p1, p2, …, pq} respectively.
Let sK be a symbol chosen for transmission at any instant of time with a probability equal to pK. Then the
"Amount of Information" or "Self-Information" of the message sK (provided it is correctly identified by the
receiver) is given by
IK = log(1/pK)
If the base of the logarithm is 2, then the units are called "BITS", which is the short form of "Binary Units".
If the base is 10, the units are "HARTLEYS" or "DECITS". If the base is e, the units are "NATS", and if the
base, in general, is r, the units are called "r-ary units".
The most widely used unit of information is the "BIT", where the base of the logarithm is 2. Throughout this
book, log to the base 2 is simply written as log, and the units can be assumed to be bits unless otherwise
specified.
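As a quick illustration of how the unit depends only on the base of the logarithm, the short Python sketch below evaluates the self-information of one assumed example probability in bits, nats and hartleys (the helper name self_information is illustrative, not from the text):

    import math

    def self_information(p, base=2):
        # Self-information I = log_base(1/p) of a symbol with probability p
        return math.log(1.0 / p, base)

    p = 0.5                                  # assumed example probability
    print(self_information(p, base=2))       # BITS: 1.0
    print(self_information(p, base=math.e))  # NATS: 0.693...
    print(self_information(p, base=10))      # HARTLEYS (DECITS): 0.301...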
JUST TO GET THE CONCEPT OF THE "AMOUNT OF INFORMATION", LET US CONSIDER THE FOLLOWING EXAMPLES:
1. The sun rises from the east: P = 1, I = 0. A universal truth that is already known conveys no information.
2. There will be an earthquake in Shankarghatta: P → 0, I → ∞. An almost impossible event conveys maximum information.
From the above examples, we can conclude that the message associated with the event least likely to occur
contains the most information. This statement can be confirmed with another numerical example.
Example 1.1: The binary symbols '0' and '1' are transmitted with probabilities 1/4 and 3/4 respectively. Find
the corresponding self-informations.
Solution:
I('0') = log2(1/P('0')) = log2(4) = 2 bits
I('1') = log2(1/P('1')) = log2(4/3) = 0.415 bits
The less probable symbol '0' carries more information than the more probable symbol '1'.
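A minimal Python check of Example 1.1 (the variable names are illustrative):

    import math

    # Probabilities of the binary symbols '0' and '1' from Example 1.1
    p0, p1 = 1/4, 3/4

    I0 = math.log2(1 / p0)   # self-information of '0'
    I1 = math.log2(1 / p1)   # self-information of '1'

    print("I('0') = %.3f bits" % I0)   # 2.000 bits
    print("I('1') = %.3f bits" % I1)   # 0.415 bits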
PROPERTIES OF INFORMATION:
The logarithmic expression is chosen for measuring information because of the following reasons:
1. The information content or self-information of any message cannot be negative. Each message
must contain a certain amount of information.
2. The lowest possible self-information is zero, which occurs for a sure event, since P(sure event) = 1.
3. If the information content is more, the probability of occurrence is less.
4. If the information content is less, the probability of occurrence is high.
5. If the receiver already knows the message being transmitted, the information it carries is zero.
PROPERTIES OF INFORMATION (continued):
6. If the source emits two independent messages m1 and m2 with information I1 and I2 respectively,
then the total information is equal to the sum of the individual informations.
Let I1 = log2(1/P(m1))
I2 = log2(1/P(m2))
Itotal = log2(1/P(m1m2))
= log2(1/(P(m1)P(m2)))   [since m1 and m2 are independent, P(m1m2) = P(m1)P(m2)]
= log2(1/P(m1)) + log2(1/P(m2))
Itotal = I1 + I2
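The additivity property can be checked numerically; the sketch below (Python, with assumed example probabilities) confirms that the information of the joint message m1m2 equals I1 + I2 when the messages are independent:

    import math

    # Assumed example probabilities of two independent messages m1 and m2
    p_m1, p_m2 = 0.5, 0.25

    I1 = math.log2(1 / p_m1)                 # information of m1: 1 bit
    I2 = math.log2(1 / p_m2)                 # information of m2: 2 bits
    I_joint = math.log2(1 / (p_m1 * p_m2))   # P(m1m2) = P(m1)P(m2) for independence

    print(I1 + I2, I_joint)                  # both print 3.0 (bits)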
ZERO-MEMORY SOURCE:
A zero-memory (memoryless) source emits each symbol independently of all previously emitted symbols. Suppose such a source emits a long sequence of L symbols. Then the sequence contains, on average, P1L number of messages of type S1, P2L number of messages of type S2, …, and PqL number of messages of type Sq.
From the equation of the measure of information, the self-information of S1 = I1 = log(1/P1) bits, and similarly for S2, …, Sq.
AVERAGE INFORMATION CONTENT (ENTROPY) OF
SYMBOLS IN LONG INDEPENDENT SEQUENCES
(continued)
Itotal = P1L log(1/P1) + P2L log(1/P2) + … + PqL log(1/Pq) bits
= L Σ Pi log(1/Pi) bits (summing over i = 1 to q).
Average self-information = Itotal / L = Σ Pi log(1/Pi) bits.
The average self-information is also called the ENTROPY of the source, and is given by
H(S) = Σ Pi log(1/Pi) bits/message symbol
Thus H(S) represents the "average uncertainty" or the "average amount of surprise" of the source.
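As a minimal sketch of the definition above (Python; the function name entropy and the example probabilities are assumptions for illustration), H(S) can be computed directly from the symbol probabilities:

    import math

    def entropy(probs):
        # H(S) = sum over i of Pi * log2(1/Pi), in bits per message symbol
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    # Assumed example source alphabet probabilities (they must sum to 1)
    P = [0.5, 0.25, 0.125, 0.125]
    print(entropy(P))   # 1.75 bits/message symbol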
INFORMATION RATE:
If the time rate at which the source X emits symbols is r (symbols per second), the information rate R of the source
is given by R = rH(X) bits/second,
or
the product of the average self-information content per symbol and the rate of symbols emitted by the source
at a fixed rate of rs symbols per second.
The average source information rate Rs in bits/second is
Rs = H(S) rs bits/second (bps)
Here R is the information rate, H(S) is the entropy or average information, and r is the rate at which symbols
are generated.
The information rate is expressed as the average number of bits of information per second. It is calculated
as
R = r (symbols/second) × H(X) (bits/symbol), i.e. R = information in bits/second.
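Putting the two quantities together, the information rate R = rs H(S) can be evaluated as in the sketch below (Python; the symbol rate and probabilities are assumed example values):

    import math

    def entropy(probs):
        # H(S) in bits per symbol
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    P = [0.5, 0.25, 0.25]    # assumed symbol probabilities
    rs = 1000                # assumed symbol rate in symbols per second

    H = entropy(P)           # 1.5 bits per symbol
    R = rs * H               # 1500.0 bits per second
    print("H(S) =", H, "bits/symbol;  R =", R, "bits/second")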