ELEC3028 (EL334) Digital Transmission: Building 53
S Chen
[Figure: data-rate evolution of wireless and wireline access systems (GSM, GSM/HSCSD, GSM/GPRS, GSM/EDGE, ISDN, MBS), spanning 9.6 kbit/s, 64 kbit/s, 144 kbit/s, 384 kbit/s, 2 Mbit/s and 20 Mbit/s]
Some improved 2G systems: HSCSD (high-speed circuit switched data), GPRS (general packet radio service), EDGE (enhanced data rates for GSM evolution). Also, HIPERLAN: high performance radio local area network.
[Figure: input → channel → output block diagram]
The transmission is based on digital data, which is obtained from (generally) analogue quantities by:
1. sampling (Nyquist: sample at no less than twice the maximum signal frequency), and
2. quantisation (introduction of quantisation noise through rounding off).
Transmitting at a certain rate requires a certain spectral bandwidth. Here the channel means the whole system, which has a certain capacity: the maximum rate at which information can be transmitted through the system reliably.
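As a rough sketch of these two steps (not part of the lecture notes; the 1 kHz sine wave, the sampling rate of 2.5 times the maximum frequency and the 8-level quantiser are all assumptions made just for illustration), in Python:

    import numpy as np

    f_max = 1000.0                      # assumed highest signal frequency (Hz)
    fs = 2.5 * f_max                    # sample above the Nyquist rate of 2*f_max
    t = np.arange(0.0, 0.01, 1.0 / fs)  # 10 ms of sampling instants
    x = np.sin(2 * np.pi * f_max * t)   # "analogue" signal evaluated at the sampling instants

    levels = 8                          # 8-level (3-bit) uniform quantiser
    step = 2.0 / levels                 # quantisation step for a signal in [-1, 1]
    xq = np.round(x / step) * step      # rounding off introduces quantisation noise
    print(np.max(np.abs(x - xq)))       # quantisation error is bounded by step/2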
Input/output are considered digital (analogue signals are sampled and quantised). Building blocks: CODEC, MODEM and channel (transmission medium). Your 3G mobile phone, for example, contains a transmitter and receiver pair (together called a transceiver), consisting of a CODEC and a MODEM.
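Purely as a structural sketch of how these blocks compose (the placeholder functions below are invented for illustration and are not the actual 3G processing chain):

    def source_encode(bits):   # CODEC, transmit side: identity placeholder
        return bits

    def modulate(bits):        # MODEM, transmit side: map bits to +/-1 levels
        return [1.0 if b else -1.0 for b in bits]

    def channel(symbols):      # transmission medium: here an ideal, noiseless pass-through
        return symbols

    def demodulate(symbols):   # MODEM, receive side: threshold decision
        return [1 if s > 0 else 0 for s in symbols]

    def source_decode(bits):   # CODEC, receive side: identity placeholder
        return bits

    data = [1, 0, 1, 1, 0]
    out = source_decode(demodulate(channel(modulate(source_encode(data)))))
    assert out == data         # with an ideal channel the output equals the input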
What is Information
Generic question: what is information, and how do we measure it (in what unit)? A generic digital source is characterised by:
- Source alphabet (message or symbol set): m1, m2, ..., mq
- Probability of occurrence (symbol probabilities): p1, p2, ..., pq; e.g. a binary equiprobable source has m1 = 0 and m2 = 1 with p1 = p2 = 0.5
- Symbol rate (symbols/s or Hz)
- Probabilistic interdependence of symbols (correlation of symbols, e.g. does mi tell us nothing about mj, or something?)
At a specific symbol interval, symbol mi is transmitted correctly to the receiver. What is the amount of information conveyed from transmitter to receiver? The answer:
I(mi) = log2(1/pi) = -log2 pi  (bits)
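For instance, a short Python check of this formula for the equiprobable binary source above and for a rarer symbol (p = 0.1, chosen arbitrarily):

    from math import log2

    def information(p):
        return log2(1.0 / p)   # I(mi) = log2(1/pi) = -log2(pi), in bits

    print(information(0.5))    # 1.0 bit: equiprobable binary symbol
    print(information(0.1))    # about 3.32 bits: a rarer symbol conveys more information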
Concept of Information
Forecast: rain tomorrow in three different places:
1. the rainy season in a tropical forest
2. somewhere in England
3. a desert where it rarely rains
The information content of an event is connected with its uncertainty, or the inverse of its probability: the more unexpected (the smaller the probability) the event is, the more information it contains. Information theory (largely due to Shannon) provides:
- a measure of information
- the information capacity of a channel
- coding as a means of utilising channel capacity
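To attach illustrative numbers to the three forecasts (the probabilities below are invented for the example, not taken from the lecture):

    from math import log2

    forecasts = {"tropical forest": 0.9, "England": 0.5, "desert": 0.01}  # assumed rain probabilities
    for place, p in forecasts.items():
        print(place, round(log2(1.0 / p), 2), "bits")
    # tropical forest: 0.15 bits, England: 1.0 bit, desert: 6.64 bits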
Shannon Limit
We know different communication system designs achieve different performance levels, and we also know system performance is always limited by the available signal power, the inevitable noise and the need to limit bandwidth. What is the ultimate performance limit of communication systems, constrained only by their fundamental physical nature? Shannon's information theory addresses this question.
Shannon's theorem: if the rate of information from a source does not exceed the capacity of a communication channel, then there exists a coding technique such that the information can be transmitted over the channel with an arbitrarily small probability of error, despite the presence of noise.
In 1992, two French electronics professors developed practical turbo coding, which approaches the Shannon limit (transmitting information at close to the capacity rate).
Information Content
Source with independent symbols m1, m2, ..., mq and probabilities of occurrence p1, p2, ..., pq.
Definition of information: the amount of information in the ith symbol mi is defined by
I(mi) = log2(1/pi) = -log2 pi  (bits)
Note the unit of information: bits!
Properties of information:
- Since the probability satisfies 0 ≤ pi ≤ 1, I(mi) ≥ 0: information is non-negative
- If pi > pj, then I(mi) < I(mj): the lower the probability of a source symbol, the higher the information conveyed by it
- I(mi) → 0 as pi → 1: a symbol with probability one carries no information
- I(mi) → ∞ as pi → 0: a symbol with probability zero carries an infinite amount of information (but it never occurs)
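These four properties can be verified numerically; a minimal sketch (the probability values are arbitrary):

    from math import isclose, log2

    def I(p):
        return log2(1.0 / p)

    assert I(0.3) >= 0.0            # information is non-negative
    assert I(0.2) > I(0.4)          # pi > pj implies I(mi) < I(mj)
    assert isclose(I(1.0), 0.0)     # a certain symbol carries no information
    print(I(1e-12))                 # about 39.9 bits: grows without bound as pi -> 0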
Physical Interpretation
The information content of a symbol or message equals the minimum number of binary digits required to encode it and hence has a unit of bits.
Binary equiprobable symbols m1, m2 → 0, 1: a minimum of one binary digit (one bit) is required to represent each symbol. This equals the information content of each symbol: I(m1) = I(m2) = log2 2 = 1 bit.
Four equiprobable symbols m1, m2, m3, m4 → 00, 01, 10, 11: a minimum of two bits is required to represent each symbol. This equals the information content of each symbol: I(m1) = I(m2) = I(m3) = I(m4) = log2 4 = 2 bits.
In general, for q equiprobable symbols mi, 1 ≤ i ≤ q, the minimum number of bits to represent each symbol is log2 q, which equals the information content of each symbol: I(mi) = log2 q bits.
Using log2 q bits for each symbol is called binary coded decimal (BCD).
Equiprobable case: if mi, 1 ≤ i ≤ q, are equiprobable, BCD is good. Non-equiprobable case?
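A quick numerical check of the equiprobable cases (the five-symbol alphabet at the end is an extra example of mine, where the bit count must be rounded up):

    from math import ceil, log2

    for q in (2, 4, 8):
        print(q, "equiprobable symbols:", log2(q), "bits per symbol")  # 1.0, 2.0, 3.0

    q = 5                                            # alphabet size not a power of two
    print("BCD uses", ceil(log2(q)), "bits per symbol,",
          "information content is", round(log2(q), 2), "bits")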
Consider a long message of N symbols: symbol mi occurs about pi N times, and each occurrence conveys log2(1/pi) bits. The average information per symbol is therefore
I = (1/N) Σ(i=1..q) pi N log2(1/pi) = Σ(i=1..q) pi log2(1/pi)  (bits/symbol)
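A minimal simulation of this averaging (the three-symbol probabilities and the message length N are arbitrary choices for the example): draw N symbols according to the given probabilities, total up the per-symbol information, divide by N, and compare with the formula:

    import numpy as np

    p = np.array([0.5, 0.25, 0.25])                  # assumed symbol probabilities
    N = 100000                                       # assumed message length
    msg = np.random.choice(len(p), size=N, p=p)      # a long message of N symbols
    empirical = np.sum(np.log2(1.0 / p[msg])) / N    # total information divided by N
    formula = np.sum(p * np.log2(1.0 / p))           # sum of pi*log2(1/pi)
    print(empirical, formula)                        # both close to 1.5 bits/symbol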
Entropy
The entropy of a memoryless source is defined as the average information per symbol:
H = Σ(i=1..q) pi log2(1/pi)  (bits/symbol)
For a source with an alphabet of q symbols, symbol rate Rs (symbols/s) and entropy H, the information rate is R = Rs H (bits/s). If each symbol is encoded by log2 q bits, i.e. binary coded decimal, the average output bit rate is Rs log2 q. Note that the information rate R is always smaller than or equal to the average output bit rate of the source (hint: H ≤ log2 q).
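A numerical sketch of this comparison (the four-symbol probabilities and the symbol rate of 1000 symbols/s are assumptions made for illustration):

    from math import log2

    p = [0.5, 0.25, 0.125, 0.125]              # assumed symbol probabilities, q = 4
    Rs = 1000.0                                # assumed symbol rate (symbols/s)

    H = sum(pi * log2(1.0 / pi) for pi in p)   # entropy: 1.75 bits/symbol
    R = Rs * H                                 # information rate: 1750 bits/s
    bcd = Rs * log2(len(p))                    # BCD output bit rate: 2000 bits/s
    print(R, bcd)                              # R <= Rs*log2(q) since H <= log2(q)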
Summary
Overview of a digital communication system: system building blocks
Appreciation of information theory
Information content of a symbol, properties of information
Memoryless source with independent symbols: entropy and source information rate