
K.L. UNIVERSITY
B.TECH COURSE HANDOUT
Fifth Semester 2011-12, Academic Division

Dated: 05-07-2011

Branch and Year: III/IV ECE, 2011-2012
Semester: V
Course Title: Information Theory & Coding
Course Code:
Course Structure: 3-0-0
Course Coordinator: Mr. V.V.S. Murthy
Instructors: Mr. V.V.S. Murthy & Mr. S. Ganesan
1. Course Description:
This course is a thorough introduction to information theory. Topics include information measures; data compression to the entropy limit (source coding) and various source coding algorithms; differential entropy and maximum entropy; rate-distortion theory; alternating minimization, variational inference, and information geometry; computing the capacity of a channel; network information theory; and the Gaussian channel. Key idea: the movements and transformations of information, just like those of a fluid, are constrained by mathematical and physical laws. These laws have deep connections with:
- probability theory;
- thermodynamics (statistical physics);
- spectral analysis and Fourier (and other) transforms;
- sampling theory, prediction, and estimation theory;
- electrical engineering (bandwidth, signal-to-noise ratio);
- error detection and correction (linear block codes, convolutional codes);
- signal processing, representation, and compressibility.
2. Scope and Objective of the Course:

The aim of this course is to introduce the principles and applications of information theory. The course studies how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error correcting codes; and how discrete channels and measures of information generalize to their continuous forms.
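In standard notation, the identities relating these joint, conditional, and marginal entropies, together with the definition of channel capacity that the course builds toward, are:

```latex
\begin{align*}
H(X,Y) &= H(X) + H(Y \mid X), \\
I(X;Y) &= H(X) - H(X \mid Y) = H(X) + H(Y) - H(X,Y), \\
C &= \max_{p(x)} I(X;Y).
\end{align*}
```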

This course covers the basics of information theory and coding. Information theory considers two fundamental questions:
1. What is the ultimate data compression? (Answer: the entropy H of the data is its compression limit.)
2. What is the ultimate transmission rate of communication? (Answer: the channel capacity C is its rate limit.)
All communication schemes lie between these two limits on the compressibility of data and the capacity of a channel, and information theory suggests means of approaching these theoretical limits. But the subject also extends far beyond communication theory. By the end of this course we will have answered both questions and provided basic guidelines for designing practical systems that come close to the fundamental limits. The course begins by defining the fundamental quantities of information theory: entropy and mutual information. It then treats data compression (source coding), followed by reliable communication over noisy channels (channel coding). The final topic is rate-distortion theory (lossy source coding).
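To make these two limits concrete, here is a minimal Python sketch (illustrative, not drawn from the prescribed texts) that computes the entropy H of a discrete source and the capacity of a binary symmetric channel, for which C = 1 - H(p):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Compression limit of a four-symbol source, in bits per symbol.
source = [0.5, 0.25, 0.125, 0.125]
print(entropy(source))        # 1.75

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel with crossover p."""
    return 1 - entropy([p, 1 - p])

print(bsc_capacity(0.1))      # ~0.531 bits per channel use
```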

Objectives:
At the end of the course students should be able to:
- calculate the information content of a random variable from its probability distribution;
- relate the joint, conditional, and marginal entropies of variables in terms of their coupled probabilities;
- define channel capacities and properties using Shannon's theorems;
- construct efficient codes for data on imperfect communication channels;
- generalize the discrete concepts to continuous signals on continuous channels;
- design the encoder and decoder for different channel coding methods.

3. Books:
TEXT BOOKS:
1. K. Sam Shanmugam, Digital and Analog Communication Systems, John Wiley, 1996.
2. Simon Haykin, Digital Communication, John Wiley, 2003.
REFERENCE BOOKS:
1. Ranjan Bose, Information Theory, Coding and Cryptography, TMH, 2nd edition, 2007.
2. Glover and Grant, Digital Communications, Pearson Education, 2nd edition, 2008.

4. Syllabus:
UNIT-1: INTRODUCTION. Measure of information; average information content of symbols in long independent sequences; average information content of symbols in long dependent sequences; Markov statistical model for an information source; entropy and information rate of a Markov source.
UNIT-2: ENCODING OF THE SOURCE OUTPUT. Shannon's encoding algorithm; communication channels; discrete communication channels; continuous channels; source coding theorem; Huffman coding (see the sketch following this syllabus); discrete memoryless channels; mutual information; channel capacity.
UNIT-3: CHANNEL CODING THEOREM. Differential entropy and mutual information for continuous ensembles; channel capacity theorem; introduction to error control coding; types of errors with examples; types of codes; linear block codes: matrix description, error detection and correction, standard arrays and table look-up for decoding.
UNIT-4: BINARY CYCLIC CODES. Algebraic structures of cyclic codes; encoding using an (n-k)-bit shift register; syndrome calculation; BCH codes.
UNIT-5: RS CODES. Golay codes; shortened cyclic codes; burst error correcting codes; burst and random error correcting codes; convolutional codes: time domain approach and transform domain approach.
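As a concrete illustration of the Unit-2 source coding material, the following is a minimal Huffman coding sketch in Python (illustrative only, not taken from the prescribed texts). For a dyadic source distribution such as this one, the average code length meets the entropy limit exactly:

```python
import heapq

def huffman_code(freqs):
    """Build a Huffman code (symbol -> bitstring) from symbol probabilities."""
    # Each heap entry: (subtree weight, unique tiebreaker, partial codebook).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # merge the two least probable subtrees
        w2, i, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (w1 + w2, i, merged))
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code, avg_len)   # average length 1.75 bits/symbol equals the entropy here
```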

5. Course Plan:
(Columns of the original table: Lecture No., Learning Objectives, Topics to be covered, Reference, Page No. T1 and T2 are the text books and Ref-1 is the first reference book listed in Section 3.)

UNIT-I (Lectures 1-6)
Learning objectives: understand the fundamental concepts of the mathematical theory of communication and relate the measure of information to uncertainty; analyze the concept of entropy and its importance in measuring uncertainty; derive the condition for the maximum value of entropy and evaluate the extension of a discrete memoryless source; apply the concept of long dependent sequences and calculate their average information content and entropy; evaluate the information rate of a Markov chain.
Lecture 1: Measure of information (T1, Ch. 4, p. 140)
Lecture 2: Average information content of symbols in long independent sequences (T1, Ch. 4, p. 142)
Lecture 3: Average information content of symbols in long dependent sequences (T1, Ch. 4, p. 145)
Lectures 4-5: Markov statistical model for an information source; entropy of a Markov source (T1, Ch. 4, pp. 146, 150)
Lecture 6: Information rate of a Markov source (T1, Ch. 4, p. 151)

UNIT-II (Lectures 7-17)
Learning objectives: understand how the source output is modified using different encoding methods to support communication between transmitter and receiver; understand Shannon's source encoding method and calculate the entropy of different message signals; study the different communication channels and how information passes through them; calculate the rate at which information can be transmitted over a channel; calculate the capacity of discrete memoryless channels and distinguish channels with and without memory, and continuous from discrete channels; calculate the entropy of different source symbols using the source coding theorem; understand Huffman encoding and compare it with Shannon-Fano coding; analyze the mutual information between two signals with different probabilities; calculate channel capacity for various channels and find the maximum capacity.
Lecture 7: Encoding of the source output (T1, Ch. 4, p. 156)
Lecture 8: Shannon's encoding algorithm (T1, Ch. 4, p. 157)
Lecture 9: Communication channels (T1, Ch. 4, p. 162)
Lecture 10: Rate of information transmission over a discrete channel (T1, Ch. 4, p. 167)
Lectures 11-12: Capacity of discrete memoryless channels; discrete channels with memory (T1, Ch. 4, pp. 170, 173)
Lecture 13: Continuous channels (T1, Ch. 4, p. 174)
Lecture 14: Source coding theorem (T2, Ch. 2, p. 20)
Lecture 15: Huffman coding (T2, Ch. 2, p. 25)
Lecture 16: Mutual information (T2, Ch. 2, p. 31)
Lecture 17: Channel capacity (T2, Ch. 2, p. 35)

UNIT-III (Lectures 18-27)
Learning objectives: understand the difference between the source coding and channel coding theorems; analyze the differential entropy of continuous channels using the probability density function; analyze mutual information for continuous ensembles using the integral form; understand the channel capacity theorem; understand the various errors that occur in a communication channel and the error control methods available to correct them; study single-bit and two-bit errors in a communication channel; understand linear block codes for correcting single-bit errors and analyze message signals encoded with them; use syndrome calculation at the receiver to reproduce the input signal exactly; compare the received signal with a standard look-up table to locate the exact bit error.
Lecture 18: Channel coding theorem (T2, Ch. 2, p. 36)
Lecture 19: Differential entropy for continuous ensembles (T2, Ch. 2, p. 42)
Lecture 20: Mutual information for continuous ensembles (T2, Ch. 2, p. 42)
Lecture 21: Channel capacity theorem (T2, Ch. 2, p. 45)
Lecture 22: Introduction to error control coding (T1, Ch. 9, p. 443)
Lecture 23: Types of errors (T1, Ch. 9, p. 447)
Lecture 24: Types of codes (T1, Ch. 9, p. 448)
Lecture 25: Matrix description of linear block codes (T1, Ch. 9, p. 450)
Lecture 26: Error detection and correction capabilities of LBC (T1, Ch. 9, p. 454)
Lecture 27: Standard arrays and table look-up for decoding (T1, Ch. 9, p. 458); a worked sketch follows this plan

UNIT-IV (Lectures 28-33)
Learning objectives: understand cyclic codes as a subclass of linear block codes; understand the generator polynomial by which a cyclic code is completely described; design the encoder for given n, k values; calculate the syndrome vector for cyclic codes; analyze the error detection and correction capabilities of cyclic codes using the syndrome vector; construct cyclic codes from the roots of the generator polynomial.
Lecture 28: Introduction to binary cyclic codes (T1, Ch. 9, p. 461)
Lecture 29: Algebraic structures of cyclic codes (T1, Ch. 9, p. 462)
Lecture 30: Encoding using an (n-k)-bit shift register (T1, Ch. 9, p. 465)
Lecture 31: Syndrome calculation (T1, Ch. 9, p. 468)
Lecture 32: Error detection and error correction (T1, Ch. 9, p. 468)
Lecture 33: Special classes of cyclic codes: BCH codes (T1, Ch. 9, p. 471)

UNIT-V (Lectures 34-45)
Learning objectives: analyze RS codes in the time and frequency domains; analyze Golay codes through their polynomial and matrix representations; design shortened cyclic codes using shift registers; understand how burst errors occur and how they are corrected; understand burst and random errors in the received signal; understand convolutional coding and design a convolutional encoder using shift registers for a given polynomial; design the convolutional decoder to recover the original transmitted signal; compare the performance of different channel coding methods with convolutional codes; design the tree and trellis code mechanisms for given input message bits; design the Viterbi decoder for the received message bits and locate the error bits through the decoding procedure.
Lecture 34: Reed-Solomon codes (Ref-1, Ch. 5, p. 174)
Lecture 35: Golay codes (Ref-1, Ch. 4, p. 147)
Lecture 36: Shortened cyclic codes (Ref-1, Ch. 4, p. 144)
Lecture 37: Burst error correcting codes (T1, Ch. 9, p. 473)
Lecture 38: Burst and random error correcting codes (T1, Ch. 9, p. 476)
Lecture 39: Convolutional codes: introduction (T1, Ch. 9, p. 478)
Lecture 40: Encoders for convolutional codes (T1, Ch. 9, p. 479)
Lecture 41: Decoders for convolutional codes (T1, Ch. 9, p. 480)
Lecture 42: Performance of convolutional codes (T1, Ch. 9, p. 486)
Lecture 43: Tree codes and trellis codes (Ref-1, Ch. 6, p. 190)
Lectures 44-45: Viterbi decoding of convolutional codes (Ref-1, Ch. 5, p. 207)
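As a worked companion to the linear block code lectures above (matrix description, syndrome calculation, and table look-up decoding), here is a minimal Python sketch of the (7,4) Hamming code. The generator and parity-check matrices follow the standard systematic construction and are illustrative; they are not reproduced from the prescribed texts:

```python
import numpy as np

# (7,4) Hamming code in systematic form: G = [I | P], H = [P^T | I].
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 0, 1, 1],
              [0, 0, 1, 0, 1, 1, 1],
              [0, 0, 0, 1, 1, 0, 1]])
H = np.array([[1, 0, 1, 1, 1, 0, 0],
              [1, 1, 1, 0, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

msg = np.array([1, 0, 1, 1])
codeword = msg @ G % 2            # encode: c = mG (mod 2)

received = codeword.copy()
received[2] ^= 1                  # inject a single-bit channel error

syndrome = H @ received % 2       # s = Hr; nonzero signals an error
# Table look-up: the syndrome equals the column of H at the error position.
for i, col in enumerate(H.T):
    if np.array_equal(col, syndrome):
        received[i] ^= 1          # flip the located bit to correct it
        break
print(np.array_equal(received, codeword))  # True: the error was corrected
```

Because every nonzero syndrome matches exactly one column of H, this code corrects any single-bit error, which is the table look-up decoding idea treated in lectures 27 and 31-32.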

6. Self learning material:


S.No.  Unit  Topic              Source
1      II    The Shannon limit  Ref-1, Chapter 2, p. 71
2      III   Perfect codes      Ref-1, Chapter 3, p. 112

7. Evaluation Scheme:
Component                           Duration (minutes)   Weightage   Marks   Date & Time   Venue
Test-1 (20)                         50                   10%         100
Test-2 (20)                         50                   10%         100
Assignment Test (20)                50                   10%         100
Quiz (20)                           30                   5%          100
Comprehensive Exam (100)            180                  60%         100
Attendance for theory classes (5)                        5%          100

8. Chamber consultation hour: Will be informed in class in the first week.
9. Notices: All notices regarding the course will be posted on the E-learning website.

Course Coordinator
