
INFORMATION THEORY

AND
CODING

Nitin Mittal
Head of Department
Electronics and Communication Engineering
Modern Institute of Engineering & Technology
Mohri, Kurukshetra

BHARAT PUBLICATIONS
135-A, Santpura Road, Yamuna Nagar - 135001
© Reserved with the publisher

All rights reserved. No part of this publication may be reproduced, stored in a retrieval
system or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording
or otherwise, without the prior permission of the publisher.

Dedicated to our Beloved Parents

First Edition: Aug. 2009


Second Edition: June 2010

Price: Rs. 200.00

ISBN-13: 978-81-909129-3-8

Published by: Bharat Publications,
135-A, Santpura Road, Yamuna Nagar - 135001
Phone: 01732-227178, 232178, 94162-27140

Laser typesetting by: Chanda Computers, 9896471565

Legal Warning: The publisher has taken all possible precautions in publishing this book; in spite of these efforts, some errors might have crept in. Any mistake, error
or discrepancy noted may be brought to our knowledge and shall be taken care of in the next edition. It is notified that neither the publisher, nor the bookseller, nor the author will
be responsible for any damage or loss of action to anyone, of any kind, in any manner arising therefrom. For missing pages or misprints, the publisher undertakes to exchange
the copy within 15 days of purchase of the same edition. All costs in this connection are to be borne by the purchaser.
PREFACE

The present volume is the outcome of my experience in teaching information theory
to undergraduate classes for the last few years, and of my exposure to the problems faced
by students in grasping the abstract nature of the subject. This experience is the foundation
and, I hope, the strength of the text. Earnest efforts have been made to present the subject
matter in a well-knit manner, so as not only to stimulate the interest of the students but also
to provide them with an insight into the complexities of a subject of great intrinsic beauty.
The book is intended to serve as a text for undergraduate students, especially those
opting for a course in Electronics and Communication Engineering. However, postgraduate
students will find it equally useful.
This book offers a comprehensive review of Information Theory and Coding. The
main text can be divided into four sections: Probability Theory, Information Theory, Source
Coding and Error Control Coding. Fairly sufficient ground has been covered in all four
sections. Information theory is the study of achievable bounds for communication and is largely
probabilistic and analytic in nature; coding theory then attempts to realize the promise of these
bounds through models constructed mainly by algebraic means. Different concepts have been
explained with the help of examples, and a large number of problems with solutions have been
provided to help the reader get a firm grip on the ideas developed. The exercises offer plenty
of scope for the reader to try and solve problems on his own.
I am deeply indebted to all those authors whose research papers on Information
Theory and Coding influenced my learning of the subject, and I take this opportunity to express
my sincere gratitude to them. I am thankful to Dr. Rajesh Goel (Principal, MIET, Mohri);
Mr. R.S. Chauhan (Assistant Prof., JMIT, Radaur); Mr. Vikas Mittal (Sr. Lect. in HEC, Jagadhri)
and Mr. Amanjeet Panghal (Lect. in MIET, Mohri) for their motivation and continuous encouragement
during the preparation of this manuscript. I also wish to thank my colleagues and friends who
have given many valuable suggestions on the scope and contents of the book. I am also indebted
to M/s Bharat Publications, Yamuna Nagar, for bringing out the book in such a short time.
It is my earnest belief that no work is ever complete till it has had its share of
criticism, and hence I shall be glad to receive comments and suggestions for the betterment of
the book.
Author

FOREWORD

It is a great honour and an immense pleasure for me to write the foreword of a book on
Information Theory and Coding by one of my esteemed colleagues, Mr. Nitin Mittal.
Considering the needs of engineering students, and the fact that they hardly get any
exposure to translating technology into practical applications, a basic knowledge of Information
Theory and Coding is essential and deserves to be treated as a core subject in Electronics and
Communication Engineering. Covering the course material for such a vast and wide field demands
a comprehensive and easy-to-understand approach to the subject. In this book, the author has
put his best efforts in this direction. The matter has been presented in a well-structured manner
and in easy-to-understand language, so that it can be grasped easily by students of
different disciplines.
Attention has also been paid to the fact that the text as well as the diagrams can be
readily reproduced by students in theory examinations. A number of review questions given at the
end of each chapter will further enhance the understanding of the basic concepts.
I am sure that this book will be quite useful to undergraduate students in various
institutions, as well as to postgraduate aspirants.
With my best wishes to the author.

Dr. RAJESH GOEL


Principal,
MIET, Mohri
Kurukshetra

CONTENTS

Chapter 1. Probability Theory And Random Variables 1–52


1.1 Introduction 1
1.2 Probability Theory 2
1.2.1 Experiment 2
1.2.2 Sample Space And Events 2
1.2.3 Algebra of Events 3
1.3 Probability of Events 4
1.3.1 Properties of Probability 4
1.4 Conditional Probability 6
1.4.1 Conditional Probability of Independent Events 7
1.4.2 Bayes’ Formula 7
1.5 Random Variables 13
1.5.1. Discrete Random Variables 13
1.5.2. Continuous Random Variables 14
1.6 Probability Distribution of A Discrete Random Variable 14
1.7 Cumulative Distribution Function (CDF) 15
1.7.1 Properties of Cumulative Distribution Function 16
1.8 Probability Density Function (PDF) 17
1.8.1 Properties of Probability Density Function 17
1.9 Two-Dimensional Random Variables 20
1.9.1 Joint Distribution Function 20
1.9.2 Marginal Distribution Function 21
1.9.3 Independent Random Variables 21
1.9.4 Joint Probability Density Function 21
1.9.5 Marginal Probability Density Function 22
1.9.6 Conditional Probability Density Function 22
1.10 Functions of Random Variables 24
1.11 Statistical Averages of Random Variables 26
1.11.1 Expectation 26
1.11.2 Moments And Variance 27
1.11.3 Covariance And Correlation Coefficient 28
1.12 Some Important Distributions 28
1.12.1 The Uniform or Rectangular Distribution 28
1.12.2 The Exponential Distribution 29
1.12.3 Gaussian or Normal Distribution 30
1.12.4 Rayleigh Distribution 32
1.13. Characteristic Transformation Functions of Random Variables 34
1.13.1 Properties of Moment Generating Function 35
1.13.2 Determination of Statistical Averages Using MGF 36
1.14 Convergence of A Sequence of Random Variables 37
1.14.1 Law of Large Numbers 37
1.14.2 Central Limit Theorem 38

Chapter 2. Random Processes 53–86


2.1 Introduction 53
2.2. Random Processes 54
2.3 Statistical Averages of Random Process 55
2.3.1 Ensemble Averages 55
2.3.2 Time Averages 56
2.4 Stationary Random Process 57
2.4.1 Strictly Stationary Process 57
2.4.2 Wide Sense Stationary Process 58
2.5 Ergodic Process 58
2.5.1 Properties of Ergodic Random Process 59
2.6 Correlation Function 60
2.6.1 Auto-Correlation Function 61
2.6.2 Cross-Correlation Function 62
2.6.3 Auto Covariance Function 63
2.6.4 Cross Covariance Function 63
2.7 Spectral Densities 64
2.7.1 Power Spectral Density 65
2.7.2 Cross Power Spectral Density 67
2.7.3 Energy Spectral Density 67
2.8 Response of Linear Systems To The Input Random Processes 69
2.9 Special Classes of Random Processes 73
2.9.1 Gaussian Process 73
2.9.2 Markov Process 74
2.9.3 Poisson Process 75
2.9.4 White Noise or White Process 76
2.9.5. Band-Limited White Noise or Process 77

Chapter 3. Elements of Information Theory 87–134


3.1. Introduction 87
3.2. Information Sources 88
3.3. Information: A Measure of Uncertainty 88
3.4 Average Information or Entropy 89
3.4.1. Properties of Entropy 91
3.5 Binary Sources 94
3.6 Extension of A Discrete Memoryless Source 95
3.7 Measure of Information For Two-Dimensional Discrete Finite Probability Scheme 96
3.7.1 Discrete Memoryless Channels 98
3.8 Noise Characteristics of A Channel 101
3.9 Measure of Mutual Information 102
3.9.1 Relationship Among Various Entropies 103
3.9.2 Mutual Information 103
3.9.3 Properties of Mutual Information 104
3.10 Channel Capacity 107
3.11 Channel Capacity of Binary Noise Structures 107
3.11.1 Channel Capacity of A BSC (Binary Symmetric Channel) 108
3.11.2 Channel Capacity of A BEC (Binary Erasure Channel) 109
3.12 Differential Entropy And Mutual Information For Continuous Signals 110
3.13 Shannon’s Theorem On Coding For Memoryless Noisy Channel 113

Chapter 4. Source Encoding 135–184


4.1 Introduction 135
4.2 Source Encoding 136
4.3 Basic Properties of Codes 137
4.4 Separable Binary Codes 139
4.5 Shannon–Fano Encoding 141
4.6 Noiseless Coding Theorem 144
4.7 Theorem of Decodability 149
4.8 Average Length of Encoded Messages 150
4.9 Shannon’s Binary Encoding 152
4.10 Fundamental Theorem of Discrete Noiseless Coding 154
4.11 Huffman's Minimum-Redundancy Code 156

Chapter 5. Error Control Coding For Digital Communication System 185–206


5.1 Introduction 185
5.2 Elements of Digital Communication System 186
5.3 Mathematical Models For Communication Channels 192
5.4 Channel Codes 194
5.5 Modulation And Coding 196
5.6 Maximum Likelihood Decoding 200
5.7 Types of Errors 202
5.8 Error Control Strategies 203

Chapter 6. Error Detection And Correction 207–224


6.1 Introduction 207
6.2 Types of Errors 208
6.3 Error Detection 209
6.3.1 Parity Check 210
6.3.2 Cyclic Redundancy Check (CRC) 211
6.3.3 Checksum 213
6.4 Error Correction 215
6.4.1 Single-Bit Error Correction 215
6.4.2. Burst Error Correction 219

Chapter 7. Field Algebra 225–254


7.1 Introduction 225
7.2 Binary Operations 225
7.3 Groups 227
7.3.1. Commutative Group 227
7.3.2. Semi-Group 228
7.3.3. Order of A Group 228
7.3.4. Addition Modulo M 228
7.3.5. Multiplication Modulo M 228
7.3.6. General Properties of Groups 230
7.4 Fields 230
7.4.1 Characteristics of The Field 234
7.5 Binary Field Arithmetic 234
7.5.1 Irreducible Polynomial Over GF(2) 236
7.5.2 Primitive Polynomial Over GF(2) 237
7.6 Construction of Galois Field GF(2^m) 239
7.7 Basic Properties of Galois Field GF(2^m) 243
7.8 Vector Spaces 246
7.9 Matrices 249

Chapter 8. Linear Block Codes 255–282


8.1 Introduction 255
8.2 Repetition Code 256
8.2.1 Majority Voting Decoder 256
8.2.2 Single Error Correcting Repetition Code 256
8.2.3 Advantages And Disadvantages of Repetition Codes 257
8.3 Binary Block Codes 257
8.4 Linear Block Code 258
8.4.1 Systematic Linear Block Code 259
8.4.2 Encoder For Linear Block Code 262
8.4.3 Parity-Check Matrix 263
8.5 Syndrome Calculation For Linear Block Code 264
8.5.1 Properties of The Syndrome (S) 268
8.6 The Hamming Distance of A Block Code 270
8.7 Error-Detecting And Correcting Capabilities 271
8.8 Syndrome Decoding of Linear Block Code 273
8.9 Burst Error Correcting Block Codes 275
8.10 Other Important Block Codes 277
8.10.1 Hamming Codes 277
8.10.2 Extended Codes 278

Chapter 9. Cyclic Codes 283–308


9.1 Introduction 283
9.2 Cyclic Codes 284
9.3 Generator Polynomial of Cyclic Codes 285
9.4 Parity-Check Polynomial of Cyclic Codes 286
9.5 Systematic Cyclic Codes 288
9.6 Generator And Parity-Check Matrices of Cyclic Codes 290
9.7 Encoder For Cyclic Codes 292
9.8 Syndrome Polynomial Computation 295
9.9 Decoding of Cyclic Codes 297
9.10 Error-Trapping Decoding 299
9.11 Advantages And Disadvantages of Cyclic Codes 301
9.12 Important Classes of Cyclic Codes 301

Chapter 10. BCH Codes 309–338


10.1 Introduction 309
10.2 Binary BCH Codes 310
10.3 Generator Polynomial of Binary BCH Codes 310
10.4 Parity-Check Matrix of BCH Code 314
10.5 Encoding of BCH Codes 316
10.6 Properties of BCH Codes 318
10.7 Decoding of BCH Codes 318
10.7.1 Syndrome Computation 318
10.7.2 Error Location Polynomial 320
10.8 BCH Decoder Architecture 321
10.8.1 Peterson’s Direct Algorithm 322
10.8.2 Berlekamp’s Iterative Algorithm 326
10.8.3. Chien Search Algorithm 332
10.9 Non-Primitive BCH Code 333
10.10 Non-Binary BCH Codes And RS Codes 334

Chapter 11. Convolutional Codes 339–376


11.1 Introduction 339
11.2 Convolutional Codes 340
11.3 Convolutional Encoder 341
11.3.1 Encoding Using Discrete Convolution 342
11.3.2 Encoding Using Generator Matrix 344
11.4. Structural Properties of Convolutional Codes 346
11.4.1. Code-Tree Diagram 346
11.4.2 Trellis Diagram 348
11.4.3 State Diagram Representation 349
11.5 Decoding of Convolutional Code 350
11.5.1 Maximum-Likelihood Decoding 350
11.5.2 The Viterbi Decoding Algorithm 352
11.5.3 Sequential Decoding of Convolutional Codes 356
11.6 Distance Properties of Convolutional Codes 357
11.7 Burst Error Correcting Convolutional Codes 359

Chapter 12. Basic ARQ Strategies 377–388


12.1 Introduction 377
12.2 Automatic Repeat Request (ARQ) 378
12.3 Stop-And-Wait ARQ 379
12.4 Continuous ARQ 381
12.4.1 Go-Back-N ARQ 381
12.4.2. Selective Repeat ARQ 383
12.5 Hybrid ARQ 384

Chapter 13. Cryptography 389–404


13.1 Introduction 389
13.2 Cryptography 390
13.2.1 Need of Cryptography 390
13.2.2 Cryptographic Goals 390
13.2.3 Evaluation of Information Security 391
13.3 Cryptography Components 392
13.4 Symmetric Key Cryptography 393
13.4.1 Symmetric Key Encryption/Decryption 394
13.4.2 Techniques For Coding Plain Text to Cipher Text 394
13.4.3 Advantages And Disadvantages of Symmetric Key Cryptography 396
13.5 Asymmetric Key Cryptography 396
13.5.1 Public Key Encryption/Decryption 397
13.5.2 Conversion of Plain Text to Cipher Text Algorithms 398
13.5.3 Advantages And Disadvantages of Public-Key Cryptography 400
13.6 Comparison Between Symmetric And Public-Key Cryptography 401
13.7 Cryptography Applications 401

Sample Model Papers 405–407


References 408
Index 409–412

REFERENCES
1. A. Bruce Carlson, Janet C. Rutledge and Paul B. Crilly, "Communication Systems", 3rd Edition, McGraw-Hill, 2002.
2. A. Papoulis, "Probability, Random Variables and Stochastic Processes", McGraw-Hill, 1991.
3. Andrew S. Tanenbaum, "Computer Networks", 3rd Edition, Prentice Hall, Upper Saddle River, NJ, 1996.
4. B. P. Lathi, "Modern Digital and Analog Communication Systems", 3rd Edition, Oxford University Press, 2007.
5. B. R. Bhat, "Modern Probability Theory", New Age International Ltd., 1998.
6. Behrouz A. Forouzan, "Data Communication and Networking", Tata McGraw-Hill, 2003.
7. D. Stinson, "Cryptography: Theory and Practice", 2nd Edition, CRC Press, 2000.
8. Das, Mullick and Chatterjee, "Digital Communication", Wiley Eastern Ltd., 1998.
9. Fazlollah M. Reza, "An Introduction to Information Theory", Dover Publications, Inc., New York, 1994.
10. Gregory Karpilovsky, "Field Theory: Classical Foundations and Multiplicative Groups", Tata McGraw-Hill, 1988.
11. Herbert Taub and Donald L. Schilling, "Principles of Communication Systems", 3rd Edition, Tata McGraw-Hill, 2008.
12. J. E. Pearson, "Basic Communication Theory", Prentice Hall, Upper Saddle River, NJ, 1992.
13. John G. Proakis and Masoud Salehi, "Fundamentals of Communication Systems", Pearson Education, 2006.
14. R. E. Blahut, "Principles and Practice of Information Theory", Addison-Wesley, Reading, Mass., 1987.
15. R. P. Singh and S. D. Sapre, "Communication Systems – Analog and Digital", 2nd Edition, Tata McGraw-Hill, 2007.
16. R. M. Gray and L. D. Davisson, "Introduction to Statistical Signal Processing", Web Edition, 1999.
17. Robert G. Gallager, "Information Theory and Reliable Communication", McGraw-Hill, 1992.
18. Shu Lin and D. J. Costello, "Error Control Coding: Fundamentals and Applications", Prentice Hall, 1983.
19. Simon Haykin, "Communication Systems", 4th Edition, John Wiley & Sons, New York, 2001.
20. W. Stallings, "Cryptography and Network Security: Principles and Practice", 2nd Edition, Prentice Hall, 1999.
21. William Stallings, "Data and Computer Communications", 5th Edition, Prentice Hall, Upper Saddle River, NJ, 1997.

INDEX

A
Acknowledgement,
–negative, 379, 382
–positive, 379
Addition,
–modulo-2, 247, 265
–modulo-m, 228
–vector, 247
Additive White Gaussian Noise, 192, 197
Analog-to-Digital (A/D) Converter, 185
Arithmetic,
–binary field, 234
–Galois field, 252
ARQ,
–continuous, 204, 378, 381
–go-back-N, 204, 381
–hybrid, 204, 384
–selective-repeat, 204, 381
–stop-and-wait, 204, 378
–type-I hybrid, 204, 386
–type-II hybrid, 204, 387
Auto Correlation, 60

B
Bandwidth, 115, 192
BCH Codes, 303, 309
–binary, 310
–decoding, 318
–encoding, 316
–generator polynomial, 310
–non-binary, 334
–non-primitive, 333
–parity-check matrix, 315
–primitive, 311
–properties, 318
–syndrome, 318
–syndrome computation, 318
BCH Decoder, 321
Berlekamp Iterative Algorithm, 326
Binary Erasure Channel (BEC), 87
Binary Operation, 225
Binary Symmetric Channel (BSC), 87, 202
Block Codes,
–binary, 257
–burst-error-correcting, 275
–Hamming, 277
–interleaved, 276
–linear, 258
–random-error-correcting, 275
–systematic, 259
Burst Error, 202, 211
Burst Length, 208, 360

C
Central Limit Theorem, 38
Channel,
–additive white Gaussian noise, 115, 192
–binary erasure, 109
–binary symmetric, 108
–burst-error, 203
–deterministic, 101
–discrete memoryless, 98, 198
–lossless, 101
–noiseless, 102
Channel Capacity, 107
Channel Encoder, 189
Characteristic of Field, 234
Checksum, 213
Chien Search, 330
Cipher-text, 392
Code Efficiency, 137
Code Length, 136
Code Vector, 194
Codeword, 137, 188
Correlation Functions,
–auto correlation, 60
–cross correlation, 62
Complementary Error Function, 199
Constraint Length, 340
Convolutional Codes, 195, 339
–burst-error-correcting, 360
–constraint length, 340
–distance properties, 358
–generator matrix, 345
–structural properties, 346
–transfer function, 358
Coset, 274
Coset Leader, 274
Cryptography, 389
–applications, 401
–asymmetric key, 396
–symmetric key, 393
Cyclic Codes,
–decoding, 297
–encoder, 292
–generator matrix, 290
–generator polynomial, 285
–Meggitt decoder, 297, 299
–parity-check matrix, 291
–parity-check polynomial, 286
–syndrome, 295
–systematic, 288
Cyclic Shifts, 284

D
Decoders,
–channel, 191
–maximum-likelihood, 201
–Meggitt, 297, 299
–syndrome, 295
Decoding,
–BCH codes, 318
–cyclic codes, 297
–error-trapping, 299
–maximum likelihood, 200
–syndrome, 273
–Viterbi, 196, 352
Decryption, 393
Demodulator, 197
Deterministic Signals, 1
Digital-to-Analog (D/A) Conversion, 187, 191
Digital Communication System, 186
–channel decoder, 191
–channel encoder, 189
–decryption, 191
–demodulator, 191
–destination, 186
–encryption, 188
–information source, 186
–modulator, 190
–source decoder, 191
–source encoder, 188
–synchronisation, 191
Digital Signature, 397, 399
Discrete Memoryless Channel (DMC), 98, 198
–channel representation, 98
–channel transition probability, 99
–channel matrix, 99
Distance,
–Hamming, 270
–minimum, 270
Distance Properties of Convolutional Codes, 357
Distributions,
–exponential, 29
–Gaussian, 30
–Rayleigh, 32
–uniform, 28
Distribution Function,
–cumulative distribution function, 15
–joint distribution function, 20
–marginal distribution function, 21
DSA Algorithm, 399

E
Encoders,
–channel, 189
–convolutional code, 196, 339
–cyclic code, 292
–linear block code, 262
–source, 136
Encoding, 190
Encryption, 392
Entropy, 89
Ergodicity, 58
Error Control Strategies, 203
–automatic-repeat request (ARQ), 204, 377
–for digital storage system, 204
–forward error control, 203, 377
Error Correction Capability, 271
Error Detection Capability, 271
Error Location Numbers, 321
Error Location Polynomial, 318, 320
Error Patterns,
–correctable, 379, 385
–detectable, 379
–uncorrectable, 385
–undetectable, 265, 379
Errors,
–burst, 202, 277
–random, 202, 277
–transmission, 265
–undetected, 265, 378
Event, 3, 94
Expectation, 26
Experiment, 2, 94
–head, 2
–trial, 2
–tail, 2

F
Field,
–binary, 233
–finite, 233
–Galois, 233, 239
–prime, 233
Finite Field, 233
Forward Error Correction, 203
Fundamental Theorem of Discrete Noiseless Coding, 154

G
Galois Field, 233, 239
Galois Field Arithmetic, 252
Gaussian Process, 73
Generator Matrix,
–block codes, 260
–convolutional codes, 345
–cyclic codes, 291
Generator Polynomial,
–BCH codes, 310
–cyclic codes, 285
–Golay codes, 302
GF(2), 233
GF(2^m), 234
GF(p), 233
GF(q), 234
Golay Codes, 302
Group,
–commutative, 227

H
Hamming Bound, 273
Hamming Codes, 216, 277
Hamming Distance, 270
Hamming Weight, 270
Huffman's Minimum Redundancy Codes, 156
Hybrid ARQ Schemes, 204, 384, 385

I
Identity Element, 227
Identity Matrix, 263
Information Source, 186
–source alphabet, 188
–symbol, 188
Information Theory,
–average information, 89
–DMS, 88, 95
–information rate, 92
–information sources, 88
–information, 88
–source alphabet, 88
Interleaving, 277
Inverse,
–additive, 232
–multiplicative, 231
Irreducible Polynomial, 236
Iterative Algorithm, 326, 328

L
Law of Large Numbers, 37
Linear Block Codes,
–block code, 257
–generator matrix, 259
–linear code, 258
–parity-check matrix, 263
–systematic codes, 259
Linearity Property, 284
Linearly Dependent Vectors, 248
Linearly Independent Vectors, 248
Location Numbers, 321

M
Markov Process, 74
Matrix,
–channel, 99
–generator, 259
–identity, 263
–parity-check, 263
–transpose of, 249
Maximum Length Codes, 302
Maximum Likelihood Decoding, 200, 350
Mean, 26
Meggitt Decoder, 297, 299
Minimal Polynomial, 311
Minimum Distance, 270
Modulator, 190
Modulo-2 Addition, 265
Modulo-m Addition, 228
Modulo-m Multiplication, 228
Moments, 27
Multiplication,
–modulo-m, 228
–scalar, 246
Mutual Information, 102

N
Negative Acknowledgement, 379, 382
Newton's Identities, 323
(n, k) Block Code, 196, 257
(n, k, K) Convolutional Code, 196, 341
Noiseless Coding Theorem, 144
Non-binary BCH Codes, 334
Non-primitive BCH Codes, 333
n-tuple, 195
Null Space, 249

O
Optimal Codes, 139
Order,
–of a field, 231
–of a group, 228

P
Parity Check Matrix,
–BCH codes, 315
–block codes, 263
–cyclic codes, 291
Plain Text, 392
Polynomials,
–irreducible, 236
–minimal, 311
–primitive, 312
Poisson Process, 75
Positive Acknowledgement, 379
Power Spectral Density, 65
Prime Numbers, 233
Primitive BCH Codes, 311
Primitive Elements, 311
Primitive Polynomials, 237, 312
Probability,
–conditional, 6
–properties of, 4
–transition, 98, 199
Probability Density Function, 17
–conditional, 22
–joint, 21
–marginal, 22

R
Random Error Correcting Codes, 277
Random Errors, 277
Random Signals, 2
Random Process, 54
Random Variables, 13
–continuous, 14
–discrete, 13
Rayleigh Distribution, 32
Registers,
–buffer, 383
–message, 262
–shift, 341
–syndrome, 298
Repetition Code, 256
Representation of Galois Fields, 241
RSA Algorithm, 398
Retransmission, 378
Round-trip Delay, 382
Response of Linear Systems, 69

S
Sample Space, 2
Selective-Repeat ARQ, 204, 381, 383
Separable Binary Codes, 139
Sequence, 351
Shannon's Binary Code, 152
Shannon–Fano Coding, 141
Single Error Correcting Codes, 310
Source,
–decoder, 191
–digital, 187
–encoder, 188
–information, 186
Source Encoding,
–code efficiency, 137
–code length, 136
–code redundancy, 137
–distinct codes, 137
–instantaneous codes, 138
–Kraft–McMillan inequality, 140
–optimal codes, 139
–prefix codes, 139
–uniquely decodable codes, 138
–variable length codes, 137
Space, 246
Span, 248
Spectral Densities,
–power spectral densities, 65
–energy spectral densities, 67
State Diagram, 349
Stationary Process,
–strict sense, 57
–wide sense, 58
Substitution Techniques, 394
–mono alphabetic, 394
–poly alphabetic, 395
Syndrome,
–BCH codes, 318
–block codes, 264
–cyclic codes, 295
–decoding, 273
Syndrome Register, 298
Systematic Codes, 259

T
Theorem of Decodability, 149
Throughput Efficiency,
–go-back-N ARQ, 204, 383
–selective-repeat ARQ, 204, 384
–stop-and-wait ARQ, 204, 380
Transfer Function of Convolutional Codes, 358
Transition Probabilities, 99, 199
Transmission Errors, 265
Transposition Techniques, 395
Transpose of a Matrix, 249
Trellis Diagram, 348

V
Vectors, 246
Vector Addition, 247
Vector Space, 246
Viterbi Algorithm, 352
