EXPERIMENT NO. 1
Theory:
1. What are a discrete memoryless source and a discrete memoryless
channel?
2. Write the definition, formula, and units for each of the following:
i) Information
ii) Entropy
iii) Information rate
iv) Mutual information
3. What are the different types of entropies?
Algorithm:
Part: I
1. Input the number of inputs of the channel.
2. Input the number of outputs of the channel.
3. Input the channel matrix. Test the condition that the sum of all the
elements of the matrix equals 1.
INFORMATION THEORY AND CODING TECHNIQUES
          m    n
H(X,Y) =  ∑    ∑   p(xi, yj) · log2 [ 1 / p(xi, yj) ]
         i=1  j=1
Part-II
A) Noise free channel
1. For a noise-free channel, enter only the diagonal elements of
the joint probability matrix. The condition mentioned in step 3
must still be satisfied.
2. Repeat steps 4 to 10.
B) Error free channel
1. A channel is said to be error free if the capacity of the
channel is greater than the entropy of the channel input. So first
calculate the capacity of the channel using the formula
Capacity C = log2 M bits/symbol
where M is the number of inputs of the channel.
2. Calculate the entropy of the input.
3. Compare the capacity of the channel with the channel input
entropy and test the condition for an error-free channel.
C) Binary Symmetric Channel
1. A BSC is characterized by
No. of inputs = No. of outputs = 2
2. Its conditional probability matrix is as follows:

             | 1-P    P  |
P(Y/X)  =    |  P    1-P |

3. Derive the joint probability matrix from this matrix by
multiplying it by
P(X) = [ 0 1 ]
So the matrix which we take as input from the user is

             | 0     0  |
P(X,Y)  =    | P    1-P |

where P is the probability which the user should enter.
4. Then repeat steps 4 to 8 to calculate all the required
quantities.
D) Noisy channel
1. Repeat the first two steps from Part I.
2. Enter the joint probability matrix, this time without checking
that the sum of all the elements equals 1.
3. Repeat all the remaining steps to calculate the required
quantities.
Conclusion:
1. Why do we calculate entropy?
2. What is the significance of conditional entropy?
EXPERIMENT NO. 2
Equipment:
PC with C as a programming tool
Theory:
Algorithm:
1) Accept the number of symbols and their respective probabilities.
Sort the probabilities in descending order.
2) Partition these probabilities into two groups whose sums are equal
or nearly equal.
3) Assign '0' to the upper group of symbols and '1' to the lower group
of symbols.
4) Repeat steps 2 and 3 until every group contains a single
probability.
5) To form each code, read the assigned bits from left to right.
Example:
Theory:
1) What are the types of source coding?
2) State the advantages of Huffman coding.
3) How do we calculate the average codeword length and efficiency?
Explain with a formula and an example.
Algorithm:
3) In the final reduction step, assign 0 to the source symbols
associated with the first probability and assign 1 to the second
probability.
4) Now go back and assign 0 and 1 to the second digit for the two
probabilities that were combined in the previous reduction step,
retaining all assignments made in step 3.
5) Keep working backward this way until the first column is reached.
Example:
Conclusion:
Write your own conclusion.
EXPERIMENT NO. 3
Theory:
Channel Coding: In channel coding, we map the incoming data
sequence to a channel input sequence. This encoding procedure is
carried out by the channel encoder. The encoded sequence is then
transmitted over the noisy channel. The channel output sequence at the
receiver is inverse mapped onto an output data sequence. This is called
the decoding procedure and is carried out by the channel decoder. Both
the encoder and the decoder are under the designer's control.
The encoder introduces redundancy in a prescribed manner, and
the decoder exploits this redundancy in a prescribed manner in order to
reconstruct the source sequence as accurately as possible. Channel
coding thus makes reliable communication over unreliable channels
possible. Channel coding is also called 'error control coding'.
Questions: 1) Explain the necessity of channel coding.
2) Define and explain the following terms.
1) Weight of the code
2) Minimum distance
3) Syndrome
4) Hamming bound.
Encoding and decoding procedure of the linear block code:
Consider the (n, k) systematic linear block code (LBC), where n is the
length of the codeword and k is the length of the message.
Encoding: Let us consider the (6,3) LBC with the following parity
matrix:

        | 1 0 1 |
P   =   | 0 1 1 |
        | 1 1 0 |

Here n = 6 and k = 3, hence the number of redundant bits is (n-k) = 3.
The generator matrix is G = [ Ik : P ]:

        | 1 0 0 1 0 1 |
G   =   | 0 1 0 0 1 1 |
        | 0 0 1 1 1 0 |
Now, from the above generator matrix and the message vectors, we can
calculate the codewords.
No. of message vectors = 2^k = 2^3 = 8
For the first message M0 = [ 0 0 0 ], the codeword C0 is

                     | 1 0 0 1 0 1 |
C0 = [ 0 0 0 ]  ×    | 0 1 0 0 1 1 |  = [ 0 0 0 0 0 0 ]
                     | 0 0 1 1 1 0 |
In the same way, calculate the code vectors for all eight message
vectors. All codewords are as follows:
Data Codeword
000 000000
001 001110
010 010011
011 011101
100 100101
101 101011
110 110110
111 111000
Decoding:
The parity check matrix is H = [ P^T : I(n-k) ], so the syndrome of a
received vector R is S = R · H^T (mod 2), where

          | 1 0 1 |
          | 0 1 1 |
H^T   =   | 1 1 0 |
          | 1 0 0 |
          | 0 1 0 |
          | 0 0 1 |

(the top three rows are the parity matrix P and the bottom three rows
are the identity I3). A zero syndrome indicates that the received
vector is a valid codeword.
Algorithm:
1) Enter the values of n and k.
2) Enter the parity matrix.
3) Calculate generator matrix G.
4) Enter the data matrix M, i.e. any one combination out of the
total possible combinations.
5) Calculate the code using C = M * G
6) In the same way calculate all the codewords.
Conclusion:
What are the other types of linear block codes? Also comment on the
error-correcting capability of linear block codes.
EXPERIMENT NO. 4
A code C is cyclic if
1. C is a linear code, and
2. any cyclic shift of a codeword is also a codeword.
Que. List the properties of cyclic codes and explain them in
order.
C(P) = Q(P)*G(P)
Dividing M(P) · P^(n-k) = P^6 + P^5 + P^3 by G(P) = P^3 + P + 1:

                  P^3 + P^2 + P + 1
                _____________________
P^3 + P + 1  )  P^6 + P^5 + P^3
                P^6 + P^4 + P^3
                ----------------
                      P^5 + P^4
                      P^5 + P^3 + P^2
                      ----------------
                      P^4 + P^3 + P^2
                      P^4 + P^2 + P
                      --------------
                      P^3 + P
                      P^3 + P + 1
                      ------------
                      1              (remainder)

Q(P) = P^3 + P^2 + P + 1
b) Decoding:
Let the received codeword be 1111001. The corresponding polynomial is
R(P) = P^6 + P^5 + P^4 + P^3 + 1

Dividing R(P) by G(P) = P^3 + P + 1:

                  P^3 + P^2 + 1
                _________________________
P^3 + P + 1  )  P^6 + P^5 + P^4 + P^3 + 1
                P^6 + P^4 + P^3
                ----------------
                      P^5 + 1
                      P^5 + P^3 + P^2
                      ----------------
                      P^3 + P^2 + 1
                      P^3 + P + 1
                      ------------
                      P^2 + P        (remainder)

Hence the syndrome is S(P) = P^2 + P.
In the next step we have to consider the error pattern E(P).
Let E = 1000000; in polynomial form, E(P) = P^6.

Dividing E(P) by G(P):

                  P^3 + P + 1
                _______________
P^3 + P + 1  )  P^6
                P^6 + P^4 + P^3
                ----------------
                      P^4 + P^3
                      P^4 + P^2 + P
                      --------------
                      P^3 + P^2 + P
                      P^3 + P + 1
                      ------------
                      P^2 + 1        (remainder)

So S'(P) = P^2 + 1. But in this case S(P) ≠ S'(P), so the error is not
present at position 6. The same situation occurs when we consider an
error at position 5.
C(P) = P^6 + P^5 + P^3 + 1
C = 1101001
Algorithm:
1. Start
2. Accept the values of n, k, G(P), and M(P).
3. Evaluate [M(P) · P^(n-k)] / G(P) and take the remainder.
Conclusion:
Que. 1. When is a linear code called a cyclic code?
2. What are the properties of the syndrome table?
EXPERIMENT NO. 5
i] Code tree
ii] Code trellis
Convolutional Encoding:
An input bit m1 is fed into the leftmost register. Using the generator
polynomials and the existing values in the remaining registers, the
encoder outputs n bits. Then all register values are bit-shifted to the
right (m1 moves to m0, m0 moves to m-1) and the encoder waits for the
next input bit. If there are no remaining input bits, the encoder
continues to produce output until all registers have returned to the
zero state. The figure below is a rate 1/3 (m/n) encoder with
constraint length (k) of 3. The generator polynomials are G1 = (1,1,1),
G2 = (0,1,1), and G3 = (1,0,1). Therefore, the output bits are
calculated (modulo 2) as follows:
n1 = m1 + m0 + m-1
n2 = m0 + m-1
n3 = m1 + m-1.
Questions:
1] Define constraint length & code rate.
2] Design a convolutional encoder for
constraint length N = 3
2] The encoder has memory M = K - 1 = 2 message bits. Hence, when the
third message bit enters the encoder, the first message bit is shifted
out of the register.
3] In the code tree, starting with 1 & 0: if there is a '1' in the
input sequence, go downward (shown by a dotted line) and note down the
code written on that line.
4] If there is a '0' in the input sequence, go upward (shown by a
solid line) and note down the code written on that line.
5] Thus, trace the code tree up to level = number of bits in the input
sequence to get the corresponding output sequence.
2] If there is a '1' in k0, trace the lower (i.e. dotted) line and
note down the code written above the line.
Thus for k0 = 1 1 0 1 0 0 0
we get n0 = 11 01 01 00 10 11 00
Conclusion:
Que. What are the disadvantages of the code tree and the advantages
of the code trellis?
EXPERIMENT NO. 6
Theory:
Convolutional decoding can be performed using the Viterbi
algorithm. A Viterbi decoder logically explores in parallel every
possible user data sequence. It encodes and compares each one against
the received sequence and picks the closest match: it is a maximum
likelihood decoder. To reduce the complexity (the number of possible
data sequences doubles with each additional data bit), the decoder
recognizes at each point that certain sequences cannot belong to the
maximum likelihood path and discards them. The encoder memory is
limited to K bits; a Viterbi decoder in steady-state operation keeps
only 2^(K-1) surviving paths. Its complexity increases exponentially
with the constraint length K.
Conclusion:
1) Why is Viterbi decoding efficient?
2) What is the difficulty in its application?
EXPERIMENT NO. 7
Aim: Study of radio link design
Theory:
A) LINK BUDGET ANALYSIS:
Link budget analysis is an important issue in the design of a
satellite communication system. A 'link budget' or 'link power budget'
is an accounting of all the gains and losses from the transmitter,
through the medium, to the receiver.
C) ANTENNA:
In a communication system, the propagation of the modulated signal is
accomplished by means of transmitting and receiving antennas.
Functions of transmitting antenna:
1. To convert the electrical signal to electromagnetic
signal.
2. To radiate the electromagnetic energy in desired
directions.
Functions of receiving antenna:
1. To convert the electromagnetic signal to electrical
signal.
Conclusion:
Que. How does link budget analysis help in the design of a
radio link?