
L6 - Discrete Memoryless Channels
Introduction
Communication channels can be classified in several ways:
❖ According to the physical structure of the channel: wired or wireless.
❖ According to the transfer function of the channel: linear or nonlinear.
❖ According to the time variation of the transfer function: time-variant or time-invariant.
❖ According to the nature of the information signal carried over the channel: continuous or discrete.
❖ In this chapter we are concerned with the discrete communication channel, which carries discrete information signals.

Discrete Memoryless Channel (DMC)


• This kind of channel is modeled through three basic items:
• Input Symbols: the set of input symbols emitted by the transmitter into the channel, denoted by {ai}, 1 ≤ i ≤ N, where N is the total number of symbols that can be emitted by the information source (located in the transmitter).
• Output Symbols: the set of output symbols delivered by the channel to the receiving end, denoted by {bj}, 1 ≤ j ≤ M, where M is the total number of symbols that appear at the destination (located in the receiver). In general, M need not equal N.
Forward Probability Matrix
• It represents the conditional probability of receiving symbol bj given that the transmitted symbol is ai, denoted by P(bj / ai).
• The forward probability is evaluated at the transmitter without knowledge of the actual behavior of the receiver; it is based only on the behavior of the channel.
• It is also called the a priori probability matrix.
• A memoryless channel is one in which the probability distribution of the output depends only on the input at the time instant under consideration and is conditionally independent of previous channel inputs and outputs (a small simulation sketch follows this list).
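The sketch below is a minimal Python illustration of this memoryless behaviour for a hypothetical binary channel; the forward-matrix values, the random seed, and the input sequence are illustrative assumptions, not taken from the notes. Each output symbol is drawn using only the forward-probability row selected by the current input, with no dependence on earlier inputs or outputs.

```python
import numpy as np

# Hypothetical forward probability matrix P(bj / ai) for a binary channel:
# row i lists the probabilities of each output symbol given input ai.
P_forward = np.array([[0.9, 0.1],    # P(b1/a1), P(b2/a1)
                      [0.2, 0.8]])   # P(b1/a2), P(b2/a2)

rng = np.random.default_rng(0)

def transmit(inputs, P):
    """Pass a sequence of input indices through a memoryless channel.

    Each output is sampled from the row of P selected by the *current*
    input only, so previous inputs and outputs have no influence.
    """
    return [rng.choice(len(P[i]), p=P[i]) for i in inputs]

inputs = [0, 0, 1, 0, 1, 1]          # indices of a1, a1, a2, a1, a2, a2
outputs = transmit(inputs, P_forward)
print(outputs)
```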
Discrete Channel Model
• A discrete channel can be modeled by two related methods. The first is a graphical model that connects the input symbols to the output symbols through the forward probabilities written on each connecting arrow, as shown in the figure.
• From the channel model shown, the forward probability matrix (the second of the two methods) can be constructed as follows:

P(B/A) = [ P(b1/a1)  P(b2/a1)  ...  P(bM/a1) ]
         [ P(b1/a2)  P(b2/a2)  ...  P(bM/a2) ]
         [   ...        ...    ...     ...   ]
         [ P(b1/aN)  P(b2/aN)  ...  P(bM/aN) ]
• Note: each row of the forward probability matrix must sum to one, i.e., P(b1/ai) + P(b2/ai) + ... + P(bM/ai) = 1 for every input symbol ai (checked in the sketch below).
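A minimal sketch of that row-sum check, assuming an illustrative channel with two input symbols and three output symbols (all probability values here are made up for the example):

```python
import numpy as np

# Illustrative forward matrix P(B/A): 2 input symbols (rows), 3 output symbols (columns).
P_forward = np.array([[0.7, 0.2, 0.1],
                      [0.1, 0.3, 0.6]])

# Each row is the conditional distribution P(. / ai), so it must sum to 1.
print("row sums:", P_forward.sum(axis=1))
assert np.allclose(P_forward.sum(axis=1), 1.0)
```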
Backward probability
When the system is observed from the receiver side (i.e., the symbols {bj} are received and we estimate the probability that each symbol {ai} was transmitted), the conditional probability involved is called the backward probability, also known as the a posteriori probability. It is given by Bayes' rule as follows (a short worked example appears after the definitions):

P(ai / bj) = P(bj / ai) P(ai) / P(bj)

where:
• P(ai / bj) is the backward probability, representing the probability that ai was transmitted given that the received symbol is bj.
• P(ai) is the input (transmitted) symbol probability.
• P(bj) is the output (received) symbol probability.
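A small worked example of the relation above, with assumed (illustrative) numbers for a binary channel: P(b1/a1) = 0.9, P(b1/a2) = 0.2, and input probabilities P(a1) = 0.6, P(a2) = 0.4.

```python
# Assumed forward probabilities and input probabilities (illustrative only).
P_b1_given_a1 = 0.9
P_b1_given_a2 = 0.2
P_a1, P_a2 = 0.6, 0.4

# Total probability of receiving b1 (the output probability, derived
# formally at the end of this section).
P_b1 = P_b1_given_a1 * P_a1 + P_b1_given_a2 * P_a2     # = 0.62

# Backward (a posteriori) probability of a1 given that b1 was received.
P_a1_given_b1 = P_b1_given_a1 * P_a1 / P_b1            # = 0.54 / 0.62 ≈ 0.871
print(P_b1, P_a1_given_b1)
```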
Joint Probability
• It represents the probability of two events occurring together: transmitting symbol ai and receiving symbol bj, which means that both the transmitter and the receiver are observed. The joint probability can be obtained as follows (see the sketch after this definition):

P(ai , bj) = P(bj / ai) P(ai) = P(ai / bj) P(bj)

where P(ai , bj) is the joint probability of transmitting symbol ai and receiving symbol bj.
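Continuing with the same assumed binary-channel numbers, this sketch builds the full joint matrix from the transmitter-side factorization P(ai , bj) = P(bj / ai) P(ai) and checks that its entries form a valid distribution over all (input, output) pairs:

```python
import numpy as np

P_forward = np.array([[0.9, 0.1],   # assumed P(bj/ai)
                      [0.2, 0.8]])
P_a = np.array([0.6, 0.4])          # assumed P(ai)

# Joint matrix: P(ai, bj) = P(bj/ai) * P(ai), one entry per (input, output) pair.
P_joint = P_forward * P_a[:, None]

# Taken together, the joint probabilities cover every possible (ai, bj) event.
assert np.isclose(P_joint.sum(), 1.0)
print(P_joint)
```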
Backward Probabilities
• Based on the last equation, the backward probability can be expressed in terms of the joint probability as follows (continued in the sketch below):

P(ai / bj) = P(ai , bj) / P(bj)
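Again with the same assumed numbers, the backward matrix is obtained by dividing each column of the joint matrix by the corresponding output probability; the (a1, b1) entry reproduces the value found in the earlier scalar example:

```python
import numpy as np

P_forward = np.array([[0.9, 0.1],   # assumed P(bj/ai)
                      [0.2, 0.8]])
P_a = np.array([0.6, 0.4])          # assumed P(ai)

P_joint = P_forward * P_a[:, None]  # P(ai, bj) from the previous relation
P_b = P_joint.sum(axis=0)           # P(bj), derived formally in the next subsection

# Backward matrix: divide each column of the joint matrix by P(bj).
P_backward = P_joint / P_b[None, :]
print(P_backward[0, 0])             # P(a1/b1) = 0.54/0.62 ≈ 0.871, as before
```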
Output Probabilities
It represents the set of received-symbol probabilities, denoted by P(bj), which depends on the transmitted symbols and is obtained as follows (see the sketch below):

P(bj) = Σi P(bj / ai) P(ai),   for j = 1, 2, ..., M

where the sum runs over the input symbols i = 1, 2, ..., N.
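With the same assumed forward matrix and input probabilities as in the earlier sketches, the output probabilities follow from a single vector-matrix product; the numbers are made up for illustration, not taken from the notes:

```python
import numpy as np

P_forward = np.array([[0.9, 0.1],   # assumed P(bj/ai)
                      [0.2, 0.8]])
P_a = np.array([0.6, 0.4])          # assumed P(ai)

# P(bj) = sum over i of P(bj/ai) P(ai): a vector-matrix product.
P_b = P_a @ P_forward
print(P_b)                          # [0.62 0.38]

# The output probabilities form a valid distribution over the received symbols.
assert np.isclose(P_b.sum(), 1.0)
```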