
Channel capacity is simply the rate at which information can be transmitted through a channel. A communication channel is the path or medium through which the symbols flow to the receiver.
I = ktB (Hartley's law)
where
I = amount of information to be sent
k = a constant
t = time available
B = channel bandwidth
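A minimal Python sketch of this proportionality; the function name is mine, and k is left as a parameter since Hartley's law only fixes it as a system-dependent constant:

```python
def hartley_information(k, t, b):
    """Hartley's law: information is proportional to time x bandwidth."""
    return k * t * b

# k is system-dependent; the value 1 here is purely illustrative.
print(hartley_information(k=1, t=2.0, b=3_000))  # 6000.0 units of information
```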
C = 2B log2 M (Nyquist)
where
C = information capacity in bits per second
B = the channel bandwidth in hertz
M = number of levels transmitted
For binary transmission (M = 2), this reduces to C = 2B.
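A minimal Python sketch of the Nyquist formula under these definitions; the function and variable names are illustrative:

```python
import math

def nyquist_capacity(bandwidth_hz, levels):
    """Noiseless channel capacity: C = 2B log2(M)."""
    return 2 * bandwidth_hz * math.log2(levels)

print(nyquist_capacity(3_000, 2))  # binary case, C = 2B: 6000.0 bps
print(nyquist_capacity(3_000, 4))  # four levels: 12000.0 bps
```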
C = B log2 (1 + S/N) (Shannon-Hartley)
where
C = information capacity in bits per second
B = bandwidth in hertz
S/N = signal-to-noise ratio (as a power ratio, not in decibels)
To evaluate the base-2 logarithm on a base-10 calculator, use the change-of-base identity:
log2 N = log10 N / log10 2
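A sketch of the Shannon limit in Python, together with the change-of-base identity above; names are illustrative:

```python
import math

def shannon_capacity(bandwidth_hz, snr_ratio):
    """Shannon limit: C = B log2(1 + S/N), with S/N as a power ratio."""
    return bandwidth_hz * math.log2(1 + snr_ratio)

def db_to_ratio(snr_db):
    """Convert a decibel S/N figure to a plain power ratio."""
    return 10 ** (snr_db / 10)

# log2 via base-10 logs, as in the identity above:
assert abs(math.log2(1000) - math.log10(1000) / math.log10(2)) < 1e-9

print(shannon_capacity(2_700, db_to_ratio(30)))  # ~26,900 bps
```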
A lossless channel is described by a channel matrix with only one non-zero element in each column; its capacity is the maximum source entropy,
C = log2 N
where N is the number of source symbols.
A deterministic channel is described by a channel matrix with only one non-zero element in each row; its capacity is the maximum destination entropy,
C = log2 M
where M is the number of destination symbols.
A noiseless channel is both lossless and deterministic, so
C = log2 N = log2 M
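A sketch, under the conventions above (rows = source symbols, columns = destination symbols), of classifying a channel matrix by the one-non-zero-entry tests and reporting its capacity; the helper name is hypothetical:

```python
import math

def channel_capacity(matrix):
    """Capacity of a lossless, deterministic, or noiseless channel.

    matrix[i][j] = P(output j | input i).
    """
    n_rows, n_cols = len(matrix), len(matrix[0])
    lossless = all(sum(1 for i in range(n_rows) if matrix[i][j] > 0) == 1
                   for j in range(n_cols))       # one non-zero per column
    deterministic = all(sum(1 for x in row if x > 0) == 1
                        for row in matrix)        # one non-zero per row
    if lossless:
        return math.log2(n_rows)   # C = log2 N (also the noiseless case)
    if deterministic:
        return math.log2(n_cols)   # C = log2 M
    raise ValueError("matrix is neither lossless nor deterministic")

# Identity matrix: a noiseless binary channel, C = 1 bit per symbol
print(channel_capacity([[1, 0], [0, 1]]))
```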
A binary symmetric channel is symmetric because the probability of receiving a 1 when a 0 is sent is the same as the probability of receiving a 0 when a 1 is sent; the common transition probability is denoted by p.
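For reference, the capacity of a binary symmetric channel follows from p as C = 1 - H(p), where H is the binary entropy function; this standard result is not restated in the notes above, so treat the sketch as supplementary:

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel: C = 1 - H(p) bits/symbol."""
    if p in (0.0, 1.0):
        return 1.0                 # H(p) = 0 at the endpoints
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1 - h

print(bsc_capacity(0.0))   # 1.0   (noiseless)
print(bsc_capacity(0.1))   # ~0.531
print(bsc_capacity(0.5))   # 0.0   (output independent of input)
```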
Sample problems:
1. What is the Shannon limit for information capacity of a standard voice-band communications channel with a signal-to-noise ratio of 30 dB and a bandwidth of 2.7 kHz?
2. What bandwidth is needed to support a capacity of 20,000 bps (using Shannon theory) when the signal-to-noise power ratio is 200?
3. A binary digital signal is to be transmitted at 10 kbps. What absolute minimum bandwidth is required to pass the fastest information change undistorted?
4. What bandwidth is needed to support a capacity of 128 kbps when the signal-to-noise power ratio is 100 dB?
5. A broadcast TV channel has a bandwidth of 6 MHz. Ignoring noise, calculate the maximum data rate that could be carried in a TV channel using a 16-level code, and determine the maximum possible signal-to-noise ratio in dB for the calculated data rate.
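A sketch working the five problems with the formulas above; variable names are mine and the values in the comments are rounded:

```python
import math

def shannon(b, snr):   # C = B log2(1 + S/N)
    return b * math.log2(1 + snr)

# 1. Shannon limit: B = 2.7 kHz, S/N = 30 dB = 10^(30/10) = 1000
print(shannon(2_700, 1_000))                # ~26,900 bps

# 2. Invert Shannon for bandwidth: B = C / log2(1 + S/N), S/N = 200
print(20_000 / math.log2(1 + 200))          # ~2,614 Hz

# 3. Nyquist for binary (M = 2): C = 2B, so B = C / 2
print(10_000 / 2)                           # 5,000 Hz

# 4. S/N = 100 dB = 10^10 as a power ratio
print(128_000 / math.log2(1 + 10**10))      # ~3,853 Hz

# 5. Nyquist with M = 16: C = 2 * 6 MHz * log2(16) = 48 Mbps,
#    then solve C = B log2(1 + S/N) for S/N at that rate
c5 = 2 * 6e6 * math.log2(16)
snr5 = 2 ** (c5 / 6e6) - 1                  # 2^8 - 1 = 255
print(c5, 10 * math.log10(snr5))            # 48 Mbps, ~24.1 dB
```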
