
Shannon's Theorem

In information theory, the noisy-channel coding theorem establishes that however contaminated with noise interference a communication channel may be, it is possible to communicate digital data (information) nearly error-free up to a given maximum rate through the channel. This surprising result, sometimes called the fundamental theorem of information theory, or just Shannon's theorem, was first presented by Claude Shannon in 1948.
The Shannon limit or Shannon capacity of a communications channel is the theoretical
maximum information transfer rate of the channel, for a particular noise level.
http://en.wikipedia.org/wiki/Shannon_limit#Mathematical_statement
Shannon-Hartley Theorem
The Shannon-Hartley theorem is an application of Shannon's theorem.
Considering all possible multi-level and multi-phase encoding techniques, the Shannon-Hartley theorem states that the channel capacity C, meaning the theoretical tightest upper bound on the rate of clean (or arbitrarily low bit error rate) data that can be sent with a given average signal power S through an analog communication channel subject to additive white Gaussian noise of power N, is:

C = B log2(1 + S/N)
where
C is the channel capacity in bits per second;
B is the bandwidth of the channel in hertz;
S is the total signal power over the bandwidth, measured in watts or volts²;
N is the total noise power over the bandwidth, measured in watts or volts²; and
S/N is the signal-to-noise ratio (SNR) or the carrier-to-noise ratio (CNR) of the communication signal to the Gaussian noise interference, expressed as a linear power ratio (not as logarithmic decibels).
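As a minimal sketch of the formula above, the following Python snippet computes the capacity for an assumed example channel (a ~3 kHz voice-grade line with a 30 dB SNR; the function and parameter names are illustrative, not from the source):

```python
import math

def db_to_linear(db: float) -> float:
    """Convert a decibel power ratio to a linear power ratio."""
    return 10.0 ** (db / 10.0)

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity in bits per second: C = B * log2(1 + S/N).

    snr_linear must be the linear power ratio S/N, not decibels.
    """
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Assumed example: 3 kHz bandwidth, 30 dB SNR (a linear ratio of 1000).
snr = db_to_linear(30.0)
c = shannon_capacity(3000.0, snr)
print(f"Capacity: {c:.0f} bit/s")  # roughly 30 kbit/s
```

Note that the SNR must be converted from decibels to a linear ratio before applying the formula, as the text's final bullet emphasizes.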
http://en.wikipedia.org/wiki/Shannon-Hartley_theorem
March 25, 2009
Note: Hartley is Ralph Hartley, who is also the inventor of the Hartley oscillator.
