Aims:
The aim of this part of the course is to give you an understanding of
how communication systems perform in the presence of noise.
Objectives:
Reference books:
Course material:
http://www.ee.ic.ac.uk/dward/
Lecture 1

Definitions
Instantaneous power (normalised to R = 1 Ω):
$p = \frac{v^2(t)}{R} = i^2(t)\,R = g^2(t)$

Average power:
$P = \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{T/2} g^2(t)\, dt$
For a periodic signal with period $T_o$:
$P = \frac{1}{T_o}\int_{-T_o/2}^{T_o/2} g^2(t)\, dt$
Bandwidth
Bandwidth is measured from the magnitude-squared spectrum. Two common measures:
- 3 dB bandwidth
- null-to-null bandwidth
Phasors
General sinusoid:
$x(t) = A\cos(2\pi f t + \phi)$

Phasor representation:
$x(t) = \mathrm{Re}\{A\,e^{j\phi}\,e^{j2\pi f t}\}$

Alternative representation (sum of counter-rotating phasors):
$x(t) = \frac{A}{2}e^{j\phi}e^{j2\pi f t} + \frac{A}{2}e^{-j\phi}e^{-j2\pi f t}$
Summary
Signals:
$x(t) = A\cos(2\pi f t + \phi) = \frac{A}{2}e^{j\phi}e^{j2\pi f t} + \frac{A}{2}e^{-j\phi}e^{-j2\pi f t}$
Average power:
$P = \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{T/2} x^2(t)\, dt$
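The sinusoid above has average power $A^2/2$; a minimal numerical check (my own illustration, with arbitrary parameter values):

```python
import numpy as np

# Average power of A*cos(2*pi*f*t + phi) approaches A**2 / 2
# as the averaging window grows (arbitrary example values).
A, f, phi = 2.0, 50.0, 0.3
fs = 10_000                   # sample rate, Hz
t = np.arange(0, 10, 1 / fs)  # 10-second averaging window
x = A * np.cos(2 * np.pi * f * t + phi)

P_numeric = np.mean(x**2)     # discrete form of (1/T) * integral of x^2 dt
print(P_numeric, A**2 / 2)    # ~2.0 vs 2.0
```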
Lecture 2

Sources of noise
1. External noise:
   - synthetic (e.g. other users)
   - atmospheric (e.g. lightning)
   - galactic (e.g. cosmic radiation)
2. Internal noise:
   - shot noise
   - thermal noise
Thermal noise power:
$P = kTB$
Equivalent noise temperature:
$T_e = \frac{P}{kB}$
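A quick numerical check of $P = kTB$ (illustrative values, not from the notes):

```python
# Thermal noise power and the equivalent noise temperature recovered from it.
k = 1.380649e-23      # Boltzmann constant, J/K
T = 290.0             # standard reference temperature, K
B = 1e6               # bandwidth, Hz

P = k * T * B
T_e = P / (k * B)     # equivalent noise temperature
print(f"P = {P:.3e} W, T_e = {T_e:.1f} K")   # ~4.0e-15 W, 290.0 K
```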
Gaussian noise
Noise model: the noise amplitude is modelled as a Gaussian random variable.

Motivation:
[Figure: binary data and data plus noise; noise can push a transmitted 0 above the decision threshold (error for 0) or a transmitted 1 below it (error for 1).]
Statistical averages
Mean:
$E\{x\} = \int_{-\infty}^{\infty} x\, p_x(x)\, dx$
Probability of x lying in an interval:
$P(x_1 < x < x_2) = \int_{x_1}^{x_2} p_x(x)\, dx$
Gaussian pdf:
$p_x(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-m)^2/2\sigma^2}$
In general, if y = g(x), then
$E\{y\} = E\{g(x)\} = \int_{-\infty}^{\infty} g(x)\, p_x(x)\, dx$
For example, the mean square amplitude of a signal is the mean of the square of the amplitude, i.e., $E\{x^2\}$.
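A quick Monte Carlo check of the $E\{g(x)\}$ formula (my own sketch, arbitrary values of m and sigma): for Gaussian x, $E\{x^2\} = m^2 + \sigma^2$.

```python
import numpy as np

# Estimate E{g(x)} by sampling, with g(x) = x^2 and x ~ N(m, sigma^2).
rng = np.random.default_rng(0)
m, sigma = 1.0, 2.0
x = rng.normal(m, sigma, size=1_000_000)

print(np.mean(x**2))      # ~5.0 (sample estimate)
print(m**2 + sigma**2)    # 5.0  (theory)
```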
Averages of a random process
Time average:
$\overline{n(t)} = \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{T/2} n(t)\, dt$
Ensemble average (over the pdf of the random variable):
$E\{n\} = \int_{-\infty}^{\infty} n\, p(n)\, dn$
For an ergodic process, time averages equal ensemble averages, e.g.
$E\{n(t)\} = \overline{n(t)}$
The time average gives the DC component; the time average of $n^2(t)$ gives the average power.
A stationary process has statistics that do not change with time.
Autocorrelation
$R_x(\tau) = E\{x(t)\,x(t+\tau)\}$
Example: $x(t) = A\,e^{j(\omega_c t + \theta)}$
NOTE: Average power is
$P = E\{x^2(t)\} = R_x(0)$

Frequency content
Wiener-Khinchine theorem: the power spectral density is the Fourier transform of the autocorrelation:
$S_x(f) = \int_{-\infty}^{\infty} R_x(\tau)\, e^{-j2\pi f\tau}\, d\tau = \mathrm{FT}\{R_x(\tau)\}$
Hence,
$R_x(\tau) = \int_{-\infty}^{\infty} S_x(f)\, e^{j2\pi f\tau}\, df$
Average power:
$P = R_x(0) = \int_{-\infty}^{\infty} S_x(f)\, df$
[Figure: original speech vs. speech filtered by H(f), a 1 kHz LPF, showing the smaller bandwidth.]
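As a numerical illustration (my own, not from the notes) of $P = R_x(0) = \int S_x(f)\,df$: for sampled data, the mean of the periodogram equals the mean-square value of the signal (Parseval).

```python
import numpy as np

# P = Rx(0) = integral of Sx(f), in discrete form.
rng = np.random.default_rng(1)
N = 4096
x = rng.normal(0.0, 1.0, N)

Rx0 = np.mean(x**2)                 # autocorrelation at lag 0 = average power
Sx = np.abs(np.fft.fft(x))**2 / N   # periodogram estimate of the PSD
print(Rx0, np.mean(Sx))             # the two agree exactly (Parseval)
```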
Summary
Thermal noise:
$P = kTB = \int_{-B}^{B}\frac{kT}{2}\, df$
White noise:
$S(f) = \frac{N_o}{2}$
Expectation operator (of a function of a random variable):
$E\{g(x)\} = \int_{-\infty}^{\infty} g(x)\, p_x(x)\, dx$
Autocorrelation:
$R_x(\tau) = E\{x(t)\,x(t+\tau)\},\qquad S_x(f) = \int_{-\infty}^{\infty} R_x(\tau)\, e^{-j2\pi f\tau}\, d\tau$
Lecture 3

Receiver
[Figure: receiver chain with a bandpass predetection filter followed by a detector; the message goes from baseband onto a carrier and back to baseband.]

Bandlimited Noise
- For any bandpass (i.e., modulated) system, the predetection noise will be bandlimited
- Bandpass noise signal can be expressed in terms of two baseband waveforms

Predetection filter:
- removes out-of-band noise
- has a bandwidth matched to the transmission bandwidth
PSD of n(t): bandpass, of total width 2W centred on the carrier.

Consider one spectral component of the noise:
$n_k(t) = a_k\cos(2\pi f_k t + \theta_k) = a_k\cos(\omega_k t + \theta_k)$
Let $\omega_k = (\omega_k - \omega_c) + \omega_c$:
$n_k(t) = a_k\cos[(\omega_k - \omega_c)t + \theta_k + \omega_c t]$
$n_k(t) = a_k\cos[(\omega_k - \omega_c)t + \theta_k]\cos(\omega_c t) - a_k\sin[(\omega_k - \omega_c)t + \theta_k]\sin(\omega_c t)$
The cosine term contributes to $n_c(t)$ and the sine term to the $n_s(t)$ term. Summing over all components:
$n(t) = \sum_k a_k\cos(\omega_k t + \theta_k)$
$n_c(t) = \sum_k a_k\cos[(\omega_k - \omega_c)t + \theta_k]$
$n_s(t) = \sum_k a_k\sin[(\omega_k - \omega_c)t + \theta_k]$

[Figures: example frequency-domain, time-domain, and histogram plots of n(t), nc(t) and ns(t) for W = 1000 Hz.]
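A small simulation (my construction, following the sum-of-sinusoids model above) that builds bandpass noise and checks the identity $n(t) = n_c(t)\cos(\omega_c t) - n_s(t)\sin(\omega_c t)$:

```python
import numpy as np

# Bandpass noise around fc as a sum of random sinusoids, with its
# baseband quadrature components nc and ns.
rng = np.random.default_rng(2)
fc, W, fs = 10_000.0, 1_000.0, 100_000.0
t = np.arange(0, 0.1, 1 / fs)

fk = rng.uniform(fc - W, fc + W, 200)     # component frequencies in fc +/- W
ak = rng.normal(0, 0.1, 200)              # random amplitudes
th = rng.uniform(0, 2 * np.pi, 200)       # random phases

n  = sum(a * np.cos(2*np.pi*f*t + p) for a, f, p in zip(ak, fk, th))
nc = sum(a * np.cos(2*np.pi*(f - fc)*t + p) for a, f, p in zip(ak, fk, th))
ns = sum(a * np.sin(2*np.pi*(f - fc)*t + p) for a, f, p in zip(ak, fk, th))

# Check n = nc*cos(wc t) - ns*sin(wc t)
recon = nc * np.cos(2*np.pi*fc*t) - ns * np.sin(2*np.pi*fc*t)
print(np.max(np.abs(n - recon)))          # ~0 (numerical precision)
```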
Average power
What is the average power in n(t)?
$n(t) = \sum_k a_k\cos(\omega_k t + \theta_k)$
Power in $a_k\cos(\omega t + \theta)$ is $E\{a_k^2\}/2$ (see Example 1.1, or study group sheet 2, Q1), so
$P_n = \sum_k \frac{E\{a_k^2\}}{2}$
From Lecture 2:
$P = \int_{-\infty}^{\infty} S(f)\, df = 2\int_{f_c-W}^{f_c+W} \frac{N_o}{2}\, df = 2N_oW$
(the factor of 2 counts one band for positive frequencies and one for negative)
Average power in nc(t) and ns(t):
$P_{n_c} = \sum_k \frac{E\{a_k^2\}}{2},\qquad P_{n_s} = \sum_k \frac{E\{a_k^2\}}{2}$
so nc(t) and ns(t) each have the same average power, $2N_oW$, as the bandpass noise n(t).

PSD of n(t): two bands of height $N_o/2$ and width 2W centred on $\pm f_c$. PSDs of nc(t) and ns(t): a single baseband band of height $N_o$ from $-W$ to $W$.
[Figure: zoomed example of nc(t) and ns(t) for W = 1000 Hz.]

Phasor representation
Let $g(t) = n_c(t) + j\,n_s(t)$,
so $n(t) = \mathrm{Re}\{g(t)\, e^{j\omega_c t}\}$.
Performance measure:
$\mathrm{SNR_o} = \frac{\text{average signal power at receiver output}}{\text{average noise power at receiver output}}$

Baseband system
Noise power at the output of a baseband system of bandwidth W:
$P_N = \int_{-W}^{W}\frac{N_o}{2}\, df = N_oW$
Baseband SNR:
$\mathrm{SNR_{base}} = \frac{P_T}{N_oW}$

Summary
Baseband SNR:
$\mathrm{SNR_{base}} = \frac{P_T}{N_oW}$
Lecture 4

Amplitude modulation
Modulated signal:
$s(t)_{AM} = [A_c + m(t)]\cos(\omega_c t)$
[Figure: AM waveform, showing carrier amplitude $A_c$ and peak message amplitude $m_p$.]
DSB-SC
Synchronous detection.
Transmitted signal:
$s(t)_{DSB\text{-}SC} = A_c\, m(t)\cos(\omega_c t)$

Noise in DSB-SC
Predetection signal: transmitted signal plus bandlimited noise.
Receiver output (after synchronous detection):
$y(t) = A_c\, m(t) + n_c(t)$

SNR of DSB-SC
Output signal power:
$P_s = \overline{(A_c\, m(t))^2} = A_c^2 P$, where $P$ is the power of the message.
Output noise power (PSD of $n_c(t)$ is $N_o$ over $-W \le f \le W$):
$P_N = \int \mathrm{PSD}\, df = \int_{-W}^{W} N_o\, df = 2N_oW$
Output SNR:
$\mathrm{SNR_{DSB\text{-}SC}} = \frac{A_c^2 P}{2N_oW}$
Transmitted power:
$P_T = \overline{(A_c\, m(t)\cos\omega_c t)^2} = \frac{A_c^2 P}{2}$
Hence
$\mathrm{SNR_{DSB\text{-}SC}} = \frac{P_T}{N_oW} = \mathrm{SNR_{base}}$
DSB-SC has no performance advantage over baseband.

SNR of AM (synchronous detection)
Transmitted signal:
$s(t)_{AM} = [A_c + m(t)]\cos(\omega_c t)$
Predetection signal: transmitted signal plus bandlimited noise.
Receiver output:
$y(t) = A_c + m(t) + n_c(t)$
Output signal power: $P_S = \overline{m^2(t)} = P$
Output noise power: $P_N = 2N_oW$
Transmitted power:
$P_T = \frac{A_c^2}{2} + \frac{P}{2}$
Output SNR:
$\mathrm{SNR_{AM}} = \frac{P}{2N_oW} = \frac{P}{A_c^2 + P}\,\frac{P_T}{N_oW} = \frac{P}{A_c^2 + P}\,\mathrm{SNR_{base}}$
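A quick numeric check (my own example) of the AM penalty factor $P/(A_c^2 + P)$ for a single-tone message with 100% modulation:

```python
# Single-tone message m(t) = cos(w t): power P = 1/2, peak mp = 1.
P, mp, Ac = 0.5, 1.0, 1.0     # Ac = mp gives 100% modulation

penalty = P / (Ac**2 + P)
print(penalty)                # 1/3: SNR_AM = SNR_base / 3
```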
Envelope detection
Receiver output: y(t) = envelope of x(t).
For small noise:
$y(t) \approx A_c + m(t) + n_c(t)$
which is the output of the synchronous detector.
For large noise:
$y(t) \approx E_n(t) + [A_c + m(t)]\cos\theta_n(t)$
and the message cannot be recovered: the threshold effect.

Summary
Synchronous detector:
$\mathrm{SNR_{DSB\text{-}SC}} = \mathrm{SNR_{base}}$
$\mathrm{SNR_{AM}} = \frac{P}{A_c^2 + P}\,\mathrm{SNR_{base}}$
Envelope detector:
- threshold effect
- for small noise, performance is same as synchronous detector
Lecture 5

Frequency modulation
Topics: noise in FM systems; pre-emphasis and de-emphasis (see section 3.4).

FM waveform:
$s(t)_{FM} = A_c\cos\!\left(2\pi f_c t + 2\pi k_f\!\int m(\lambda)\, d\lambda\right) = A_c\cos\theta(t)$
Instantaneous frequency:
$f_i = f_c + k_f\, m(t)$
[Figure: FM waveforms for an example message, compared with AM.]

FM frequency deviation
Instantaneous frequency varies between $f_c - k_f m_p$ and $f_c + k_f m_p$, where $m_p$ is peak message amplitude, so the frequency deviation is $\Delta f = k_f m_p$.
Deviation ratio:
$\beta = \frac{\Delta f}{W}$
where W is bandwidth of message signal.

Bandwidth considerations
[Figure: FM receiver block diagram.]
Carson's rule:
$B_T \approx 2W(\beta + 1) = 2(\Delta f + W)$
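Carson's rule with illustrative broadcast-FM numbers (my example, not from the notes):

```python
# Peak deviation 75 kHz and message bandwidth 15 kHz.
delta_f = 75e3       # peak frequency deviation, Hz
W = 15e3             # message bandwidth, Hz

beta = delta_f / W               # deviation ratio
B_T = 2 * W * (beta + 1)         # = 2 * (delta_f + W)
print(beta, B_T)                 # 5.0, 180000.0 Hz
```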
Noise in FM versus AM
AM:
- Amplitude of modulated signal carries message
- Noise adds directly to modulated signal
- Performance no better than baseband
FM:
- Frequency of modulated signal carries message
- Zero crossings of modulated signal important
- Effect of noise should be less than in AM

Noise in FM
Predetection signal: transmitted signal plus bandlimited noise.
Assumptions:
1. Noise does not affect signal at output
2. Signal does not affect noise at output
Signal component:
$x_s(t) = A_c\cos(2\pi f_c t + \phi(t))$
Instantaneous frequency:
$f_i(t) = \frac{1}{2\pi}\frac{d\phi(t)}{dt} = k_f\, m(t)$
Noise component at the discriminator output:
$f_i(t) \approx \frac{1}{2\pi A_c}\frac{d\,n_s(t)}{dt}$
Fourier transform property:
$x(t) \leftrightarrow X(f) \quad\Rightarrow\quad \frac{dx(t)}{dt} \leftrightarrow j2\pi f\,X(f)$
So differentiation acts like a filter $H(f) = \frac{j2\pi f}{2\pi A_c} = \frac{jf}{A_c}$ applied to $N_s(f)$, and for any filter
$Y(f) = H(f)X(f),\qquad S_Y(f) = |H(f)|^2 S_X(f)$
Discriminator output:
$f_i(t) = \frac{1}{2\pi}\frac{d\theta(t)}{dt} = \frac{1}{2\pi}\frac{d}{dt}\tan^{-1}\!\left[\frac{n_s(t)}{A_c + n_c(t)}\right] \approx \frac{1}{2\pi}\frac{d}{dt}\frac{n_s(t)}{A_c}$
Output signal power:
$P_S = k_f^2 P$
Applying the PSD property to the noise:
$S_F(f) = \frac{|f|^2}{A_c^2}\,S_N(f)$
PSD of the noise at the discriminator output
PSD of $n_s(t)$ is $N_o$ over $-W \le f \le W$; the PSD of $\frac{1}{2\pi A_c}\frac{dn_s(t)}{dt}$ is therefore $\frac{|f|^2}{A_c^2}\,N_o$.
Output noise power:
$P_N = \int_{-W}^{W}\frac{|f|^2}{A_c^2}\,N_o\, df = \frac{2N_oW^3}{3A_c^2}$

SNR of FM
SNR at output:
$\mathrm{SNR_{FM}} = \frac{P_S}{P_N} = \frac{3A_c^2 k_f^2 P}{2N_oW^3}$
Transmitted power:
$P_T = \overline{\left(A_c\cos[\omega_c t + \phi(t)]\right)^2} = \frac{A_c^2}{2}$
With $\mathrm{SNR_{base}} = \frac{P_T}{N_oW} = \frac{A_c^2}{2N_oW}$:
$\mathrm{SNR_{FM}} = \frac{3k_f^2 P}{W^2}\,\mathrm{SNR_{base}} = \frac{3\beta^2 P}{m_p^2}\,\mathrm{SNR_{base}}$

Threshold effect in FM
The result above holds only above threshold, roughly when
$\frac{A_c^2}{2N_oB_T} = \frac{A_c^2}{4N_oW(\beta + 1)} > 10$
Pre-emphasis and de-emphasis
The improvement in output SNR afforded by using pre-emphasis and de-emphasis in FM is defined by:
$I = \frac{\text{SNR with pre-/de-emphasis}}{\text{SNR without pre-/de-emphasis}} = \frac{\text{average output noise power without pre-/de-emphasis}}{\text{average output noise power with pre-/de-emphasis}}$
Example
Parameters:
- Single-tone message $m(t) = \cos(2\pi f_m t)$
- Message bandwidth $W = f_m$
- AM system: modulation index $\mu = 1$ (i.e. $A_c = m_p$)
- FM system: $\beta = 5$

Bandwidth:
$B_{DSB\text{-}SC} = 2W,\qquad B_{AM} = 2W,\qquad B_{FM} = 12W$

Performance:
$\mathrm{SNR_{DSB\text{-}SC}} = \mathrm{SNR_{base}}$
$\mathrm{SNR_{AM}} = \frac{1}{3}\,\mathrm{SNR_{base}}$
$\mathrm{SNR_{FM}} = \frac{75}{2}\,\mathrm{SNR_{base}}$
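Reproducing the example numbers above (my own check):

```python
# Single-tone message m(t) = cos(2*pi*fm*t): P = 1/2, mp = 1.
P, mp, Ac = 0.5, 1.0, 1.0         # Ac = mp for 100% AM modulation
beta = 5.0

snr_am = P / (Ac**2 + P)          # fraction of SNR_base
snr_fm = 3 * beta**2 * P / mp**2  # fraction of SNR_base
b_fm = 2 * (beta + 1)             # Carson's rule, in units of W
print(snr_am, snr_fm, b_fm)       # 0.333..., 37.5 (= 75/2), 12.0
```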
Summary
Noise in FM:
- Increasing carrier power reduces noise at receiver output
- Has threshold effect
- Pre-emphasis and de-emphasis improve the output SNR
Lecture 6

Digital vs Analog
Analog message:
- Recreate waveform accurately
- Performance criterion is SNR at receiver output
Digital message:
- Decide which symbol was sent
- Performance criterion is probability of receiver making a decision error

Pulse-code modulation
Represent an analog waveform in digital form.
[Figure: PCM chain: sample, quantize, encode; transmission over a channel of B Hz.]
Sampling vs Quantization
Sampling:
- Non-destructive if fs > 2W
- Can reconstruct analog waveform exactly by using a low-pass filter
Quantization:
- Destructive
- Once signal has been rounded off it can never be reconstructed exactly
[Figure: sampled signal, quantized signal (step size of 0.1), and the quantization error.]
Quantization noise
Step size for $L = 2^n$ levels spanning $\pm m_p$:
$\Delta = \frac{2m_p}{L} = \frac{2m_p}{2^n}$
Assume q is zero mean with uniform pdf, so mean square error is:
$E\{q^2\} = \int q^2\, p(q)\, dq = \int_{-\Delta/2}^{\Delta/2}\frac{q^2}{\Delta}\, dq = \frac{\Delta^2}{12}$
Quantization SNR:
$\mathrm{SNR_Q} = \frac{P_S}{P_N} = \frac{3P\,2^{2n}}{m_p^2}$
or in dB:
$\mathrm{SNR_Q} = 6.02\,n + 10\log_{10}\!\left(\frac{3P}{m_p^2}\right)\ \mathrm{dB}$
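A sketch (my own check, assuming a mid-tread uniform quantizer) that quantizes a full-scale sine with n bits and compares the measured SNR against the formula:

```python
import numpy as np

n = 8
mp = 1.0
delta = 2 * mp / 2**n                       # step size

t = np.linspace(0, 1, 100_000, endpoint=False)
x = mp * np.sin(2 * np.pi * 7 * t)          # signal power P = mp^2 / 2
xq = delta * np.round(x / delta)            # uniform mid-tread quantizer
q = x - xq                                  # quantization error

snr_meas = 10 * np.log10(np.mean(x**2) / np.mean(q**2))
snr_theory = 6.02 * n + 10 * np.log10(3 * (mp**2 / 2) / mp**2)
print(snr_meas, snr_theory)                  # both close to 49.9 dB
```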
Bandwidth of PCM
$B_T = nW$
so the quantization SNR grows exponentially with bandwidth:
$\mathrm{SNR_Q} = \frac{3P\,2^{2B_T/W}}{m_p^2}$

Nonuniform quantization
- For audio signals (e.g. speech), small signal amplitudes occur more often than large signal amplitudes
- Better to have closely spaced quantization levels at low signal amplitudes, widely spaced levels at large signal amplitudes
- Quantizer has better resolution at low amplitudes (where signal spends more time)

Companding
Nonuniform quantization is implemented by compressing the signal, quantizing uniformly, then expanding at the receiver.
Summary
Pulse-code modulation: sample, quantize, encode; quantization SNR improves by about 6 dB per bit.
Lecture 7

Digital receiver
[Figure: digital receiver block diagram with transmitted signal s(t).]

Normal distribution
$p(n) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\!\left(-\frac{(n-m)^2}{2\sigma^2}\right) = \mathcal{N}(m, \sigma^2)$
with mean $m$ and variance $\sigma^2$.
Binary transmission in white noise
Transmitted signal $s_0(t)$ (symbol 0), noise signal $n(t)$, received signal $y_0(t) = s_0(t) + n(t)$; error if $y_0(t) > A/2$.
Transmitted signal $s_1(t)$ (symbol 1), received signal $y_1(t) = s_1(t) + n(t)$; error if $y_1(t) < A/2$.
For a bandpass system, the predetection noise power is
$P = \int_{-f_c-W}^{-f_c+W}\frac{N_o}{2}\, df + \int_{f_c-W}^{f_c+W}\frac{N_o}{2}\, df = 2N_oW$
and for a baseband system
$P = \int_{-W}^{W}\frac{N_o}{2}\, df = N_oW$
(this noise power is the variance $\sigma^2$ in what follows).
Possible errors:
1. Symbol 0 transmitted, receiver decides 1:
$P_{e0} = \int_{A/2}^{\infty}\mathcal{N}(0, \sigma^2)\, dn$
2. Symbol 1 transmitted, receiver decides 0:
$P_{e1} = \int_{-\infty}^{A/2}\mathcal{N}(A, \sigma^2)\, dn$
Total probability of error:
$P_e = p_0 P_{e0} + p_1 P_{e1}$
where $p_0$ is the probability of 0 being sent and $P_{e0}$ the probability of making an error if 0 was sent (similarly for $p_1$, $P_{e1}$).
For equally likely symbols, $P_{e0} = P_{e1}$, so
$P_e = \int_{A/2}^{\infty}\mathcal{N}(0, \sigma^2)\, dn = \frac{1}{\sqrt{2\pi\sigma^2}}\int_{A/2}^{\infty}\exp\!\left(-\frac{n^2}{2\sigma^2}\right) dn$
Q function:
$Q(u) = \frac{1}{\sqrt{2\pi}}\int_{u}^{\infty}\exp\!\left(-\frac{n^2}{2}\right) dn$
Complementary error function:
$\mathrm{erfc}(u) = \frac{2}{\sqrt{\pi}}\int_{u}^{\infty}\exp(-n^2)\, dn$
Hence
$P_e = Q\!\left(\frac{A}{2\sigma}\right) = \frac{1}{2}\,\mathrm{erfc}\!\left(\frac{A}{2\sqrt{2}\,\sigma}\right)$
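A numerical check of the $P_e$ expression (illustrative values of my own):

```python
import numpy as np
from scipy.special import erfc

# Pe = 0.5 * erfc(A / (2*sqrt(2)*sigma)), checked by Monte Carlo.
A, sigma = 2.0, 0.5

pe_erfc = 0.5 * erfc(A / (2 * np.sqrt(2) * sigma))

rng = np.random.default_rng(3)
n = rng.normal(0, sigma, 1_000_000)
pe_mc = np.mean(n > A / 2)       # probability noise exceeds the threshold
print(pe_erfc, pe_mc)            # both ~0.0228
```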
Amplitude-shift keying
Synchronous detector.
$s_0(t) = 0$
$s_1(t) = A\cos(\omega_c t)$

ASK 0 transmitted
Predetection signal (ASK 0 plus bandpass noise):
$x_0(t) = 0 + n_c(t)\cos(\omega_c t) - n_s(t)\sin(\omega_c t)$
After multiplier:
$r_0(t) = x_0(t)\,2\cos(\omega_c t) = n_c(t)\,2\cos^2(\omega_c t) - n_s(t)\,2\sin(\omega_c t)\cos(\omega_c t) = n_c(t)[1 + \cos(2\omega_c t)] - n_s(t)\sin(2\omega_c t)$
Receiver output (after lowpass filtering):
$y_0(t) = n_c(t)$

ASK 1 transmitted
Predetection signal (ASK 1 plus bandpass noise):
$x_1(t) = [A + n_c(t)]\cos(\omega_c t) - n_s(t)\sin(\omega_c t)$
After multiplier:
$r_1(t) = x_1(t)\,2\cos(\omega_c t) = [A + n_c(t)][1 + \cos(2\omega_c t)] - n_s(t)\sin(2\omega_c t)$
Receiver output:
$y_1(t) = A + n_c(t)$
ASK 0 transmitted: PDF of $y_0(t) = n_c(t)$ is $\mathcal{N}(0, \sigma^2)$.
ASK 1 transmitted: PDF of $y_1(t) = A + n_c(t)$ is $\mathcal{N}(A, \sigma^2)$.
$P_{e,\mathrm{ASK}} = \frac{1}{2}\,\mathrm{erfc}\!\left(\frac{A}{2\sqrt{2}\,\sigma}\right)$
Same as baseband!

Phase-shift keying
$s_0(t) = -A\cos(\omega_c t)$
$s_1(t) = A\cos(\omega_c t)$
[Figure: example PSK and FSK waveforms.]
PSK demodulator

PSK 0 transmitted
Predetection signal (PSK 0 plus bandpass noise):
$x_0(t) = -A\cos(\omega_c t) + n_c(t)\cos(\omega_c t) - n_s(t)\sin(\omega_c t)$
After multiplier:
$r_0(t) = x_0(t)\,2\cos(\omega_c t) = [-A + n_c(t)][1 + \cos(2\omega_c t)] - n_s(t)\sin(2\omega_c t)$
Receiver output:
$y_0(t) = -A + n_c(t)$

PSK 1 transmitted
Predetection signal (PSK 1 plus bandpass noise):
$x_1(t) = A\cos(\omega_c t) + n_c(t)\cos(\omega_c t) - n_s(t)\sin(\omega_c t)$
After multiplier:
$r_1(t) = x_1(t)\,2\cos(\omega_c t) = [A + n_c(t)][1 + \cos(2\omega_c t)] - n_s(t)\sin(2\omega_c t)$
Receiver output:
$y_1(t) = A + n_c(t)$

Set threshold at 0:
PSK 0 transmitted: PDF of $y_0(t) = -A + n_c(t)$ is $\mathcal{N}(-A, \sigma^2)$.
PSK 1 transmitted: PDF of $y_1(t) = A + n_c(t)$ is $\mathcal{N}(A, \sigma^2)$.
PSK error probability (threshold at 0):
$P_{e0} = \int_0^{\infty}\mathcal{N}(-A, \sigma^2)\, dn = \frac{1}{\sqrt{2\pi\sigma^2}}\int_0^{\infty}\exp\!\left(-\frac{(n+A)^2}{2\sigma^2}\right) dn$
$P_{e1} = \int_{-\infty}^{0}\mathcal{N}(A, \sigma^2)\, dn = \frac{1}{\sqrt{2\pi\sigma^2}}\int_{-\infty}^{0}\exp\!\left(-\frac{(n-A)^2}{2\sigma^2}\right) dn$
$P_{e,\mathrm{PSK}} = \frac{1}{2}\,\mathrm{erfc}\!\left(\frac{A}{\sqrt{2}\,\sigma}\right)$

Frequency-shift keying
$s_0(t) = A\cos(\omega_0 t)$
$s_1(t) = A\cos(\omega_1 t)$
FSK detector
Two synchronous detectors, one tuned to each frequency, with their outputs subtracted.
Receiver output:
$y_0(t) = -A + n_{c1}(t) - n_{c0}(t)$
$y_1(t) = A + n_{c1}(t) - n_{c0}(t)$
The difference of two independent noise terms doubles the noise power, so
$P_{e,\mathrm{FSK}} = \frac{1}{2}\,\mathrm{erfc}\!\left(\frac{A}{2\sigma}\right)$
Summary
$P_e = \int_{A/2}^{\infty}\mathcal{N}(0, \sigma^2)\, dn = \frac{1}{2}\,\mathrm{erfc}\!\left(\frac{A}{2\sqrt{2}\,\sigma}\right)$
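Comparing the three error probabilities at the same A and sigma (my own illustration of the formulas above):

```python
import numpy as np
from scipy.special import erfc

A, sigma = 2.0, 0.5

pe_ask = 0.5 * erfc(A / (2 * np.sqrt(2) * sigma))   # same as baseband
pe_psk = 0.5 * erfc(A / (np.sqrt(2) * sigma))
pe_fsk = 0.5 * erfc(A / (2 * sigma))
print(pe_ask, pe_psk, pe_fsk)    # PSK lowest, FSK in between, ASK highest
```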
Lecture 8

Information theory
- Why?
- Information
- Entropy
- Source coding (a little)

Information
Definition: the information in symbol s, with probability of occurrence p, is
$I(s) = \log_2\frac{1}{p} = -\log_2(p)$ bits
(the bit is the conventional unit of information)

Properties of I(s):
1. If p = 1, I(s) = 0 (a symbol that is certain to occur conveys no information)
2. For 0 < p < 1, 0 < I(s) < ∞
3. If p = p1 p2, I(s) = I(s1) + I(s2)

Symbols:
- may be binary (0 and 1)
- can have more than 2 symbols, e.g. letters of the alphabet, etc.
- sequence of symbols is random (otherwise no information is conveyed)
Definition: if successive symbols are statistically independent, the information source is a zero-memory source (or discrete memoryless source).
Entropy
Definition: if $S = \{s_1, s_2, \ldots, s_K\}$ is the alphabet, with probabilities $p_k$ where $\sum_{\text{all }k} p_k = 1$, the entropy (average information per symbol) is
$H(S) = -\sum_{\text{all }k} p_k\log_2(p_k)$ bits/symbol

Binary example: $S = \{s_0, s_1\}$ with $p_0 = 1 - p_1$:
$H(S) = -(1 - p_1)\log_2(1 - p_1) - p_1\log_2(p_1)$

Example
Alphabet: $S = \{A, B, C\}$
Probabilities: $p_A = 0.7$, $p_B = 0.2$, $p_C = 0.1$
Entropy:
$H(S) = -\sum_{\text{all }k} p_k\log_2(p_k) = 1.157$ bits/symbol
With fixed-length 2-bit code words: symbols generated at a rate of 1 symbol/sec give a bitstream at a rate of 2 bits/sec, so the system needs to process 2 bits/sec.
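Reproducing the entropy figure from the example (a one-liner):

```python
import numpy as np

p = np.array([0.7, 0.2, 0.1])      # p_A, p_B, p_C
H = -np.sum(p * np.log2(p))
print(round(H, 3))                  # 1.157 bits/symbol
```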
Source coding
Examples:
- Telephone: speech waveform, 8000 symbols/sec, 64000 bits/sec; system needs to process 64 kb/s
- Cell phone: speech waveform, 8000 symbols/sec, 13000 bits/sec; system needs to process 13 kb/s
Source coding
Example: for $p_A = 0.7$, $p_B = 0.2$, $p_C = 0.1$, let A = 0, B = 10, C = 11.
A typical sequence of 6 symbols then encodes to about 8 bits.
Definition: the average codeword length is
$\bar{L} = \sum_{\text{all }k} p_k\, l_k$
where $p_k$ is the probability of occurrence of symbol $s_k$ and $l_k$ is the length of its code word.
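Computing the average codeword length for this code:

```python
import numpy as np

# Code: A = 0, B = 10, C = 11.
p = np.array([0.7, 0.2, 0.1])      # p_A, p_B, p_C
l = np.array([1, 2, 2])            # codeword lengths in bits

L_bar = np.sum(p * l)
print(L_bar)                       # 1.3 bits/symbol, vs H(S) = 1.157
```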
Summary
Information: $I(s) = \log_2\frac{1}{p} = -\log_2(p)$ bits
Entropy: $H(S) = -\sum_{\text{all }k} p_k\log_2(p_k)$ bits/symbol
Source coding: how many bits do we need to represent each symbol?
Lecture 9

Source coding
Recap: probabilities $p_A = 0.7$, $p_B = 0.2$, $p_C = 0.1$ with code words A = 0, B = 10, C = 11 encode 6 symbols in about 8 bits.

Example:
Alphabet: $S = \{A, B\}$
Probabilities: $p_A = 0.5$, $p_B = 0.5$
Code words: A = 0, B = 1
For $n$ equally likely symbols ($p = 1/n$), the codeword length is
$l = \log_2\frac{1}{p} = \log_2(n)$
Unequal probabilities?
Alphabet: $S = \{s_1, s_2, \ldots, s_K\}$
Probabilities: $p_1, p_2, \ldots, p_K$
Consider a long typical sequence of N symbols, $S_N = \{s_1, s_2, s_1, s_3, s_3, s_2, s_1, \ldots\}$. Its probability is
$p(S_N) = p_1 p_2 p_1 p_3 p_3 p_2 p_1\cdots = p_1^{Np_1} p_2^{Np_2}\cdots$
So the codeword length for the whole sequence is
$l_N = \log_2\frac{1}{p(S_N)} = -\log_2\!\left(p_1^{Np_1} p_2^{Np_2}\cdots\right) = -Np_1\log_2(p_1) - Np_2\log_2(p_2) - \cdots = -N\sum_{\text{all }k} p_k\log_2(p_k) = N\,H(S)$
and the average codeword length per symbol is
$\bar{L} = \frac{l_N}{N} = H(S)$

Significance:
For any practical source coding scheme, the average codeword length will always be greater than or equal to the source entropy, i.e.,
$\bar{L} \ge H(S)$ bits/symbol
Desirable properties of a code:
1. Uniquely decodable, i.e., only one way to break bit stream into valid code words
2. Instantaneous, i.e., know immediately when a code word has ended

Summary
Source coding can approach, but not beat, the entropy bound $\bar{L} \ge H(S)$.

Example
Symbols: A, B, C, D, E
Probabilities: $p_A = 0.05$, $p_B = 0.15$, $p_C = 0.4$, $p_D = 0.3$, $p_E = 0.1$
Task: construct a uniquely decodable, instantaneous code for this source (a sketch follows).
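The notes do not show the construction here, so the following is a sketch of one standard approach (Huffman coding) that produces a uniquely decodable, instantaneous prefix code for this alphabet:

```python
import heapq
from itertools import count

# Huffman coding sketch (not from the notes) for the example source above.
probs = {"A": 0.05, "B": 0.15, "C": 0.4, "D": 0.3, "E": 0.1}

ties = count()  # tie-breaker so the heap never compares dicts
heap = [(p, next(ties), {s: ""}) for s, p in probs.items()]
heapq.heapify(heap)
while len(heap) > 1:
    p0, _, c0 = heapq.heappop(heap)   # two least-probable subtrees
    p1, _, c1 = heapq.heappop(heap)
    merged = {s: "0" + w for s, w in c0.items()}   # extend one side with 0
    merged.update({s: "1" + w for s, w in c1.items()})  # other side with 1
    heapq.heappush(heap, (p0 + p1, next(ties), merged))

code = heap[0][2]
L_bar = sum(probs[s] * len(w) for s, w in code.items())
print(code)   # one valid tree: C gets 1 bit, D 2 bits, B 3 bits, A and E 4 bits
print(L_bar)  # 2.05 bits/symbol, >= H(S) which is about 2.01
```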
Lecture 10

Information rate
Definition:
$R = rH$ bits/sec
(average number of information bits transferred per second, where r is the symbol rate and H the entropy)

Channel capacity
Definition: channel capacity, C, is the maximum rate of information transfer over a noisy channel with arbitrarily small probability of error (bits/sec).
Intuition:
- R can be increased arbitrarily by increasing symbol rate r
- For a noisy channel, errors are bound to occur
- Is there a value of R where probability of error is arbitrarily small?

Hartley-Shannon Theorem
For an additive white Gaussian noise channel, the channel capacity is:
$C = B\log_2\!\left(1 + \frac{P_S}{P_N}\right)$
where B is the bandwidth of the channel and
$P_N = \int_{-B}^{B}\frac{N_o}{2}\, df = N_oB$
so the channel capacity is
$C = B\log_2\!\left(1 + \frac{P_S}{N_oB}\right)$
Example
$C = B\log_2\!\left(1 + \frac{P_S}{P_N}\right)$ with $B = 4000$ Hz, $N_o = 10^{-6}$ W/Hz, $P_S = 10$ W.

Comments
More bandwidth allows more symbols per second, but also increases the noise. As $B \to \infty$,
$\lim_{B\to\infty} C = 1.44\,\frac{P_S}{N_o}$
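Evaluating the capacity with the example numbers above, together with the wideband limit:

```python
import numpy as np

B = 4000.0      # channel bandwidth, Hz
No = 1e-6       # noise PSD parameter, W/Hz
Ps = 10.0       # signal power, W

C = B * np.log2(1 + Ps / (No * B))
C_inf = 1.44 * Ps / No           # limit as B -> infinity (log2(e) ~ 1.44)
print(C, C_inf)                  # ~45 kb/s vs 14.4 Mb/s
```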
Transmitted power
$\mathrm{SNR_{in}} = \frac{P_T}{N_oB} = \frac{W}{B}\,\frac{P_T}{N_oW} = \frac{W}{B}\,\mathrm{SNR_{base}}$
Equating the information rates at the channel and at the ideal output gives:
$\mathrm{SNR_{out}} = (1 + \mathrm{SNR_{in}})^{B/W} - 1 = \left(1 + \frac{W}{B}\,\mathrm{SNR_{base}}\right)^{B/W} - 1$

Summary
[Figure: ideal performance vs. actual analog performance, SNR_out plotted against SNR_base.]