Information - Lecture 1: Definitions

Lecturer: Dr. Darren Ward, Room 811C. Email: [email protected], Phone: (759) 46230

Aims: The aim of this part of the course is to give you an understanding of how communication systems perform in the presence of noise.

Objectives: By the end of the course you should be able to:
- Compare the performance of various communication systems
- Describe a suitable model for noise in communications
- Determine the SNR performance of analog communication systems
- Determine the probability of error for digital systems
- Understand information theory and its significance in determining system performance

Reference books:
- B.P. Lathi, Modern Digital and Analog Communication Systems, Oxford University Press, 1998
- S. Haykin, Communication Systems, Wiley, 2001
- L.W. Couch II, Digital and Analog Communication Systems, Prentice-Hall, 2001

Course material: http://www.ee.ic.ac.uk/dward/

Lecture 1
1. What is the course about, and how does it fit together
2. Some definitions (signals, power, bandwidth, phasors)

Definitions

- Signal: a single-valued function of time that conveys information
- Deterministic signal: completely specified as a function of time
- Random signal: cannot be completely specified as a function of time

See Chapter 1 of notes








Definitions
- Analog signal: continuous function of time with continuous amplitude
- Discrete-time signal: only defined at discrete points in time, amplitude continuous
- Digital signal: discrete in both time and amplitude (e.g., PCM signals, see Chapter 4)

Definitions
- Instantaneous power: p = v²(t)/R = i²(t)R = g²(t) (normalised to R = 1)
- Average power: P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} |g(t)|² dt
- For periodic signals, with period To (see Problem sheet 1): P = (1/To) ∫_{−To/2}^{To/2} |g(t)|² dt

Bandwidth
- Bandwidth: extent of the significant spectral content of a signal for positive frequencies
- Baseband signal: bandwidth B; bandpass signal: bandwidth 2B
[Figure: magnitude-squared spectra illustrating the 3 dB bandwidth, null-to-null bandwidth and noise-equivalent bandwidth]

Phasors
- General sinusoid: x(t) = A cos(2πft + θ)
- Phasor representation: x(t) = ℜ{ A e^{jθ} e^{j2πft} }
- Alternative representation:
  x(t) = (A/2) e^{jθ} e^{j2πft} + (A/2) e^{−jθ} e^{−j2πft}
  1. Anti-clockwise rotation (positive frequency): exp(j2πft)
  2. Clockwise rotation (negative frequency): exp(−j2πft)
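As a quick numerical check of the two representations above, the following sketch (Python/NumPy; the amplitude, frequency and phase values are arbitrary choices) evaluates the sinusoid directly, via the rotating phasor, and via the pair of counter-rotating phasors, and also confirms that the average power of the sinusoid is A²/2:

```python
import numpy as np

A, f, theta = 2.0, 5.0, 0.7            # arbitrary amplitude, frequency (Hz) and phase (rad)
t = np.linspace(0, 1, 1000, endpoint=False)

x_direct = A * np.cos(2 * np.pi * f * t + theta)

# Real part of a single rotating phasor
x_phasor = np.real(A * np.exp(1j * theta) * np.exp(1j * 2 * np.pi * f * t))

# Sum of counter-rotating phasors (positive- and negative-frequency components)
x_pair = (A / 2) * np.exp(1j * theta) * np.exp(1j * 2 * np.pi * f * t) \
       + (A / 2) * np.exp(-1j * theta) * np.exp(-1j * 2 * np.pi * f * t)

print(np.allclose(x_direct, x_phasor))        # True
print(np.allclose(x_direct, x_pair.real))     # True (the imaginary part is ~0)
print(np.mean(x_direct**2), A**2 / 2)         # time-average power of a sinusoid is A^2/2
```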

Summary
- The fundamental question: How do communication systems perform in the presence of noise?
- Some definitions:
  - Signals
  - Average power: P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} |x(t)|² dt
  - Bandwidth: significant spectral content for positive frequencies
  - Phasors: complex conjugate representation (negative frequency),
    x(t) = (A/2) e^{jθ} e^{j2πft} + (A/2) e^{−jθ} e^{−j2πft}

Lecture 2
1. Model for noise
2. Autocorrelation and power spectral density
See Chapter 2 of notes, sections 2.1, 2.2, 2.3

Sources of noise
- External noise: synthetic (e.g. other users), atmospheric (e.g. lightning), galactic (e.g. cosmic radiation)
- Internal noise: shot noise, thermal noise

Example

Consider an amplifier with 20 dB power gain and a bandwidth of B = 20 MHz. Assume the average thermal noise at its output is Po = 2.2 × 10⁻¹¹ W.
1. What is the amplifier's effective noise temperature?
2. What is the noise output if two of these amplifiers are cascaded?
3. How many stages can be used if the noise output must be less than 20 mW?

Average power of thermal noise: P = kTB
Effective noise temperature: Te = P/(kB)
(the temperature of a fictitious thermal noise source at the input that would be required to produce the same noise power at the output)
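A sketch of the working for this example in Python. It assumes Boltzmann's constant k ≈ 1.38 × 10⁻²³ J/K, treats the 20 dB gain as a factor of 100, refers the output noise to the input before applying Te = P/(kB), and uses a simple cascade model in which each stage amplifies the noise of the previous stages and adds its own output noise; the 20 mW limit is taken from part 3 as reconstructed above, so treat the numbers as illustrative rather than definitive.

```python
import numpy as np

k = 1.38e-23          # Boltzmann's constant, J/K
B = 20e6              # bandwidth, Hz
G = 10**(20 / 10)     # 20 dB power gain -> factor of 100
Po = 2.2e-11          # average thermal noise power at the amplifier output, W

# 1. Effective noise temperature: refer the output noise to the input, then Te = P/(kB)
P_in = Po / G
Te = P_in / (k * B)
print(f"Effective noise temperature: {Te:.0f} K")      # roughly 800 K

# 2. Two identical stages in cascade: the first stage's output noise is amplified
#    by the second stage, which then adds its own output noise
Po_2 = Po * G + Po
print(f"Noise output of two cascaded stages: {Po_2:.3e} W")

# 3. Number of stages for which the output noise stays below 20 mW (same cascade model)
limit = 20e-3
Po_n, n = Po, 1
while Po_n * G + Po < limit:
    Po_n = Po_n * G + Po
    n += 1
print(f"Maximum number of stages: {n}")
```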

Noise model
- The model for the effect of noise is the additive Gaussian noise channel
- For n(t) a random signal, we need more information: What happens to the noise at the receiver? Statistical tools

Gaussian noise
- Gaussian noise: the amplitude of the noise signal has a Gaussian probability density function (p.d.f.)
- Central limit theorem: the sum of n independent random variables approaches a Gaussian distribution as n → ∞

Motivation
[Figure: binary data waveform, the same data with noise added, and the noise alone, showing the regions "Error for 0" and "Error for 1"]
- How often is a decision error made?

Random variable
- A random variable x is a rule that assigns a real number xi to the ith sample point in the sample space

Probability density function
- Probability that the r.v. x is within a certain range: P(x1 < x < x2) = ∫_{x1}^{x2} px(x) dx
- Example, Gaussian pdf: px(x) = (1/(σ√(2π))) exp(−(x − m)²/(2σ²))

Statistical averages
- Expectation of a r.v.: E{x} = ∫_{−∞}^{∞} x px(x) dx, where E{·} is the expectation operator
- In general, if y = g(x), then E{y} = E{g(x)} = ∫_{−∞}^{∞} g(x) px(x) dx
- For example, the mean square amplitude of a signal is the mean of the square of the amplitude, i.e., E{x²}
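These expectation integrals can be checked by simulation. A minimal sketch (the mean, standard deviation and sample size are arbitrary choices) compares sample averages of x and x² against the closed-form Gaussian results E{x} = m and E{x²} = m² + σ²:

```python
import numpy as np

rng = np.random.default_rng(0)
m, sigma = 1.5, 2.0                       # arbitrary Gaussian mean and standard deviation
x = rng.normal(m, sigma, size=1_000_000)  # samples of the random variable x

# Sample estimates of E{x} and of the mean square amplitude E{x^2}
print(x.mean(), m)                        # approximately m
print((x**2).mean(), m**2 + sigma**2)     # approximately m^2 + sigma^2
```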

Random process
[Figure: ensemble of noise waveforms; the values at a given time instant form a random variable with a pdf]

Averages
- Time average: ⟨n(t)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} n(t) dt
- Ensemble average: E{n} = ∫_{−∞}^{∞} n p(n) dn, where p(n) is the pdf
- Stationary: the statistics do not change with time
- Ergodic: time averages equal ensemble averages
- DC component: E{n(t)} = ⟨n(t)⟩
- Average power: E{n²(t)} = ⟨n²(t)⟩ = σ² (zero-mean Gaussian process only)

Autocorrelation
- How can one represent the spectrum of a random process?
- Autocorrelation: Rx(τ) = E{x(t) x(t + τ)}
- NOTE: the average power is P = E{x²(t)} = Rx(0)

Example
Consider the signal x(t) = A e^{j(ωct + θ)}, where θ is a random variable, uniformly distributed over 0 to 2π.
1. Calculate its average power using time averaging.
2. Calculate its average power (mean-square) using statistical averaging.

Frequency content
[Figure: original speech filtered by H(f) to give filtered speech with smaller bandwidth; spectrograms of the original, LPF 4 kHz and LPF 1 kHz signals]

Power spectral density
- PSD measures the distribution of power with frequency; units watts/Hz
- Wiener-Khinchine theorem: Sx(f) = ∫_{−∞}^{∞} Rx(τ) e^{−j2πfτ} dτ = FT{Rx(τ)}
- Hence, Rx(τ) = ∫_{−∞}^{∞} Sx(f) e^{j2πfτ} df
- Average power: P = Rx(0) = ∫_{−∞}^{∞} Sx(f) df
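A small numerical illustration of these relations (a sketch, not part of the notes): white Gaussian noise with an assumed two-sided PSD of No/2 is generated at a sampling rate fs, and its time-average power is compared with the autocorrelation at lag zero and with the integral of an estimated PSD.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fs = 10_000.0                  # sampling rate in Hz (arbitrary choice)
No = 2e-4                      # assumed noise level, so the two-sided PSD is No/2 = 1e-4 W/Hz
sigma2 = (No / 2) * fs         # sample variance that gives a flat PSD of height No/2 up to fs/2
x = rng.normal(0.0, np.sqrt(sigma2), size=200_000)

P_time = np.mean(x**2)                                 # time-average power
Rx0 = np.correlate(x, x, mode='valid')[0] / len(x)     # autocorrelation estimate at lag 0
f, Sx = signal.welch(x, fs=fs, nperseg=4096)           # estimated (one-sided) PSD
P_psd = np.trapz(Sx, f)                                # integral of the PSD over frequency

print(P_time, Rx0, P_psd, sigma2)   # all approximately equal
```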

Summary
- Additive Gaussian noise channel
- Thermal noise: P = kTB = ∫_{−B}^{B} (kT/2) df
- White noise: S(f) = No/2 (the PSD is the same for all frequencies)
- Expectation operator: E{g(x)} = ∫_{−∞}^{∞} g(x) px(x) dx
- Autocorrelation: Rx(τ) = E{x(t) x(t + τ)}
- Power spectral density: Sx(f) = ∫_{−∞}^{∞} Rx(τ) e^{−j2πfτ} dτ

Lecture 3
1. Representation of band-limited noise; why band-limited noise? (See Chapter 2, section 2.4)
2. Noise in an analog baseband system (See Chapter 3, sections 3.1, 3.2)

Analog communication system
[Figure: transmitter, channel and receiver, showing baseband and carrier (bandpass) stages]

Receiver
- Predetection filter: removes out-of-band noise; has a bandwidth matched to the transmission bandwidth
[Figure: predetection filter followed by detector; bandpass input, baseband output]

Bandlimited noise
- For any bandpass (i.e., modulated) system, the predetection noise will be bandlimited
- A bandpass noise signal can be expressed in terms of two baseband waveforms:
  n(t) = nc(t) cos(ωct) − ns(t) sin(ωct)
- The PSD of n(t) is centred about fc (and −fc); the PSDs of nc(t) and ns(t) are centred about 0 Hz

PSD of n(t)
[Figure: PSD of n(t), occupying a band of width 2W centred on fc]
- In the slice shown (for ∆f small): nk(t) = ak cos(2π fk t + θk)
- Write nk(t) = ak cos(ωk t + θk) and let ωk = (ωk − ωc) + ωc, so nk(t) = ak cos[(ωk − ωc)t + θk + ωct]
- Using cos(A + B) = cos A cos B − sin A sin B:
  nk(t) = ak cos[(ωk − ωc)t + θk] cos(ωct) − ak sin[(ωk − ωc)t + θk] sin(ωct)
  (the first term contributes to nc(t), the second to ns(t))

Example – frequency and time (W = 1000 Hz)
[Figure: spectra and time waveforms of n(t), nc(t) and ns(t)]

Example – histogram (W = 1000 Hz)
[Figure: histograms of n(t), nc(t) and ns(t)]

Probability density functions
- n(t) = Σk ak cos(ωk t + θk)
- nc(t) = Σk ak cos[(ωk − ωc)t + θk]
- ns(t) = Σk ak sin[(ωk − ωc)t + θk]
- Each waveform is Gaussian distributed (central limit theorem)
- The mean of each waveform is 0

Average power
- What is the average power in n(t)?
- n(t) = Σk ak cos(ωk t + θk); the power in ak cos(ωt + θ) is E{ak²}/2 (see Example 1.1, or study group sheet 2, Q1)
- Average power in n(t): Pn = Σk E{ak²}/2
- nc(t) = Σk ak cos[(ωk − ωc)t + θk] and ns(t) = Σk ak sin[(ωk − ωc)t + θk]
- Average power in nc(t) and ns(t): Pnc = Σk E{ak²}/2, Pns = Σk E{ak²}/2
- n(t), nc(t) and ns(t) all have the same average power!

Average power
- Find using the power spectral density (PSD). From Lecture 2: P = ∫_{−∞}^{∞} S(f) df
- The PSD of n(t) consists of two bands of width 2W and height No/2 (one for positive freqs, one for negative):
  P = 2 ∫_{fc−W}^{fc+W} (No/2) df = 2 (No/2)(W − (−W)) = 2NoW
- The average power in n(t), nc(t) and ns(t) is 2NoW
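These results can be reproduced approximately by simulation. The sketch below (all parameter values are arbitrary) builds bandlimited noise n(t) by band-pass filtering white noise around an assumed carrier fc, extracts nc(t) and ns(t) by mixing with 2cos(ωct) and −2sin(ωct) followed by low-pass filtering, and then compares the three average powers and the reconstruction n(t) = nc(t)cos(ωct) − ns(t)sin(ωct):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
fs, fc, W = 100_000.0, 20_000.0, 1_000.0    # sample rate, carrier and bandwidth W (Hz), all arbitrary
t = np.arange(400_000) / fs

# Bandlimited (bandpass) noise n(t): white noise filtered to the band fc +/- W
white = rng.normal(size=t.size)
b_bp = signal.firwin(801, [fc - W, fc + W], pass_zero=False, fs=fs)
n = signal.filtfilt(b_bp, 1, white)

# Quadrature components: mix down and low-pass filter to W (zero-phase filtering avoids delays)
b_lp = signal.firwin(801, W, fs=fs)
nc = signal.filtfilt(b_lp, 1, 2 * n * np.cos(2 * np.pi * fc * t))
ns = signal.filtfilt(b_lp, 1, -2 * n * np.sin(2 * np.pi * fc * t))

# n(t), nc(t) and ns(t) should have approximately the same average power
print(np.mean(n**2), np.mean(nc**2), np.mean(ns**2))

# and nc(t)cos(wc t) - ns(t)sin(wc t) should reconstruct n(t), up to filter edge effects
n_rec = nc * np.cos(2 * np.pi * fc * t) - ns * np.sin(2 * np.pi * fc * t)
print(np.mean((n - n_rec)**2))   # should be small compared with the power of n(t)
```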

Power spectral densities
[Figure: PSD of n(t), of width 2W about ±fc; PSD of nc(t) and ns(t), extending from −W to W]

Example – power spectral densities (W = 1000 Hz)
[Figure: estimated PSDs of n(t), nc(t) and ns(t), with a zoomed view]

Phasor representation
- n(t) = nc(t) cos(ωct) − ns(t) sin(ωct)
- Let g(t) = nc(t) + j ns(t). Then
  g(t) e^{jωct} = nc(t) cos ωct + j nc(t) sin ωct + j ns(t) cos ωct − ns(t) sin ωct
- so n(t) = ℜ{ g(t) e^{jωct} }



Analog communication system
- Performance measure: SNRo = (average message power at receiver output) / (average noise power at receiver output)
- Compare systems with the same transmitted power

Baseband SNR
[Figure: baseband system]
- Transmitted (message) power is PT
- Noise power is: PN = ∫_{−W}^{W} (No/2) df = NoW
- SNR at receiver output: SNRbase = PT/(NoW)

Summary
- Baseband SNR: SNRbase = PT/(NoW)
- Bandpass noise representation: n(t) = nc(t) cos(ωct) − ns(t) sin(ωct)
- All waveforms have the same probability density function (zero-mean Gaussian) and the same average power
- nc(t) and ns(t) have the same power spectral density

Lecture 4
Noise (AWGN) in AM systems:
1. DSB-SC
2. AM, synchronous detection
3. AM, envelope detection
(See Chapter 3, section 3.3)

Analog communication system
- Performance measure: SNRo = (average message power at receiver output) / (average noise power at receiver output)
- Compare systems with the same transmitted power

Amplitude modulation
- Modulated signal: s(t)AM = [Ac + m(t)] cos ωct
- m(t) is the message signal
- Modulation index: µ = mp/Ac, where mp is the peak amplitude of the message
[Figure: message and AM waveforms]
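A small illustration of these definitions (a sketch; the carrier amplitude, frequencies and peak message amplitude are arbitrary choices), generating a single-tone AM signal and the corresponding DSB-SC signal, and checking the transmitted AM power against Ac²/2 + P/2 (derived later in this lecture):

```python
import numpy as np

fs, fc, fm = 100_000, 10_000, 500        # sample rate, carrier and message frequencies (Hz), arbitrary
t = np.arange(0, 0.01, 1 / fs)

Ac, mp = 1.0, 0.8                         # carrier amplitude and peak message amplitude
m = mp * np.cos(2 * np.pi * fm * t)       # message m(t)
mu = mp / Ac                              # modulation index (0.8 here; mu > 1 means overmodulation)

s_am = (Ac + m) * np.cos(2 * np.pi * fc * t)     # AM:     s(t) = [Ac + m(t)] cos(wc t)
s_dsb = Ac * m * np.cos(2 * np.pi * fc * t)      # DSB-SC: s(t) = Ac m(t) cos(wc t)

P = np.mean(m**2)                                # message power P = mp^2 / 2
print(mu, np.mean(s_am**2), Ac**2 / 2 + P / 2)   # transmitted AM power is approximately Ac^2/2 + P/2
```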

DSB-SC
- s(t)DSB-SC = Ac m(t) cos ωct

Synchronous detection
- Signal after the multiplier:
  y(t)AM = [Ac + m(t)] cos ωct × 2 cos ωct = [Ac + m(t)](1 + cos 2ωct)
  y(t)DSB-SC = Ac m(t)(1 + cos 2ωct)

Noise in DSB-SC
- Transmitted signal: s(t)DSB-SC = Ac m(t) cos ωct
- Predetection signal (transmitted signal + bandlimited noise):
  x(t) = Ac m(t) cos ωct + nc(t) cos ωct − ns(t) sin ωct
- Receiver output (after LPF): y(t) = Ac m(t) + nc(t)
[Figure: PSD of the bandpass noise n(t) and PSD of the baseband noise nc(t)]

SNR of DSB-SC
- Output signal power: Ps = ⟨(Ac m(t))²⟩ = Ac²P, where P is the power of the message
- Output noise power (the PSD of nc(t) is No for |f| < W): PN = ∫_{−W}^{W} No df = 2NoW
- Output SNR: SNRDSB-SC = Ac²P/(2NoW)

SNR of DSB-SC
- Transmitted power: PT = ⟨(Ac m(t) cos ωct)²⟩ = Ac²P/2
- Hence SNRDSB-SC = PT/(NoW) = SNRbase
- DSB-SC has no performance advantage over baseband

Noise in AM (synch. detector)
- Transmitted signal: s(t)AM = [Ac + m(t)] cos ωct
- Predetection signal (transmitted signal + bandlimited noise):
  x(t) = [Ac + m(t)] cos ωct + nc(t) cos ωct − ns(t) sin ωct
- Receiver output: y(t) = Ac + m(t) + nc(t)
- Output signal power: PS = ⟨m²(t)⟩ = P
- Output noise power: PN = 2NoW
- Output SNR: SNRAM = P/(2NoW)

Noise in AM (synch. detector)
- Transmitted power: PT = Ac²/2 + P/2
- Output SNR: SNRAM = [P/(Ac² + P)] PT/(NoW) = [P/(Ac² + P)] SNRbase
- The performance of AM is always worse than baseband

Noise in AM, envelope detector

- Receiver output: y(t) = envelope of x(t) = √( [Ac + m(t) + nc(t)]² + ns²(t) )
- Small noise case: y(t) ≈ Ac + m(t) + nc(t) = output of the synchronous detector
- Large noise case: y(t) ≈ En(t) + [Ac + m(t)] cos θn(t)
[Figure: phasor diagram of the envelope detector output y(t)]

Summary
- Synchronous detector: SNRDSB-SC = SNRbase; SNRAM = [P/(Ac² + P)] SNRbase
- Envelope detector: for small noise, performance is the same as the synchronous detector; the envelope detector has a threshold effect (not really a problem in practice)

Example
An unmodulated carrier (of amplitude Ac and frequency fc) and bandlimited white noise are summed and then passed through an ideal envelope detector. Assume the noise spectral density to be of height No/2 and bandwidth 2W, centred about the carrier frequency. Assume the input carrier-to-noise ratio is high.
1. Calculate the carrier-to-noise ratio at the output of the envelope detector, and compare it with the carrier-to-noise ratio at the detector input.

Lecture 5
1. Noise in FM systems; pre-emphasis and de-emphasis (See section 3.4)
2. Comparison of analog systems (See section 3.5)

Frequency modulation
- FM waveform: s(t)FM = Ac cos( 2πfct + 2πkf ∫_{−∞}^{t} m(τ) dτ ) = Ac cos θ(t)
- θ(t) is the instantaneous phase
- Instantaneous frequency: fi = (1/2π) dθ(t)/dt = fc + kf m(t)
- The frequency is proportional to the message signal

FM waveforms
[Figure: message, AM and FM waveforms]

FM frequency deviation
- The instantaneous frequency varies between fc − kf mp and fc + kf mp, where mp is the peak message amplitude
- Frequency deviation (max. departure of the carrier wave from fc): ∆f = kf mp
- Deviation ratio: β = ∆f/W, where W is the bandwidth of the message signal
- Commercial FM uses ∆f = 75 kHz and W = 15 kHz

Bandwidth considerations
- Carson's rule: BT ≈ 2W(β + 1) = 2(∆f + W)

FM receiver
- Discriminator: the output is proportional to the deviation of the instantaneous frequency away from the carrier frequency
[Figure: FM receiver block diagram]

Noise in FM versus AM
- AM: the amplitude of the modulated signal carries the message; noise adds directly to the modulated signal; performance no better than baseband
- FM: the frequency of the modulated signal carries the message; the zero crossings of the modulated signal are important; the effect of noise should be less than in AM

Noise in FM
- Predetection signal (transmitted signal + bandlimited noise):
  x(t) = Ac cos(2πfct + φ(t)) + nc(t) cos(2πfct) − ns(t) sin(2πfct), where φ(t) = 2πkf ∫_{−∞}^{t} m(τ) dτ
- If the carrier power is much larger than the noise power:
  1. Noise does not affect the signal power at the output
  2. The message does not affect the noise power at the output

Assumptions
1. Noise does not affect the signal power at the output
- Signal component at the receiver: xs(t) = Ac cos(2πfct + φ(t))
- Instantaneous frequency: fi(t) = (1/2π) dφ(t)/dt = kf m(t)
- Output signal power: PS = kf² P, where P is the power of the message signal

2. The signal does not affect the noise at the output
- Message-free component at the receiver: xn(t) = Ac cos(2πfct) + nc(t) cos(2πfct) − ns(t) sin(2πfct)
- Instantaneous frequency:
  fi(t) = (1/2π) dθ(t)/dt = (1/2π) d/dt [ tan⁻¹( ns(t)/(Ac + nc(t)) ) ] ≈ (1/(2πAc)) dns(t)/dt
- We know the PSD of ns(t), but what about its derivative?
- Fourier theory property: if x(t) ⇔ X(f), then dx(t)/dt ⇔ j2πf X(f)
- PSD property: if Y(f) = H(f) X(f), then SY(f) = |H(f)|² SX(f)
- Discriminator output: fi(t) ≈ (1/(2πAc)) dns(t)/dt, so Fi(f) = (jf/Ac) Ns(f)
- PSD of the discriminator output: SF(f) = (|f|²/Ac²) SN(f)



Average power of noise at output
- The PSD of ns(t) is No for |f| < W
- The PSD of the LPF noise term (1/(2πAc)) dns(t)/dt is (|f|²/Ac²) No, for |f| < W
- Average power of the noise at the output (PSD after the LPF):
  PN = ∫_{−W}^{W} (|f|²/Ac²) No df = (2No/Ac²)(W³/3) = 2NoW³/(3Ac²)
- Increasing the carrier power has a noise quieting effect

SNR of FM
- Output signal power: PS = kf² P
- Output noise power: PN = 2NoW³/(3Ac²)
- SNR at output: SNRFM = 3Ac²kf²P/(2NoW³)
- Transmitted power: PT = ⟨(Ac cos[ωct + φ(t)])²⟩ = Ac²/2, so SNRbase = PT/(NoW) = Ac²/(2NoW)
- Hence SNRFM = (3kf²P/W²) SNRbase = 3β²(P/mp²) SNRbase

Threshold effect in FM
- SNRFM is valid when the predetection SNR > 10
- The predetection signal is: x(t) = Ac cos(2πfct + φ(t)) + nc(t) cos(2πfct) − ns(t) sin(2πfct)
- Predetection SNR: SNRpre = Ac²/(2NoBT)
- Threshold point: Ac²/(4NoW(β + 1)) > 10
- Cannot arbitrarily increase SNRFM by increasing β

Pre-emphasis and de-emphasis
- The improvement in output SNR afforded by using pre-emphasis and de-emphasis in FM is defined by:
  I = (SNR with pre-/de-emphasis) / (SNR without pre-/de-emphasis)
    = (average output noise power without pre-/de-emphasis) / (average output noise power with pre-/de-emphasis)
- Pre-emphasis and de-emphasis can improve the output SNR by about 13 dB

Example
- If Hde(f) is the transfer function of the de-emphasis filter, find an expression for the improvement, I.

Analog system performance
- Parameters: single-tone message m(t) = cos(2π fm t); message bandwidth W = fm; AM system µ = 1; FM system β = 5
- Performance: SNRDSB-SC = SNRbase; SNRAM = (1/3) SNRbase; SNRFM = (75/2) SNRbase
- Bandwidth: BDSB-SC = 2W; BAM = 2W; BFM = 12W
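These numbers follow directly from the formulas in this and the previous lectures; a minimal sketch for the single-tone case (P = 1/2 and mp = 1 for a unit-amplitude cosine, µ = 1 so Ac = mp, and β = 5):

```python
# Single-tone message: power P = 1/2, peak amplitude mp = 1
P, mp = 0.5, 1.0
mu, beta = 1.0, 5.0
Ac = mp / mu                           # AM with modulation index mu = mp/Ac = 1

snr_dsb = 1.0                          # SNR_DSB-SC / SNR_base
snr_am = P / (Ac**2 + P)               # SNR_AM  / SNR_base  -> 1/3
snr_fm = 3 * beta**2 * P / mp**2       # SNR_FM  / SNR_base  -> 75/2

print(snr_dsb, snr_am, snr_fm)         # 1.0, 0.333..., 37.5

# Transmission bandwidths relative to the message bandwidth W
print(2, 2, 2 * (beta + 1))            # DSB-SC: 2W, AM: 2W, FM: 12W
```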

Summary
- Noise in FM: increasing the carrier power reduces the noise at the receiver output; FM has a threshold effect; pre-emphasis/de-emphasis
- Comparison of analog modulation schemes: AM worse than baseband; DSB/SSB same as baseband; FM better than baseband

Lecture 6
Digital communication systems
1. Digital vs analog communications
2. Pulse code modulation
(See sections 4.1, 4.2 and 4.3)

Digital vs Analog
[Figure: example analog message and digital message]
- Analog: recreate the waveform accurately; the performance criterion is SNR at the receiver output
- Digital: decide which symbol was sent; the performance criterion is the probability of the receiver making a decision error
- Advantages of digital:
  1. Digital signals are more immune to noise
  2. Repeaters can re-transmit a noise-free signal

Sampling: discrete in time
- Nyquist Sampling Theorem: A signal whose bandwidth is limited to W Hz can be reconstructed exactly from its samples taken uniformly at a rate of R > 2W Hz.

Maximum information rate
- How many bits can be transferred over a channel of bandwidth B Hz (ignoring noise)?
- A signal with a bandwidth of B Hz is not distorted over this channel
- A signal with a bandwidth of B Hz requires samples taken at 2B Hz
- Can transmit: 2 bits of information per second per Hz

Pulse-code modulation
- Represent an analog waveform in digital form

Quantization: discrete in amplitude
- Round the amplitude of each sample to the nearest one of a finite number of levels

Encode
- Assign each quantization level a code

Sampling vs Quantization
- Sampling: non-destructive if fs > 2W; the analog waveform can be reconstructed exactly by using a low-pass filter
- Quantization: destructive; once the signal has been rounded off it can never be reconstructed exactly

Quantization noise
[Figure: sampled signal, quantized signal (step size of 0.1), and the resulting quantization error]

Quantization noise
- Let ∆ be the separation between quantization levels: ∆ = 2mp/L, where L = 2ⁿ is the number of quantization levels and mp is the peak allowed signal amplitude
- The round-off effect of the quantizer ensures that |q| < ∆/2, where q is a random variable representing the quantization error
- Assume q is zero mean with a uniform pdf, so the mean square error is:
  E{q²} = ∫_{−∞}^{∞} q² p(q) dq = ∫_{−∆/2}^{∆/2} q² (1/∆) dq = ∆²/12

Quantization noise
- Let the message power be P
- The noise power is: PN = E{q²} = ∆²/12 = (2mp/L)²/12 = mp²/(3 × 2²ⁿ) (since q is zero mean)
- Output SNR of the quantizer: SNRQ = PS/PN = (3P/mp²) × 2²ⁿ
- or in dB: SNRQ = 6.02n + 10 log₁₀(3P/mp²) dB
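A quick check of the quantizer SNR formula by simulation (a sketch; the bit depth and test signal are arbitrary choices). A full-scale sine wave, for which P = mp²/2 and hence 10 log₁₀(3P/mp²) ≈ 1.76 dB, is uniformly quantized with n bits and the measured SNR is compared with 6.02n + 1.76 dB:

```python
import numpy as np

n = 8                                  # bits per sample
mp = 1.0                               # peak allowed signal amplitude
L = 2**n                               # number of quantization levels
delta = 2 * mp / L                     # quantizer step size

t = np.linspace(0, 1, 1_000_000, endpoint=False)
m = mp * np.sin(2 * np.pi * 7 * t)     # full-scale test signal, P = mp^2 / 2

# Uniform (mid-rise) quantizer
mq = delta * (np.floor(m / delta) + 0.5)
q = m - mq                             # quantization error

P = np.mean(m**2)
PN = np.mean(q**2)                     # should be close to delta^2 / 12
snr_measured = 10 * np.log10(P / PN)
snr_formula = 6.02 * n + 10 * np.log10(3 * P / mp**2)

print(PN, delta**2 / 12)
print(snr_measured, snr_formula)       # both approximately 49.9 dB for n = 8
```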

Bandwidth of PCM
- Each message sample requires n bits; if the message has bandwidth W Hz, then PCM contains 2nW bits per second
- The bandwidth required is: BT = nW
- The SNR can be written: SNRQ = (3P/mp²) × 2^(2BT/W)
- A small increase in bandwidth yields a large increase in SNR

Nonuniform quantization
- For audio signals (e.g. speech), small signal amplitudes occur more often than large signal amplitudes
- It is better to have closely spaced quantization levels at low signal amplitudes and widely spaced levels at large signal amplitudes
- The quantizer then has better resolution at low amplitudes (where the signal spends more time)
[Figure: uniform vs non-uniform quantization of a speech waveform]

Companding
- A uniform quantizer is easier to implement than a nonlinear one
- Compress the signal first, then use a uniform quantizer, then expand the signal (i.e., compand)

Summary
- Analog communications system: the receiver must recreate the transmitted waveform; the performance measure is the signal-to-noise ratio
- Digital communications system: the receiver must decide which symbol was transmitted; the performance measure is the probability of error
- Pulse-code modulation: a scheme to represent an analog signal in digital format; sample, quantize, encode; companding (nonuniform quantization)

Lecture 7
Performance of digital systems in noise:
1. Baseband
2. ASK
3. PSK, FSK
4. Compare all schemes
(See sections 4.4, 4.5)

Digital receiver
[Figure: received signal s(t) plus noise passed through a filter + demodulator]

Baseband digital system
[Figure: baseband digital waveform s(t)]

Gaussian noise, probability
[Figure: noise waveform n(t) and its probability density function f(n)]
- p(n) = (1/(σ√(2π))) exp(−(n − m)²/(2σ²)) = N(m, σ²), the Normal distribution with mean m and variance σ²
- prob(a < n < b) = ∫_a^b p(n) dn

Gaussian noise, spectrum
- NOTE: For zero-mean noise, the variance is the average power, i.e., σ² = P
- White noise through an LPF (use for baseband): P = ∫_{−W}^{W} (No/2) df = NoW
- White noise through a BPF (use for bandpass): P = ∫_{−fc−W}^{−fc+W} (No/2) df + ∫_{fc−W}^{fc+W} (No/2) df = 2NoW

Baseband system – "0" transmitted
- Transmitted signal s0(t); noise signal n(t); received signal y0(t) = s0(t) + n(t)
- Error if y0(t) > A/2: Pe0 = ∫_{A/2}^{∞} N(0, σ²) dn

Baseband system – "1" transmitted
- Transmitted signal s1(t); noise signal n(t); received signal y1(t) = s1(t) + n(t)
- Error if y1(t) < A/2: Pe1 = ∫_{−∞}^{A/2} N(A, σ²) dn

Baseband system – errors
- Possible errors:
  1. Symbol "0" transmitted, receiver decides "1"
  2. Symbol "1" transmitted, receiver decides "0"
- Total probability of error: Pe = p0 Pe0 + p1 Pe1, where p0 is the probability of "0" being sent and Pe0 is the probability of making an error if "0" was sent

- For equally-probable symbols: Pe = ½ Pe0 + ½ Pe1
- Can show that Pe0 = Pe1; hence Pe = ½ Pe0 + ½ Pe0 = Pe0
- Pe = ∫_{A/2}^{∞} N(0, σ²) dn = ∫_{A/2}^{∞} (1/(σ√(2π))) exp(−n²/(2σ²)) dn
- How to calculate Pe?
  1. Complementary error function (erfc in Matlab): erfc(u) = (2/√π) ∫_u^∞ exp(−n²) dn, giving Pe = ½ erfc( A/(2√2 σ) )
  2. Q-function (tail function): Q(u) = (1/√(2π)) ∫_u^∞ exp(−n²/2) dn, giving Pe = Q( A/(2σ) )

Baseband error probability
[Figure: probability of error curve for the baseband system]

Example
Consider a digital system which uses a voltage level of 0 volts to represent a "0", and a level of 0.22 volts to represent a "1". The digital waveform has a bandwidth of 15 kHz. If this digital waveform is to be transmitted over a baseband channel having additive noise with a flat power spectral density of No/2 = 3 × 10⁻⁸ W/Hz, what is the probability of error at the receiver output?
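A sketch of the computation for this example, assuming (as in the baseband analysis above) that the noise power at the decision point is σ² = NoW and that the decision threshold is at A/2:

```python
import numpy as np
from scipy.special import erfc

A = 0.22                    # voltage level for a "1", volts
W = 15e3                    # bandwidth of the digital waveform, Hz
No_2 = 3e-8                 # noise PSD No/2, W/Hz

sigma2 = 2 * No_2 * W       # noise power after the baseband (LPF) receiver: No * W
sigma = np.sqrt(sigma2)

Pe = 0.5 * erfc(A / (2 * np.sqrt(2) * sigma))
print(sigma, Pe)            # sigma = 0.03 V, Pe on the order of 1e-4
```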

Amplitude-shift keying
- s0(t) = 0, s1(t) = A cos(ωct)

Synchronous detector
- Identical to the analog synchronous detector

ASK – "0" transmitted
- Predetection signal (ASK "0" + bandpass noise): x0(t) = 0 + nc(t) cos(ωct) − ns(t) sin(ωct)
- After the multiplier:
  r0(t) = x0(t) × 2 cos(ωct) = nc(t) 2cos²(ωct) − ns(t) 2 sin(ωct) cos(ωct) = nc(t)[1 + cos(2ωct)] − ns(t) sin(2ωct)
- Receiver output: y0(t) = nc(t)

ASK – "1" transmitted
- Predetection signal (ASK "1" + bandpass noise): x1(t) = A cos(ωct) + nc(t) cos(ωct) − ns(t) sin(ωct)
- After the multiplier:
  r1(t) = x1(t) × 2 cos(ωct) = [A + nc(t)] 2cos²(ωct) − ns(t) 2 sin(ωct) cos(ωct) = [A + nc(t)][1 + cos(2ωct)] − ns(t) sin(2ωct)
- Receiver output: y1(t) = A + nc(t)

PDFs at receiver output
- ASK – "0" transmitted: PDF of y0(t) = nc(t)
- ASK – "1" transmitted: PDF of y1(t) = A + nc(t)
- Pe,ASK = ½ erfc( A/(2√2 σ) ) – same as baseband!

Phase-shift keying
- s0(t) = −A cos(ωct), s1(t) = A cos(ωct)
[Figure: PSK and FSK waveforms]

PSK demodulator

- Band-pass filter: bandwidth matched to the modulated signal bandwidth; carrier frequency is ωc
- Low-pass filter: leaves only the baseband signals

PSK – "0" transmitted
- Predetection signal (PSK "0" + bandpass noise): x0(t) = −A cos(ωct) + nc(t) cos(ωct) − ns(t) sin(ωct)
- After the multiplier:
  r0(t) = x0(t) × 2 cos(ωct) = [−A + nc(t)] 2cos²(ωct) − ns(t) 2 sin(ωct) cos(ωct) = [−A + nc(t)][1 + cos(2ωct)] − ns(t) sin(2ωct)
- Receiver output: y0(t) = −A + nc(t)

PSK – "1" transmitted
- Predetection signal (PSK "1" + bandpass noise): x1(t) = A cos(ωct) + nc(t) cos(ωct) − ns(t) sin(ωct)
- After the multiplier:
  r1(t) = x1(t) × 2 cos(ωct) = [A + nc(t)][1 + cos(2ωct)] − ns(t) sin(2ωct)
- Receiver output: y1(t) = A + nc(t)

PSK – PDFs at receiver output
- PSK "0" transmitted: PDF of y0(t) = −A + nc(t)
- PSK "1" transmitted: PDF of y1(t) = A + nc(t)
- Set the threshold at 0: if y < 0, decide "0"; if y > 0, decide "1"

PSK – probability of error

- Pe0 = ∫_0^∞ N(−A, σ²) dn = ∫_0^∞ (1/(σ√(2π))) exp(−(n + A)²/(2σ²)) dn
- Pe1 = ∫_{−∞}^0 N(A, σ²) dn = ∫_{−∞}^0 (1/(σ√(2π))) exp(−(n − A)²/(2σ²)) dn
- Probability of bit error: Pe,PSK = ½ erfc( A/(√2 σ) )

Frequency-shift keying
- s0(t) = A cos(ω0t), s1(t) = A cos(ω1t)
[Figure: FSK waveform]

FSK detector
[Figure: FSK detector with two band-pass branches, centred on ω0 and ω1]
- s0(t) = A cos(ω0t), s1(t) = A cos(ω1t)
- Receiver output:
  y0(t) = −A + nc1(t) − nc0(t)
  y1(t) = A + nc1(t) − nc0(t)
  where nc0(t) and nc1(t) are the in-phase noise components in the two branches
- Independent noise sources, so the variances add
- PDFs are the same as for PSK, but the variance is doubled: Pe,FSK = ½ erfc( A/(2σ) )

Digital performance comparison
[Figure: probability of error versus A/σ for baseband/ASK, FSK and PSK]
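The comparison in the figure can be reproduced from the erfc expressions derived in this lecture; a minimal sketch (the A/σ values are arbitrary):

```python
import numpy as np
from scipy.special import erfc

A_over_sigma = np.array([2.0, 4.0, 6.0, 8.0, 10.0])

pe_baseband_ask = 0.5 * erfc(A_over_sigma / (2 * np.sqrt(2)))
pe_fsk = 0.5 * erfc(A_over_sigma / 2)
pe_psk = 0.5 * erfc(A_over_sigma / np.sqrt(2))

for r, pb, pf, pp in zip(A_over_sigma, pe_baseband_ask, pe_fsk, pe_psk):
    print(f"A/sigma={r:4.1f}  baseband/ASK={pb:.2e}  FSK={pf:.2e}  PSK={pp:.2e}")
# For any A/sigma: PSK has the lowest Pe, then FSK, then ASK/baseband
```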

Summary
- For a baseband (or ASK) system: Pe = ∫_{A/2}^{∞} N(0, σ²) dn = ½ erfc( A/(2√2 σ) )
- Probability of error for PSK and FSK: Pe,PSK = ½ erfc( A/(√2 σ) ); Pe,FSK = ½ erfc( A/(2σ) )
- Comparison of digital systems: PSK best, then FSK; ASK and baseband are the same

Lecture 8
Information theory
1. Why?
2. Information
3. Entropy
4. Source coding (a little)
(See sections 5.1 to 5.4.2)

Why information theory?
- What is the performance of the "best" system?
- "What would be the characteristics of an ideal system, [one that] is not limited by our engineering ingenuity and inventiveness but limited rather only by the fundamental nature of the physical universe" (Taub & Schilling)

Information
- The purpose of a communication system is to convey information from one point to another. What is information?
- Definition: I(s) = log₂(1/p) = −log₂(p) bits, where I(s) is the information in symbol s and p is the probability of occurrence of symbol s
- The bit is the conventional unit of information

Properties of I(s)
1. If p = 1, I(s) = 0 (a symbol that is certain to occur conveys no information)
2.
3.
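A small illustration of the definition (a sketch; the probabilities are arbitrary examples):

```python
import math

def information_bits(p):
    """Information conveyed by a symbol with probability of occurrence p, in bits."""
    return math.log2(1 / p)

for p in (1.0, 0.5, 0.25, 0.125, 0.01):
    print(f"p = {p:<6} I(s) = {information_bits(p):.3f} bits")
# p = 1 gives I(s) = 0: a symbol certain to occur conveys no information;
# halving p adds one bit of information
```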