Tuesday, October 4, 2011

DIGITAL COMMUNICATIONS previous year question papers for JNTU University, 3rd year first semester, ECE and EET departments


III B.Tech I Semester Supplementary Examinations, February 2008
DIGITAL COMMUNICATIONS
(Common to Electronics & Communication Engineering and Electronics & Telematics)
Time: 3 hours Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks
⋆ ⋆ ⋆ ⋆ ⋆
1. (a) State and prove the sampling theorem for band pass signals.
(b) A signal m(t) = cos(200πt) + 2 cos(320πt) is ideally sampled at fs = 300 Hz.
If the sampled signal is passed through a low pass filter with a cutoff frequency
of 250 Hz, what frequency components will appear in the output? [6+10]
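
A minimal numerical check for 1(b), assuming ideal impulse sampling: each input tone at f Hz produces spectral images at |n·fs ± f|, and the low pass filter keeps those below 250 Hz.

# m(t) = cos(200*pi*t) + 2*cos(320*pi*t) contains tones at 100 Hz and 160 Hz.
fs = 300.0                      # sampling frequency, Hz
cutoff = 250.0                  # low pass filter cutoff, Hz
tones = [100.0, 160.0]

output = set()
for f in tones:
    for n in range(5):          # a few spectral replicas are enough here
        for image in (abs(n * fs - f), n * fs + f):
            if image <= cutoff:
                output.add(image)

print(sorted(output))           # 100, 140, 160 and 200 Hz appear at the filter output
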
2. (a) Derive an expression for the channel noise and quantization noise in a DM system.
(b) Compare DM and PCM systems. [10+6]
3. (a) Draw the signal space representation of MSK.
(b) Show that in an MSK signaling scheme, the carrier frequency is an integral
multiple of fb/4, where fb is the bit rate.
(c) Bring out the comparisons between MSK and QPSK.
4. (a) Derive an expression for the error probability of a non-coherent ASK scheme.
(b) Binary data is transmitted over an RF band pass channel with a usable band-
width of 10 MHz at a rate of 4.8 × 10^6 bits/sec using an ASK signaling method.
The carrier amplitude at the receiver antenna is 1 mV and the noise power
spectral density at the receiver input is 10^-15 W/Hz.
i. Find the error probability of a coherent receiver.
ii. Find the error probability of a non-coherent receiver. [8+8]
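
For 4(b)(i), a small numerical sketch assuming the usual textbook result for coherent on-off ASK, Pe = Q(√(A²Tb/4η)), and reading the quoted 10^-15 W/Hz as the two-sided noise PSD η/2; both readings are assumptions on my part, and part (ii) follows the same pattern with the corresponding non-coherent expression.

from math import sqrt, erfc

def Q(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * erfc(x / sqrt(2))

A = 1e-3                 # carrier amplitude at the receiver, volts
Rb = 4.8e6               # bit rate, bits/sec
Tb = 1 / Rb              # bit duration, sec
eta = 2 * 1e-15          # eta, taking the quoted 1e-15 W/Hz as eta/2

Pe_coherent = Q(sqrt(A**2 * Tb / (4 * eta)))
print(f"coherent ASK Pe ~ {Pe_coherent:.1e}")   # about 2e-7 under these assumptions
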
5. Figure 5 illustrates a binary erasure channel with the transmission probabilities
P(0|0) = P(1|1) = 1 − p and P(e|0) = P(e|1) = p. The probabilities of the input
symbols are P(X = 0) = α and P(X = 1) = 1 − α.
Determine the average mutual information I(X; Y) in bits. [16]
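
For question 5 the closed form is I(X; Y) = (1 − p)·Hb(α), where Hb is the binary entropy function; the sketch below verifies this numerically from the channel model, with arbitrary example values of p and α.

from math import log2

def H(probs):
    """Entropy in bits, ignoring zero-probability terms."""
    return -sum(p * log2(p) for p in probs if p > 0)

p, alpha = 0.2, 0.3                              # example values only
px = {0: alpha, 1: 1 - alpha}
pyx = {0: {0: 1 - p, 'e': p}, 1: {1: 1 - p, 'e': p}}   # erasure channel transitions

py = {}                                          # marginal P(Y)
for x, prob_x in px.items():
    for y, prob in pyx[x].items():
        py[y] = py.get(y, 0.0) + prob_x * prob

I = H(py.values()) - sum(px[x] * H(pyx[x].values()) for x in px)   # I = H(Y) - H(Y|X)
print(I, (1 - p) * H([alpha, 1 - alpha]))        # the two numbers agree
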
6. Show that H(X, Y) = H(X) + H(Y |X) = H(Y ) + H(X|Y ). [16]
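
The first equality in question 6 follows by expanding p(x, y) = p(x)·p(y|x) inside the joint entropy:

H(X, Y) = −Σx,y p(x, y) log2 p(x, y) = −Σx,y p(x, y) [log2 p(x) + log2 p(y|x)] = H(X) + H(Y|X),

and the second equality follows by factoring p(x, y) = p(y)·p(x|y) instead.
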
7. Explain block codes, in which each block of k message bits is encoded into a block
of n > k bits, with an example. [16]
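
As a concrete instance of an (n, k) block code for question 7 (the particular code below is my own example), a systematic (7, 4) Hamming encoder built from a generator matrix G = [I | P] maps each block of k = 4 message bits into n = 7 code bits.

import numpy as np

# One standard choice of parity sub-matrix for the systematic (7,4) Hamming code.
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])    # generator matrix [I_4 | P]

def encode(message_bits):
    """Encode 4 message bits into a 7-bit codeword using mod-2 arithmetic."""
    return (np.array(message_bits) @ G) % 2

print(encode([1, 0, 1, 1]))                 # message 1011 -> codeword 1011010
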
8. Explain the various methods for describing convolutional codes. [16]

III B.Tech I Semester Supplementary Examinations, February 2008
DIGITAL COMMUNICATIONS
(Common to Electronics & Communication Engineering and Electronics & Telematics)
Time: 3 hours Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks
⋆ ⋆ ⋆ ⋆ ⋆
1. (a) State the sampling theorem for low pass signals and band pass signals.
(b) What is the aliasing effect? How can it be eliminated? Explain with a neat diagram.
[4+4+8]
2. (a) Derive an expression for the channel noise and quantization noise in a DM system.
(b) Compare DM and PCM systems. [10+6]
3. Explain the design and analysis of M-ary signaling schemes. List the waveforms in
quaternary schemes. [16]
4. (a) Derive an expression for the error probability of a coherent PSK scheme.
(b) In a binary PSK scheme using a correlator receiver, the local carrier wave-
form is A cos(ωct + φ) instead of A cos(ωct) due to poor carrier synchroniza-
tion. Derive an expression for the error probability and compute the increase
in error probability when φ = 15° and A²Tb/η = 10. [8+8]
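
A numeric sketch for 4(b), assuming the correlator-receiver result Pe = Q(√(A²Tb/η)·cos φ) with η/2 the two-sided noise PSD; the formula convention is an assumption on my part.

from math import sqrt, erfc, cos, radians

def Q(x):
    return 0.5 * erfc(x / sqrt(2))

snr = 10.0                            # A^2*Tb/eta, as given in the question
phi = radians(15)                     # carrier phase error

Pe_ideal = Q(sqrt(snr))               # perfect synchronization
Pe_offset = Q(sqrt(snr) * cos(phi))   # local carrier off by 15 degrees
print(Pe_ideal, Pe_offset, Pe_offset / Pe_ideal)   # roughly a 40-45% increase
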
5. Consider transmitting the messages Q1, Q2, Q3 and Q4 using the code words 0, 10, 110 and 111 respectively.
(a) Is the code uniquely decipherable? That is, for every possible received sequence,
is there only one way of interpreting the message?
(b) Calculate the average number of code bits per message. How does it compare
with H = 1.8 bits per message? [16]
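
A small sketch for question 5: part (a) amounts to checking the prefix condition, and part (b) needs the message probabilities, which the paper does not list, so the values below are purely illustrative (chosen to give an entropy near the quoted 1.8 bits/message).

codewords = {'Q1': '0', 'Q2': '10', 'Q3': '110', 'Q4': '111'}

# (a) Prefix condition: no codeword is a prefix of another, hence uniquely decodable.
words = list(codewords.values())
prefix_free = not any(a != b and b.startswith(a) for a in words for b in words)
print("prefix-free:", prefix_free)

# (b) Average code length for an assumed (hypothetical) probability assignment.
probs = {'Q1': 0.4, 'Q2': 0.3, 'Q3': 0.2, 'Q4': 0.1}
avg_len = sum(probs[m] * len(codewords[m]) for m in codewords)
print("average code bits per message:", avg_len)   # 1.9 bits for these probabilities
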
6. Show that, for statistically independent X and Y, H(X, Y) = H(X) + H(Y) and H(X|Y) = H(X). [16]
7. Explain block codes, in which each block of k message bits is encoded into a block
of n > k bits, with an example. [16]
8. Explain the various methods for describing convolutional codes. [16]


III B.Tech I Semester Supplementary Examinations, February 2008
DIGITAL COMMUNICATIONS
(Common to Electronics & Communication Engineering and Electronics & Telematics)
Time: 3 hours Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks
⋆ ⋆ ⋆ ⋆ ⋆
1. The probability density function of the sampled values of an analog signal is shown
in figure 1.


Figure 1
(a) Design a 4-level uniform quantizer.
(b) Calculate the signal power to quantization noise power ratio.
(c) Design a 4-level minimum mean squared error non-uniform quantizer.
[6+4+6]
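
Since the density in figure 1 is not reproduced in this post, the sketch below only illustrates the mechanics of parts (a) and (b) for an assumed density, samples uniform on [−4, 4]; the step size and output levels would change with the actual pdf.

import numpy as np

V, levels = 4.0, 4                                  # assumed uniform pdf on [-V, V]
step = 2 * V / levels                               # uniform quantizer step size
centers = -V + step / 2 + step * np.arange(levels)  # the four output levels

rng = np.random.default_rng(0)
x = rng.uniform(-V, V, 200_000)
idx = np.clip(np.floor((x + V) / step), 0, levels - 1).astype(int)
xq = centers[idx]

sqnr = np.mean(x**2) / np.mean((x - xq)**2)
print(centers, 10 * np.log10(sqnr))                 # about 12 dB for a 2-bit uniform quantizer
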
2. A DM system is tested with a 10 kHz sinusoidal signal, 1 V peak to peak, at the
input. The signal is sampled at 10 times the Nyquist rate.
(a) What is the step size required to prevent slope overload and to minimize the
granular noise?
(b) What is the power spectral density of the granular noise?
(c) If the receiver input is band limited to 200 kHz, what is the average (S/Nq)?
[6+5+5]
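
A numeric sketch for question 2, assuming the usual delta-modulation relations: step size Δ ≥ 2π·fm·A/fs to avoid slope overload, and granular noise power Δ²/3 spread uniformly over (0, fs).

from math import pi, log10

fm = 10e3                  # test tone frequency, Hz
A = 0.5                    # amplitude of a 1 V peak-to-peak sinusoid, volts
fs = 10 * (2 * fm)         # 10 times the Nyquist rate -> 200 kHz
B = 200e3                  # receiver bandwidth, Hz

step = 2 * pi * fm * A / fs            # (a) minimum step size, ~0.157 V
noise_psd = step**2 / (3 * fs)         # (b) granular noise PSD, V^2/Hz
snr = (A**2 / 2) / (noise_psd * B)     # (c) signal to in-band granular noise
print(step, noise_psd, 10 * log10(snr))   # roughly 12 dB under these assumptions
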
3. (a) Write down the modulated waveforms for transmitting binary information over
base band channels for the following modulation schemes: ASK, PSK, FSK
and DPSK.
(b) What are the advantages and disadvantages of digital modulation schemes?
(c) Discuss base band transmission of M-ary data. [4+6+6]
4. (a) Draw the block diagram of a band pass binary data transmission system and
explain each block.
(b) A band pass data transmitter uses a PSK signaling scheme with
s1(t) = −A cos(ωct), 0 ≤ t ≤ Tb
s2(t) = +A cos(ωct), 0 ≤ t ≤ Tb
where Tb = 0.2 msec and ωc = 10π/Tb.
The carrier amplitude at the receiver input is 1 mV and the power spectral
density of the additive white Gaussian noise at the input is 10^-11 W/Hz. Assume
that an ideal correlation receiver is used. Calculate the average bit error rate
of the receiver. [8+8]
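
A numeric sketch for 4(b), assuming the ideal correlation-receiver result Pe = Q(√(A²Tb/η)) for PSK and reading the quoted 10^-11 W/Hz as the two-sided PSD η/2; both readings are assumptions.

from math import sqrt, erfc

def Q(x):
    return 0.5 * erfc(x / sqrt(2))

A = 1e-3                  # received carrier amplitude, volts
Tb = 0.2e-3               # bit duration, sec
eta = 2 * 1e-11           # eta, taking the quoted 1e-11 W/Hz as eta/2

Pe = Q(sqrt(A**2 * Tb / eta))
print(f"average bit error rate ~ {Pe:.1e}")   # roughly 8e-4 under these assumptions
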
5. A Discrete Memoryless Source (DMS) has an alphabet of eight letters xi, i = 1,
2, ..., 8, with probabilities 0.15, 0.30, 0.25, 0.15, 0.10, 0.08, 0.05, 0.05
respectively.
(a) Determine the entropy of the source and compare it with N.
(b) Determine the average number N of binary digits per source letter. [16]
6. (a) Calculate the bandwidth limits of the Shannon-Hartley theorem.
(b) What is an ideal system? What kind of method is proposed by Shannon for
an ideal system? [16]
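
For 6(a), the limiting behaviour of C = B·log2(1 + S/(N0·B)) as the bandwidth grows can be checked numerically; the signal power and noise density below are arbitrary example values.

from math import log2, e

S, N0 = 1e-6, 1e-9          # example signal power (W) and noise PSD (W/Hz)

def capacity(B):
    """Shannon-Hartley channel capacity for bandwidth B in Hz."""
    return B * log2(1 + S / (N0 * B))

for B in (1e3, 1e4, 1e5, 1e6, 1e7):
    print(B, capacity(B))

print("B -> infinity limit:", (S / N0) * log2(e))   # about 1.44 * S/N0 bits/sec
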
7. Explain block codes, in which each block of k message bits is encoded into a block
of n > k bits, with an example. [16]
8. Sketch the tree diagram of the convolutional encoder shown in figure 8, with rate =
1/2 and constraint length L = 2. [16]
Figure 8
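
Since figure 8 is not reproduced in this post, the sketch below assumes a simple rate-1/2, constraint-length-2 encoder with generators g1 = (1, 1) and g2 = (1, 0); it lists the branch outputs along every 3-bit input path, which is exactly the information a tree diagram displays.

from itertools import product

def encode(bits):
    """Assumed encoder: v1 = u XOR u_prev, v2 = u (one memory element)."""
    prev, out = 0, []
    for u in bits:
        out.append(f"{u ^ prev}{u}")
        prev = u
    return out

for path in product([0, 1], repeat=3):       # the first three levels of the tree
    print("input", "".join(map(str, path)), "->", " ".join(encode(path)))
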

III B.Tech I Semester Supplementary Examinations, February 2008
DIGITAL COMMUNICATIONS
(Common to Electronics & Communication Engineering and Electronics & Telematics)
Time: 3 hours Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks
⋆ ⋆ ⋆ ⋆ ⋆
1. (a) State and prove the sampling theorem for band pass signals.
(b) A signal m(t) = cos(200πt) + 2 cos(320πt) is ideally sampled at fs = 300 Hz.
If the sampled signal is passed through a low pass filter with a cutoff frequency
of 250 Hz, what frequency components will appear in the output? [6+10]
2. (a) Explain with a neat block diagram the operation of a continuously variable
slope delta modulator (CVSD).
(b) Compare delta modulation with the pulse code modulation technique. [8+8]
3. (a) Assume that random data at 4800 bits/sec are sent over a band pass channel
using a BFSK signaling scheme. Find the transmission bandwidth BT such that
the spectral envelope is down by at least 35 dB outside this band.
(b) Compare ASK, PSK, FSK and DPSK. [8+8]
4. (a) What is meant by ISI? Explain how it differs from cross talk in PAM.
(b) What is the ideal solution to obtain zero ISI, and what is the disadvantage of
this solution? [6+10]
5. A code is composed of dots and dashes. Assume that a dash is 3 times as long
as a dot and has one-third the probability of occurrence.
Calculate
(a) the information in a dot and that in a dash.
(b) the average information in the dot-dash code.
(c) Assume that a dot lasts for 10 ms and that this same time interval is allowed
between symbols. Calculate the average rate of information. [16]
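
A numeric sketch for question 5: with P(dash) = P(dot)/3 and the two probabilities summing to one, P(dot) = 3/4 and P(dash) = 1/4.

from math import log2

p_dot, p_dash = 0.75, 0.25

I_dot = -log2(p_dot)                       # (a) information in a dot, bits
I_dash = -log2(p_dash)                     # (a) information in a dash, bits
H = p_dot * I_dot + p_dash * I_dash        # (b) average information per symbol

# (c) dot = 10 ms, dash = 30 ms, plus a 10 ms gap after every symbol
avg_time = p_dot * (0.010 + 0.010) + p_dash * (0.030 + 0.010)
print(I_dot, I_dash, H, H / avg_time)      # rate of about 32 bits/sec
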
6. Explain the Shannon-Fano algorithm with an example. [16]
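
A minimal Shannon-Fano sketch for question 6; the five-symbol probability set is my own illustrative example.

def shannon_fano(symbols):
    """symbols: list of (name, probability) pairs, sorted in descending probability."""
    if len(symbols) <= 1:
        return {symbols[0][0]: ""} if symbols else {}
    total, running, split, best = sum(p for _, p in symbols), 0.0, 1, float("inf")
    for i, (_, p) in enumerate(symbols[:-1]):       # split nearest to half the total
        running += p
        if abs(total / 2 - running) < best:
            best, split = abs(total / 2 - running), i + 1
    code = {}
    for prefix, group in (("0", symbols[:split]), ("1", symbols[split:])):
        for name, word in shannon_fano(group).items():
            code[name] = prefix + word
    return code

probs = [("A", 0.4), ("B", 0.2), ("C", 0.2), ("D", 0.1), ("E", 0.1)]
print(shannon_fano(probs))    # e.g. A -> 0, B -> 10, C -> 110, D -> 1110, E -> 1111
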
7. Explain block codes, in which each block of k message bits is encoded into a block
of n > k bits, with an example. [16]
8. Sketch the tree diagram of the convolutional encoder shown in figure 8, with rate =
1/2 and constraint length L = 2. [16]
