EXAMPLE 9.32. Find the differential entropy H(X) of the uniformly distributed random variable X with the following probability density function (pdf):
$$f_X(x) = \begin{cases} \dfrac{1}{a}, & 0 \le x \le a \\ 0, & \text{otherwise} \end{cases}$$
            for (i) a = 1, (ii) a = 2, and (iii) a = 1/2.
Solution: We know that the differential entropy of X is given by
$$H(X) = -\int_{-\infty}^{\infty} f_X(x)\,\log_2 f_X(x)\,dx$$
Making use of the given pdf, we have
$$H(X) = -\int_{0}^{a} \frac{1}{a}\,\log_2\!\left(\frac{1}{a}\right) dx = \log_2 a$$
Now, we have
(i)         a = 1: H(X) = log2 1 = 0
(ii)        a = 2: H(X) = log2 2 = 1
(iii)       a = 1/2: H(X) = log2 (1/2) = -log2 2 = -1
It may be noted that the differential entropy H(X) can be zero or even negative, as in cases (i) and (iii); hence it is not an absolute measure of information.
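To make this concrete, the integral defining H(X) can also be evaluated numerically. The following is a minimal sketch in Python (the helper name and grid size are our own choices, not from the text) that reproduces the three results above:

```python
# Numerical check of Example 9.32: differential entropy of the uniform pdf
# f_X(x) = 1/a on [0, a], approximated by a Riemann sum of -f(x) log2 f(x).
import numpy as np

def uniform_differential_entropy(a, n=100_000):
    dx = a / n
    x = (np.arange(n) + 0.5) * dx        # midpoints of the integration grid
    f = np.full_like(x, 1.0 / a)         # the uniform density equals 1/a on [0, a]
    return np.sum(-f * np.log2(f)) * dx  # approximates H(X) = log2(a)

for a in (1.0, 2.0, 0.5):
    print(f"a = {a}: H(X) ~ {uniform_differential_entropy(a):.4f} bits")
# Expected: 0.0, 1.0 and -1.0 bits; the negative value for a = 1/2 is
# precisely why H(X) is not an absolute measure of information.
```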
EXAMPLE 9.33. The differential entropy of a random variable X is defined by following equation:
$$H(X) = -\int_{-\infty}^{\infty} f_X(x)\,\log_2 f_X(x)\,dx$$
Find the probability density function fX(x) for which H(X) is maximum.
Solution: We know that fX(x) must satisfy the following two conditions:
$$\int_{-\infty}^{\infty} f_X(x)\,dx = 1 \qquad \text{...(i)}$$
$$\int_{-\infty}^{\infty} (x-\mu)^2 f_X(x)\,dx = \sigma^2 \qquad \text{...(ii)}$$
where μ is the mean of X and σ² is its variance. Since the problem is the maximization of H(X) under the constraints of equations (i) and (ii), we use the method of Lagrange multipliers as under:
First, we form the function:
$$J = H(X) + \lambda_1 \int_{-\infty}^{\infty} f_X(x)\,dx + \lambda_2 \int_{-\infty}^{\infty} (x-\mu)^2 f_X(x)\,dx$$
$$= \int_{-\infty}^{\infty} \left[-f_X(x)\log_2 f_X(x) + \lambda_1 f_X(x) + \lambda_2 (x-\mu)^2 f_X(x)\right] dx$$
where the parameters λ1 and λ2 are the Lagrange multipliers. Then the maximization of H(X) requires that
$$\frac{\partial}{\partial f_X(x)}\left[-f_X(x)\log_2 f_X(x) + \lambda_1 f_X(x) + \lambda_2 (x-\mu)^2 f_X(x)\right] = 0$$
Thus,
$$-\log_2 f_X(x) - \log_2 e + \lambda_1 + \lambda_2 (x-\mu)^2 = 0 \qquad \text{...(iii)}$$
or
$$\log_2 f_X(x) = \lambda_1 - \log_2 e + \lambda_2 (x-\mu)^2 \qquad \text{...(iv)}$$
Hence, we obtain
$$f_X(x) = 2^{(\lambda_1 - \log_2 e)}\, 2^{\lambda_2 (x-\mu)^2} \qquad \text{...(v)}$$
In view of the constraints of equations (i) and (ii), it is required that λ2 < 0.
Let
$$a = 2^{(\lambda_1 - \log_2 e)} \quad \text{and} \quad b^2 = -\lambda_2 \ln 2$$
Then, equation (v) can be rewritten as
$$f_X(x) = a\,e^{-b^2 (x-\mu)^2} \qquad \text{...(vi)}$$
Substituting equation (vi) into equations (i) and (ii), we get
$$\int_{-\infty}^{\infty} a\,e^{-b^2(x-\mu)^2}\,dx = \frac{a\sqrt{\pi}}{b} = 1 \qquad \text{...(vii)}$$
$$\int_{-\infty}^{\infty} (x-\mu)^2\,a\,e^{-b^2(x-\mu)^2}\,dx = \frac{a\sqrt{\pi}}{2b^3} = \sigma^2 \qquad \text{...(viii)}$$
Solving equations (vii) and (viii) for a and b², we get
$$a = \frac{1}{\sqrt{2\pi\sigma^2}} \quad \text{and} \quad b^2 = \frac{1}{2\sigma^2}$$
Substituting these values in equation (vi), we observe that the desired fX(x) is given by
$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-(x-\mu)^2/2\sigma^2}$$
which is the probability density function of a Gaussian random variable X with mean μ and variance σ².
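The maximum-entropy property can also be verified numerically. Below is a small sketch (assuming SciPy is available; the comparison pdfs and parameter names are our own) that compares the differential entropy of a Gaussian against a uniform and a Laplacian pdf of the same variance:

```python
# Numerical illustration of Example 9.33: among pdfs with a common variance,
# the Gaussian attains the largest differential entropy, (1/2) log2(2*pi*e*sigma^2).
import numpy as np
from scipy import stats

sigma = 1.0  # common standard deviation for all three pdfs

candidates = {
    # Gaussian with variance sigma^2
    "gaussian":  stats.norm(scale=sigma),
    # uniform on [0, w] has variance w^2/12, so w = sigma*sqrt(12)
    "uniform":   stats.uniform(scale=sigma * np.sqrt(12.0)),
    # Laplacian with scale b has variance 2*b^2, so b = sigma/sqrt(2)
    "laplacian": stats.laplace(scale=sigma / np.sqrt(2.0)),
}

for name, dist in candidates.items():
    h_bits = dist.entropy() / np.log(2.0)  # SciPy returns nats; convert to bits
    print(f"{name:9s}: H(X) = {h_bits:.4f} bits")
# The Gaussian comes out on top (about 2.047 bits for sigma = 1), in
# agreement with the Lagrange-multiplier derivation above.
```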
EXAMPLE 9.34. Show that the channel capacity of an ideal AWGN channel with infinite bandwidth is given by
$$C_\infty = \frac{S}{\eta}\log_2 e = 1.44\,\frac{S}{\eta} \text{ b/s}$$
            where S is the average signal power and η/2 is the power spectral density (psd) of white Gaussian noise.                    (U.P.S.C. I.E.S. Examination, 1999)
Solution: We know that the noise power N is given by N = ηB.

DO YOU KNOW?
A second type of noise, impulse noise, is also encountered in the channel. Impulse noise is characterized by long quiet intervals followed by bursts of high amplitude noise pulses.

Also, according to the Shannon-Hartley law, the channel capacity is given by
$$C = B\log_2\left(1 + \frac{S}{N}\right) \text{ b/s}$$
In this expression, substituting N = ηB, we get
$$C = B\log_2\left(1 + \frac{S}{\eta B}\right)$$
Let S/(ηB) = λ. Then, we write
$$C = \frac{S}{\eta}\cdot\frac{1}{\lambda}\log_2(1+\lambda) = \frac{S}{\eta}\log_2(1+\lambda)^{1/\lambda}$$
Now, as B → ∞, λ → 0, and we know that
$$\lim_{\lambda \to 0}\,(1+\lambda)^{1/\lambda} = e$$
Hence,
$$C_\infty = \lim_{B \to \infty} C = \frac{S}{\eta}\log_2 e = 1.44\,\frac{S}{\eta} \qquad \text{...(i)}$$
NOTE: It may be noted that equation (i) can be used to estimate upper limits on the performance of any practical communication system whose transmission channel can be approximated by the AWGN channel.
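As an illustration of equation (i), the following sketch (the values of S and η are assumed purely for illustration) shows the capacity saturating at 1.44 S/η as the bandwidth grows:

```python
# Sketch of equation (i): as B -> infinity, the AWGN channel capacity
# C = B log2(1 + S/(eta*B)) approaches C_inf = (S/eta) log2(e) = 1.44 S/eta.
import numpy as np

S = 1e-3     # assumed average signal power, W (illustrative value)
eta = 1e-9   # assumed noise psd parameter, W/Hz, so that N = eta*B

for B in (1e3, 1e4, 1e5, 1e6, 1e7):
    C = B * np.log2(1.0 + S / (eta * B))
    print(f"B = {B:10.0f} Hz -> C = {C / 1e6:6.3f} Mb/s")

C_inf = (S / eta) * np.log2(np.e)
print(f"Limiting value: C_inf = {C_inf / 1e6:.3f} Mb/s (= 1.44 S/eta)")
```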
EXAMPLE 9.35. Given an AWGN channel with 4 kHz bandwidth and noise power spectral density η/2 = 10^-12 W/Hz. The signal power required at the receiver is 0.1 mW. Calculate the capacity of this channel.
(U.P. Tech, Sem. Exam., 2005-2006)
Solution:         Given that       B = 4000 Hz
S = 0.1 × 10^-3 W = 10^-4 W
N = ηB = 2 × 10^-12 × 4000 = 8 × 10^-9 W
Thus,                            S/N = 10^-4 / (8 × 10^-9) = 1.25 × 10^4
And, by the Shannon-Hartley law, we have
C = B log2(1 + S/N) = 4000 log2[1 + 1.25 × 10^4]
C ≈ 54.44 × 10^3 b/s                 Ans.
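The arithmetic above can be verified with a few lines of Python (a quick sketch; the variable names are ours):

```python
# Verifying Example 9.35: capacity of a 4 kHz AWGN channel.
import math

B = 4000.0        # channel bandwidth, Hz
S = 0.1e-3        # received signal power, W
eta = 2e-12       # eta = 2 * (eta/2) = 2 * 1e-12 W/Hz
N = eta * B       # noise power = 8e-9 W

snr = S / N       # = 1.25e4
C = B * math.log2(1.0 + snr)
print(f"S/N = {snr:.3e}, C = {C / 1e3:.2f} kb/s")  # about 54.44 kb/s
```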
EXAMPLE 9.36. An analog signal having 4 kHz bandwidth is sampled at 1.25 times the Nyquist rate, and each sample is quantized into one of 256 equally likely levels. Assume that the successive samples are statistically independent.
            (i)         What is the information rate of this source?
            (ii)        Can the output of this source be transmitted without error over an AWGN channel with a bandwidth of 10 kHz and an S/N ratio of 20 dB?
            (iii)       Find the S/N ratio required for error-free transmission over the channel of part (ii).
            (iv)       Find the bandwidth required for an AWGN channel for error-free transmission of the output of this source if the S/N ratio is 20 dB.                                                (U.P.S.C. I.E.S. Examination, 1999)
Solution: (i) Here,                   fm = 4 × 10^3 Hz
We know that the Nyquist rate is given by
fs = 2fm = 8 × 10^3 samples/s
Since the signal is sampled at 1.25 times the Nyquist rate, the sampling rate is
r = 8 × 10^3 × 1.25 = 10^4 samples/s
Further, we know that the entropy is expressed as
$$H(X) = -\sum_{i=1}^{256} P(x_i)\log_2 P(x_i)$$
Here,                                        P(xi) = 1/256 for every level.
Hence,                                     H(X) = log2 256 = 8 bits/sample
The information rate R of the source is given by
R = rH(X) = 10^4 × 8 b/s = 80 kb/s        Ans.
            (ii) Again, we know that the channel capacity is given by
$$C = B\log_2\left(1 + \frac{S}{N}\right) \text{ b/s}$$
Hence, with B = 10^4 Hz and S/N = 20 dB = 10^2,
C = 10^4 log2(1 + 10^2) = 66.6 × 10^3 b/s

DO YOU KNOW?
The output of a discrete information source is a message that consists of a sequence of symbols. The actual message emitted by the source during a message interval is selected at random from a set of possible messages.

Here, since R > C, error-free transmission is not possible.
(iii) The required S/N ratio can be found from
$$C = 10^4\log_2\left(1 + \frac{S}{N}\right) \ge 8 \times 10^4$$
or             log2(1 + S/N) ≥ 8
or             1 + S/N ≥ 2^8 = 256
or             S/N ≥ 255 (= 24.1 dB)               Ans.
Thus, the S/N ratio must be greater than or equal to 24.1 dB for error-free transmission.
(iv)       The required bandwidth B can be found from
$$C = B\log_2(1 + 100) \ge 8 \times 10^4$$
or
$$B \ge \frac{8 \times 10^4}{\log_2 101} = 12.02 \times 10^3 \text{ Hz} \approx 12 \text{ kHz}$$
Thus, the required bandwidth of the channel must be greater than or equal to 12 kHz.                      Ans.
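All four parts can be cross-checked with a short script (a sketch under the same assumptions as the worked solution; the names are ours):

```python
# End-to-end check of Example 9.36.
import math

fm = 4e3                        # signal bandwidth, Hz
r = 1.25 * 2 * fm               # 1.25 x Nyquist rate = 1e4 samples/s
H = math.log2(256)              # 256 equally likely levels -> 8 bits/sample
R = r * H                       # (i) source information rate = 80 kb/s

B = 10e3                        # channel bandwidth, Hz
snr = 10 ** (20.0 / 10)         # 20 dB -> S/N = 100
C = B * math.log2(1 + snr)      # (ii) about 66.6 kb/s < R: not error-free

snr_req = 2 ** (R / B) - 1      # (iii) S/N needed so that C >= R at B = 10 kHz
B_req = R / math.log2(1 + snr)  # (iv) bandwidth needed so that C >= R at 20 dB

print(f"R = {R / 1e3:.1f} kb/s, C = {C / 1e3:.1f} kb/s")
print(f"required S/N >= {snr_req:.0f} ({10 * math.log10(snr_req):.1f} dB)")
print(f"required B  >= {B_req / 1e3:.2f} kHz")
```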
