
CUMULATIVE DISTRIBUTION FUNCTION (CDF), PROPERTIES, DISCRETE RANDOM VARIABLE

2.8 PROBABILITY FUNCTION OR PROBABILITY DISTRIBUTION OF A DISCRETE RANDOM VARIABLE
Let X be a discrete random variable, and let x_1, x_2, x_3, … be the values that X can take.
Then                                        P(X = x_j) = f(x_j)                     …(2.15)
where                                       j = 1, 2, 3, …
is the probability of x_j.
This f(x_j), or simply f(x), is called the probability function or probability distribution of the discrete random variable.
EXAMPLE 2.10. In an experiment, three coins are tossed simultaneously. If the number of heads is the random variable, find the probability function for this random variable.
Solution: When three coins are tossed, there are eight possible outcomes. This is the sample space S. Let the random variable, the number of heads, be X.
Since, in the given experiment, the random variable X, the number of heads, takes only finitely many values, this is a discrete random variable.
Writing the sample space S and the values of the discrete random variable X as
S = { HHH   HHT   HTH   THH   HTT   THT   TTH   TTT }
X = { x_1   x_2   x_3   x_4   x_5   x_6   x_7   x_8 }
  = {  3     2     2     2     1     1     1     0  }

DO YOU KNOW?
The notion of a random variable provides us with the power of abstraction and thus allows us to discard unimportant details in the outcome of an experiment.

Since the coins are fair, the probability of each of the 8 possible outcomes will be 1/8.
Now,                                       P(X = 0) = f(0) = 1/8
P(X = 1) = f(1) = 3/8
P(X = 2) = f(2) = 3/8
P(X = 3) = f(3) = 1/8
Therefore, the probability function for discrete random variable X is as given below:

X        0      1      2      3
f(x)    1/8    3/8    3/8    1/8
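As a minimal cross-check of this table, assuming only the Python standard library, the following sketch enumerates the eight equally likely outcomes and tabulates the probability function f(x):

from itertools import product
from collections import Counter
from fractions import Fraction

# The sample space: all 8 equally likely outcomes of tossing three coins.
outcomes = list(product("HT", repeat=3))

# The random variable X maps each outcome to its number of heads.
counts = Counter(outcome.count("H") for outcome in outcomes)

# Probability function f(x) = P(X = x); each outcome has probability 1/8.
f = {x: Fraction(n, len(outcomes)) for x, n in sorted(counts.items())}
for x, p in f.items():
    print(x, p)   # 0 1/8, then 1 3/8, 2 3/8, 3 1/8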

2.9 CUMULATIVE DISTRIBUTION FUNCTION (CDF)
          The Cumulative Distribution Function (CDF) of a random variable ‘X’ may be defined as the probability that the random variable ‘X’ takes a value less than or equal to x. Here, x is a dummy variable. In other words, the cumulative distribution function (CDF) provides a probabilistic description of a random variable.
Let us consider the event (X ≤ x). The probability of this event may be denoted as P(X ≤ x).
Now, according to the definition, the cumulative distribution function (CDF) may be written as
F_X(x) = P(X ≤ x)                                          …(2.16)
where F_X(x) is called the cumulative distribution function (CDF) of the random variable X.
From equation (2.16), it may be observed that F_X(x) is a function of the dummy variable ‘x’, i.e., the value taken by the random variable X.
NOTE :      It may also be noted that for any value of ‘x’, the CDF represents a probability. The CDF may be defined for discrete as well as continuous random variables. Because the cumulative distribution function (CDF) basically represents the probability of the random variable ‘X’ for the event (X ≤ x), it is also called the probability distribution function of the random variable, or simply the distribution function of the random variable. The CDF is sometimes also called the cumulative probability distribution function.
2.9.1. Properties of Cumulative Distribution Function (CDF)
          The properties of CDF may be listed as under:
Property 1: Since the cumulative distribution function (CDF) is a probability distribution function, i.e., it is defined as the probability of the event (X ≤ x), its value always lies between 0 and 1. This means that the CDF is bounded between 0 and 1. Mathematically,
0 ≤ F_X(x) ≤ 1                                              …(2.17)
Property 2:                 F_X(−∞) = 0                                             …(2.18)
and                         F_X(+∞) = 1                                             …(2.19)
In the first case, x = −∞ means no possible event. Due to this fact, P(X ≤ −∞) will always be zero. Therefore, F_X(−∞) = 0.
In the second case, x = +∞ gives P(X ≤ +∞). Since the event (X ≤ +∞) includes all possible events, and the probability of a certain event is ‘1’, therefore
F_X(+∞) = 1
Property 3: F_X(x_1) ≤ F_X(x_2)               if x_1 < x_2                                     …(2.20)
This property states that the CDF F_X(x) is a monotone non-decreasing function of x.
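As a minimal numerical illustration of these three properties, assuming only the Python standard library, the CDF of the three-coin random variable of Example 2.10 can be built from its probability function and checked directly:

from fractions import Fraction

# Probability function of Example 2.10: f(x) = P(X = x).
f = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def cdf(x):
    # F_X(x) = P(X <= x): accumulate the probability function up to x.
    return sum(p for value, p in f.items() if value <= x)

# Property 1: the CDF is bounded between 0 and 1.
assert all(0 <= cdf(x) <= 1 for x in range(-2, 6))
# Property 2: F_X(-infinity) = 0 and F_X(+infinity) = 1
# (checked here far outside the support of X).
assert cdf(-100) == 0 and cdf(100) == 1
# Property 3: the CDF is monotone non-decreasing.
assert all(cdf(x) <= cdf(x + 1) for x in range(-2, 5))

print([str(cdf(x)) for x in range(4)])   # ['1/8', '1/2', '7/8', '1']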
2.11 THE JOINT CUMULATIVE DISTRIBUTION FUNCTION
The Joint Distribution Function or joint CDF F_XY(x, y) of two random variables X and Y is defined as the probability that the random variable X is less than or equal to a specified value x and that the random variable Y is less than or equal to a specified value y.
The joint Cumulative Distribution Function may be defined systematically as follows:
The joint Cumulative Distribution Function F_XY(x, y) may be defined as the probability that the outcome of an experiment will result in a sample point lying inside the range (−∞ < X ≤ x, −∞ < Y ≤ y) of the joint sample space. The joint sample space is the combined sample space of X and Y. Mathematically,
F_XY(x, y) = P(X ≤ x, Y ≤ y)                                   …(2.28)
2.11.1. Properties of Joint Cumulative Distribution Function
          The Joint Cumulative Distribution Function has the following properties:
Property 1: The Joint Cumulative Distribution Function is a non-negative function. Mathematically,
F_XY(x, y) ≥ 0                                               …(2.29)
The joint Cumulative Distribution Function is basically defined as a probability on the joint sample space of the random variables. We know that a probability always lies between 0 and 1. Therefore, the joint Cumulative Distribution Function also lies between 0 and 1 and hence is non-negative.
Property 2: The Joint Cumulative Distribution Function is a monotone non-decreasing function of both x and y.
Property 3: The Joint Cumulative Distribution Function is always continuous everywhere in the xy-plane.
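A minimal Monte Carlo sketch of the joint CDF of equation (2.28), assuming NumPy and taking X and Y as independent Uniform(0, 1) variables purely for illustration, estimates F_XY(x, y) as the fraction of sample points falling in the region (X ≤ x, Y ≤ y):

import numpy as np

# Monte Carlo estimate of the joint CDF F_XY(x, y) = P(X <= x, Y <= y).
# For illustration only, X and Y are taken as independent Uniform(0, 1)
# variables, so inside the unit square the exact joint CDF is x * y.
rng = np.random.default_rng(0)
n = 100_000
X = rng.uniform(0.0, 1.0, size=n)
Y = rng.uniform(0.0, 1.0, size=n)

def joint_cdf(x, y):
    # Fraction of sample points lying in the region (X <= x, Y <= y).
    return np.mean((X <= x) & (Y <= y))

print(joint_cdf(0.5, 0.4))   # close to the exact value 0.5 * 0.4 = 0.2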
2.12 THE JOINT PROBABILITY DENSITY FUNCTION
The Joint Probability Density Function or simply Joint PDF is the PDF of two or more random variables.
The joint PDF of any two random variables X and Y may be defined as the partial derivative of the joint cumulative distribution function F_XY(x, y) with respect to the dummy variables x and y. Mathematically,
f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y                                   …(2.30)
Here, we take partial derivatives, since the differentiation is with respect to the two variables x and y.
2.12.1. Properties of Joint PDF
The Joint PDF has the following properties:
Property 1: The Joint PDF is non-negative. Mathematically,
f_XY(x, y) ≥ 0                                     …(2.31)
Because the joint PDF is the derivative of the joint CDF, which is a monotone non-decreasing function of both x and y, its value can never be negative.
Property 2: The total volume under the surface of the joint PDF is equal to unity. Mathematically,
∫_{−∞}^{+∞} ∫_{−∞}^{+∞} f_XY(x, y) dx dy = 1                                     …(2.32)
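To see Property 2 numerically, here is a minimal sketch, assuming NumPy and using the joint PDF of two independent standard Gaussian variables purely as an example; it approximates the double integral by a Riemann sum over a wide grid:

import numpy as np

# Numerical check that the total volume under a joint PDF is unity.
# For illustration only, f_XY is the joint PDF of two independent
# standard Gaussian variables: f_XY(x, y) = exp(-(x^2 + y^2)/2) / (2*pi).
def f_xy(x, y):
    return np.exp(-(x**2 + y**2) / 2.0) / (2.0 * np.pi)

# Riemann-sum approximation of the double integral over a wide grid.
x = np.linspace(-8.0, 8.0, 1601)
y = np.linspace(-8.0, 8.0, 1601)
dx = x[1] - x[0]
dy = y[1] - y[0]
X, Y = np.meshgrid(x, y)
print(f_xy(X, Y).sum() * dx * dy)   # approximately 1.0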
SUMMARY
■       There is another class of signals whose behaviour cannot be predicted. Such signals are called random signals.
■       These signals are called random signals because their precise value cannot be predicted in advance, before they actually occur. Examples of random signals are the noise interferences in communication systems.
■       An experiment is defined as the process which is conducted to get some results. If the same experiment is performed repeatedly under the same conditions, similar results are expected.
■       The outcomes of an experiment are called equally likely if no one of them can be expected to occur in preference to another. As an example, the tossing of a coin results in two outcomes, Head and Tail. Both have the same probability of 50%. Such outcomes are called equally likely outcomes.
■       Probability may be defined as the study of random experiments. In any random experiment, there is always an uncertainty as to whether a particular event will occur or not.
■       The concept of conditional probability is used in conditional occurrences of the events.
■       The conditional probability of event B, given that event A has already happened, is
P(B|A) = P(AB) / P(A)
where P(AB) is the joint probability of A and B.
Similarly, the conditional probability of event A, given that event B has already happened, is
P(A|B) = P(AB) / P(B)
The joint probability has the commutative property, which states that
P(AB) = P(BA)
■       If A and B are two events in an experiment, and the possibility of occurrence of event B does not depend upon the occurrence of event A, then these two events A and B are known as statistically independent events.
■       A function which can take on any value from the sample space, and whose range is some set of real numbers, is called a random variable of the experiment.
■       Random variables may be classified as under:

  1. Discrete random variables
  2. Continuous random variables.

■       A discrete random variable may be defined as a random variable which can take on only a finite number of values in a finite observation interval. This means that a discrete random variable has a countable number of distinct values.
■       A random variable that takes on an infinite number of values is called a continuous random variable. Actually, there are several physical systems (experiments) that generate continuous outputs or outcomes. Such systems generate an infinite number of outputs or outcomes within a finite period.
■       The Cumulative Distribution Function (CDF) of a random variable ‘X’ may be defined as the probability that the random variable ‘X’ takes a value less than or equal to x.
■       The derivative of the cumulative distribution function (CDF) with respect to some dummy variable is known as the Probability Density Function (PDF). The probability density function (PDF) is generally denoted by f_X(x). Mathematically, the PDF may be expressed as
f_X(x) = dF_X(x) / dx
where x is a dummy variable.
The probability density function (PDF) is the more convenient representation for continuous random variables.
■       The joint Cumulative Distribution Function may be defined systematically as follows:
The joint Cumulative Distribution Function F_XY(x, y) may be defined as the probability that the outcome of an experiment will result in a sample point lying inside the range (−∞ < X ≤ x, −∞ < Y ≤ y) of the joint sample space. The joint sample space is the combined sample space of X and Y. Mathematically,
F_XY(x, y) = P(X ≤ x, Y ≤ y)
■       The joint PDF of any two random variables X and Y may be defined as the partial derivative of the joint cumulative distribution function F_XY(x, y) with respect to the dummy variables x and y. Mathematically,
f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y
■       The relationship between probability and the PDF over a certain interval is expressed as
P(x_1 < X ≤ x_2) = ∫_{x_1}^{x_2} f_X(x) dx
■       When the probability density functions f_X(x) and f_Y(y) for a single random variable are obtained from the joint PDF, then f_X(x) and f_Y(y) are called marginal PDFs or simply marginal densities.
■       Out of the two random variables, one variable may take a fixed value. In this case, the PDF of the other variable is called a conditional PDF.
■       There are some other measures or numbers which give more useful and quick information about the random variable. Collectively these characteristic numbers or measures are known as statistical averages.
■       The mean or average of a random variable X is expressed by the summation of the values of X weighted by their probabilities. The mean value of a random variable is denoted by m. The mean value is also known as the expected value of the random variable X.
■       The Gaussian (normal) random variable has the probability density function
f_X(x) = (1 / √(2πσ²)) e^(−(x − m)² / (2σ²))
In this equation,    m = mean value of the random variable
σ² = variance of the random variable.
■       The Rayleigh Distribution is used for continuous random variables. It is produced from two Gaussian random variables. Let X and Y be independent Gaussian random variables having mean value zero and variance σ². Mathematically,
f_X(x) = (1 / √(2πσ²)) e^(−x² / (2σ²))
and                                          f_Y(y) = (1 / √(2πσ²)) e^(−y² / (2σ²))
The resulting variable R = √(X² + Y²) then has a Rayleigh distribution (a simulation sketch follows this summary).
■       The Rayleigh Distribution is widely used for modelling the statistics of signals transmitted through radio channels, such as cellular radio.
■       Let there be a random experiment E having outcome l from the sample space S. This means that l ∈ S. Thus, every time an experiment is conducted, the outcome l will be one of the sample points in the sample space. If this outcome l is associated with time, then a function of l and time t is formed, i.e., X(l, t). The function X(l, t) is known as a random process.
■       When the statistical averages are taken along time, they are known as time averages. As an example, we may define the time mean value of a sample function x(t) as
⟨x(t)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) dt
 
■       The autocorrelation function may be expressed using time averaging as
R_x(τ) = ⟨x(t) x(t + τ)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) x(t + τ) dt
■       A random process X(t) is called stationary if its statistics are not affected by any shift in the time origin.
■       A process may not be stationary in the strict sense, yet its mean and autocorrelation functions may still be independent of a shift of the time origin. Such a process is known as a wide-sense stationary (WSS) process.
■       A random process is known as an ergodic process if the time averages are equal to the ensemble averages. Hence, for an ergodic process, we have
⟨x(t)⟩ = E[X(t)]    and    ⟨x(t) x(t + τ)⟩ = R_X(τ)
■       It may be noted that a Gaussian process has two main advantages:
(i) Firstly, the Gaussian process has several properties which make analytic results possible.
(ii) Secondly, the random processes produced by physical phenomena are often such that a Gaussian model is appropriate.
■       A Gaussian process has the following important properties:
(i) If a Gaussian process X(t) is applied to a stable linear filter, then the random process Y(t) produced at the output of the filter is also Gaussian.
(ii) Consider the samples X(t_1), X(t_2), …, X(t_n) obtained by observing a random process X(t) at times t = t_1, t = t_2, …, t = t_n. If the process X(t) is Gaussian, then this set of samples is jointly Gaussian for every n.
■       Also, if a Gaussian Process is wide-sense stationary (WSS), then the process is also stationary in the strict sense.
■       The power spectral density of the sum of two uncorrelated WSS random processes is equal to the sum of their individual power spectral densities.
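As promised in the Rayleigh point above, here is a minimal simulation sketch, assuming NumPy (σ = 1 and the sample size are arbitrary choices): it generates R = √(X² + Y²) from two zero-mean Gaussian samples and compares the sample mean with the theoretical Rayleigh mean σ√(π/2).

import numpy as np

# Sketch of the Rayleigh summary point: R = sqrt(X^2 + Y^2) is Rayleigh
# distributed when X and Y are independent zero-mean Gaussian variables
# with equal variance. sigma = 1.0 is an arbitrary choice for this sketch.
rng = np.random.default_rng(42)
sigma = 1.0
n = 200_000
X = rng.normal(0.0, sigma, size=n)
Y = rng.normal(0.0, sigma, size=n)
R = np.sqrt(X**2 + Y**2)

# The theoretical mean of a Rayleigh variable is sigma * sqrt(pi / 2).
print(R.mean(), sigma * np.sqrt(np.pi / 2.0))   # both approximately 1.2533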
