Appendix F: Probability and Random Variables

F.1 Random Variable

A random variable (rv) has a value that is unknown and unpredictable beforehand but is known completely once the underlying experiment occurs. A rv may be continuous or discrete. For example, the noise voltage generated by an electronic amplifier is random with a continuous amplitude. On the other hand, coin flipping, with outcomes head (H) and tail (T), is discrete. One does not know the outcome before flipping the coin. In each experiment, a value is assigned to each possible outcome. For example, the rv may be defined as X(s) = 1 for the outcome s = H and X(s) = −1 for s = T. Once the coin is flipped, the outcome (H or T), and hence X(s), is known. In the sequel we denote the rv simply by X rather than X(s).

A rv is characterized by its probability density function (pdf) or cumulative distribution function (cdf), which are interrelated. The cdf, FX(x), of a rv X is defined by

(F.1) FX(x) = P(X ≤ x)

which specifies the probability that the rv X is less than or equal to a real number x. The pdf of a rv X is defined as the derivative of the cdf:

(F.2) fX(x) = dFX(x)/dx

Conversely, the cdf is given by the integral of the pdf:

(F.3) FX(x) = ∫_{−∞}^{x} fX(u) du

Based on (F.1)-(F.3), a rv has the following properties:

  1. 0 ≤ FX(x) ≤ 1 since FX(−∞) = 0 and FX(∞) = 1.
  2. The cdf of a continuous rv X is a non‐decreasing and smooth function of x. Therefore, x2 ≥ x1 implies FX(x2) ≥ FX(x1).
  3. fX(x) ≥ 0 and the area under fX(x) is always equal to unity: ∫_{−∞}^{∞} fX(x) dx = 1.
  4. When a rv X is discrete or mixed, its cdf is still a non‐decreasing function of x but contains discontinuities.

Figure F.1 shows the pdf and cdf for coin flipping, based on the assumption that head and tail are equally likely, that is, P(H) = P(T) = 1/2. The pdf and the cdf may then be written as

(F.4) fX(x) = (1/2) δ(x + 1) + (1/2) δ(x − 1), FX(x) = (1/2) u(x + 1) + (1/2) u(x − 1)

where δ(x) and u(x) denote the Dirac delta function and the unit step function, respectively.

Figure F.1 Pdf and Cdf For Coin Flipping with P(H) = P(T) = 1/2 and X(s) = 1 For s = H and X(s) = −1 For s = T.
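The stepwise cdf of Figure F.1 can be checked empirically; the following minimal sketch simulates the coin-flip rv X(s) = +1 for heads and −1 for tails with P(H) = P(T) = 1/2 (the seed and sample size are arbitrary choices):

```python
import random

# Simulate the coin-flip rv of Figure F.1: X = +1 for heads, -1 for tails,
# with P(H) = P(T) = 1/2. Seed and sample size are arbitrary.
random.seed(1)
samples = [1 if random.random() < 0.5 else -1 for _ in range(100_000)]

def empirical_cdf(data, x):
    """F_X(x) = P(X <= x), estimated as the fraction of samples not exceeding x."""
    return sum(s <= x for s in data) / len(data)

print(empirical_cdf(samples, -2))   # 0.0: no outcome lies below -1
print(empirical_cdf(samples, 0))    # close to 0.5 (after the step at x = -1)
print(empirical_cdf(samples, 1))    # 1.0: all outcomes are <= +1
```

The empirical cdf reproduces the two steps of height 1/2 at x = −1 and x = +1.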

F.2 Statistical Averages of Random Variables

A rv is characterized by its moments. The n‐th moment of a rv X is defined as the expectation of X^n:

(F.5) E[X^n] = ∫_{−∞}^{∞} x^n fX(x) dx

The mean (expected) value mX of a rv X is given by its first moment:

(F.6) mX = E[X] = ∫_{−∞}^{∞} x fX(x) dx

In some applications we encounter functions of rv’s. For example, the mean value of Y = g(X) may be determined as follows:

(F.7) E[g(X)] = ∫_{−∞}^{∞} g(x) fX(x) dx

Central moments of X may be determined using Y = (X − mX)^n in (F.7):

(F.8) E[(X − mX)^n] = ∫_{−∞}^{∞} (x − mX)^n fX(x) dx

The variance of X is given by

(F.9) σX² = E[(X − mX)²] = E[X²] − mX²

When X is discrete or of a mixed type, the pdf contains impulses at the points of discontinuity of FX(x). In such cases, the discrete part of fX(x) may be expressed as

(F.10) fX(x) = Σ_{i=1}^{N} P(X = xi) δ(x − xi)

where the rv X is assumed to be discontinuous at N points, x1, x2,…, xN. For example, in the case of coin flipping, the two outcomes may be represented as x1 for head and x2 for tail. Then, P(X = x1) = p ≤ 1 denotes the probability of head, while P(X = x2) = 1 − p denotes the probability of tail. The mean value and the variance of a discrete rv are found by inserting (F.10) into (F.6) and (F.9), respectively:

(F.11) mX = Σ_{i=1}^{N} xi P(X = xi), σX² = Σ_{i=1}^{N} (xi − mX)² P(X = xi)

When the outcomes are equally likely, that is, P(X = xi) = 1/N, (F.11) simplifies to

(F.12) mX = (1/N) Σ_{i=1}^{N} xi, σX² = (1/N) Σ_{i=1}^{N} xi² − mX²
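As a minimal numerical sketch, (F.11) can be evaluated directly for the coin-flip rv of Figure F.1, with values ±1 and equal probabilities:

```python
# Direct evaluation of (F.11) for the coin-flip rv of Figure F.1:
# values +/-1, each with probability 1/2.
values = [1, -1]
probs = [0.5, 0.5]

mean = sum(x * p for x, p in zip(values, probs))                     # m_X
variance = sum((x - mean) ** 2 * p for x, p in zip(values, probs))   # sigma_X^2
print(mean, variance)   # 0.0 1.0
```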

The median is closely related to the mean value of a rv. In statistics and probability theory, the median is the value separating the higher half of a data sample from the lower half. The median of N samples can be found by arranging all the observations from lowest to highest and picking the middle one. If there is an even number of observations, there is no single middle value; the median is then usually defined as the mean of the two middle values. The median coincides with the mean value if the rv has a symmetric pdf.
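The sorting procedure just described can be sketched as follows (the sample lists are arbitrary illustrations):

```python
import statistics

# Median by sorting: pick the middle observation, or, for an even count,
# the mean of the two middle observations.
odd_sample = [7, 1, 3, 9, 5]
even_sample = [1, 3, 5, 7]
print(sorted(odd_sample)[len(odd_sample) // 2])   # 5 (middle of 1,3,5,7,9)
print(statistics.median(even_sample))             # 4.0 (mean of 3 and 5)
```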

F.2.1 Statistical Analysis of Multiple Random Variables

Now consider two rv’s X and Y, each of which may be continuous, discrete or mixed. The probability that (X, Y) takes values in the rectangle shown in Figure F.2 is given by

(F.13) P(x1 < X ≤ x2, y1 < Y ≤ y2) = ∫_{x1}^{x2} ∫_{y1}^{y2} fX,Y(x, y) dy dx

The joint cdf of the rv’s X and Y may be obtained by inserting x1 = y1 = −∞, x2 = x and y2 = y into (F.13):

(F.14) FX,Y(x, y) = P(X ≤ x, Y ≤ y)

where

(F.15)images

Figure F.2 The Region Defined By (F.13).

Joint pdf and joint cdf are related to each other as follows:

(F.16) fX,Y(x, y) = ∂²FX,Y(x, y)/∂x∂y, FX,Y(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} fX,Y(u, v) dv du

The marginal pdf’s are found as follows:

(F.17) fX(x) = ∫_{−∞}^{∞} fX,Y(x, y) dy, fY(y) = ∫_{−∞}^{∞} fX,Y(x, y) dx

Joint moments of two rv’s X and Y may be obtained from their joint pdf fX,Y(x, y):

(F.18) E[X^k Y^n] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x^k y^n fX,Y(x, y) dx dy

Joint central moments are defined as

(F.19) E[(X − mX)^k (Y − mY)^n] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − mX)^k (y − mY)^n fX,Y(x, y) dx dy

The covariance of X and Y is given by

(F.20) cov(X, Y) = E[(X − mX)(Y − mY)] = E[XY] − mX mY

The correlation coefficient between X and Y is defined by

(F.21) ρ = cov(X, Y)/(σX σY)

If two experiments are carried out such that the outcome of one has no effect on the outcome of the other, the outcomes are statistically independent of each other. The joint pdf’s (cdf’s) of statistically independent rv’s may then be written as the product of the pdf’s (cdf’s) corresponding to each outcome:

(F.22) fX,Y(x, y) = fX(x) fY(y), FX,Y(x, y) = FX(x) FY(y)

Two rv’s are said to be uncorrelated with each other if their correlation coefficient is identically equal to zero:

(F.23) ρ = 0, that is, E[XY] = E[X] E[Y]

Two rv’s X and Y are perfectly correlated with each other if Y = a X where a is a constant. This implies

(F.24) ρ = cov(X, aX)/(σX σaX) = a σX²/(|a| σX²) = ±1

Hence, the correlation coefficient ρ varies between −1 and +1. Note that when X and Y are statistically independent, they are also uncorrelated. The converse is not necessarily true: if X and Y are uncorrelated, (F.23) holds, but they are not necessarily statistically independent.

Two rv’s are said to be orthogonal when

(F.25) E[XY] = 0

Hence, X and Y are orthogonal when they are uncorrelated, and mX and/or mY is equal to zero.
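The behavior of the correlation coefficient ρ = cov(X, Y)/(σX σY) can be illustrated numerically; the sketch below (with arbitrary seed and sample size) shows that independently generated rv's are uncorrelated, while Y = aX with a > 0 yields ρ = +1:

```python
import random, math

# Sample correlation coefficient rho = cov(X,Y)/(sigma_X * sigma_Y).
# Seed and sample size are arbitrary.
random.seed(2)
n = 100_000
x = [random.gauss(0, 1) for _ in range(n)]
y_ind = [random.gauss(0, 1) for _ in range(n)]   # generated independently of x
y_dep = [2 * xi for xi in x]                     # perfectly correlated: Y = aX, a = 2

def rho(u, v):
    mu = sum(u) / len(u)
    mv = sum(v) / len(v)
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)
    su = math.sqrt(sum((a - mu) ** 2 for a in u) / len(u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v) / len(v))
    return cov / (su * sv)

print(rho(x, y_ind))   # close to 0: independent rv's are uncorrelated
print(rho(x, y_dep))   # close to +1: Y = aX with a > 0
```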

F.2.2 Conditional Probability

The conditional pdf fX|Y (x|y) of the rv X for a given deterministic value y of the rv Y has the following properties:

(F.26) fX|Y(x|y) = fX,Y(x, y)/fY(y) ≥ 0, ∫_{−∞}^{∞} fX|Y(x|y) dx = 1

The marginal pdf of X may be determined from its conditional pdf as follows:

(F.27) fX(x) = ∫_{−∞}^{∞} fX|Y(x|y) fY(y) dy

Generalization of the above to more than two rv’s is straightforward.

F.3 Moment Generating Function (MGF)

The MGF of a rv X, which is defined as

(F.34) MX(s) = E[e^{sX}] = ∫_{−∞}^{∞} e^{sx} fX(x) dx

reduces to the Fourier transform of fX(x) for s = jf. Therefore, the MGF may be used to exploit the advantages of the Fourier analysis.

For example, when the pdf of the sum Y = X1 + X2 + ⋯ + Xn of n independent rv’s is required, one first determines the MGF of the sum by multiplying the MGF’s of the individual rv’s, since the expectation in (F.34) factors for independent rv’s:

(F.35) MY(s) = E[e^{s(X1 + X2 + ⋯ + Xn)}] = ∏_{i=1}^{n} MXi(s) = [MX(s)]^n for i.i.d. rv’s

where i.i.d. stands for independent and identically distributed. The next step would be to determine the resulting pdf by taking the inverse of the MGF found in (F.35). However, this inversion may not always be easy, or even necessary, since some desired performance parameters, such as the bit error probability, can be obtained directly from the MGF.
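The factoring of the MGF for i.i.d. sums stated in (F.35) can be checked empirically. The sketch below uses exponential rv's with rate lam, whose MGF lam/(lam − s) is given in closed form by (F.105); lam, n, s and the trial count are illustrative choices:

```python
import random, math

# Empirical check of (F.35): for Y = X1 + ... + Xn with i.i.d. Xi,
# M_Y(s) = [M_X(s)]^n. Exponential Xi with rate lam are used here,
# for which M_X(s) = lam/(lam - s), s < lam. All parameters are illustrative.
random.seed(3)
lam, n, s, trials = 2.0, 3, 0.5, 100_000
mgf_single = lam / (lam - s)                      # closed-form MGF of one Xi
sums = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]
mgf_sum_emp = sum(math.exp(s * y) for y in sums) / trials   # estimate of E[exp(sY)]
print(mgf_sum_emp)        # close to mgf_single ** n
print(mgf_single ** n)    # about 2.37 for these parameters
```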

MGF of Y = g(X) may be obtained as

(F.36)images

The moments of a rv X can be directly obtained from its MGF:

(F.37) E[X^n] = d^n MX(s)/ds^n |_{s=0}

As an alternative to the MGF, the characteristic function of a rv X is defined by

(F.38) ψX(v) = E[e^{jvX}] = ∫_{−∞}^{∞} e^{jvx} fX(x) dx

One may easily observe that (F.34) and (F.38) are the same if s = jv.

F.4 Functions of Random Variables

In some applications, one may need to determine the pdf of a rv Y defined as a function of another rv X, that is, Y = g(X). If the mapping Y = g(X) from X to Y is one‐to‐one, then the determination of fY(y) is straightforward. However, when the mapping is not one‐to‐one, fY(y) is determined by using all roots of the function Y = g(X). The mean value of g(X) is given by (F.7).


Figure F.3 The Pdf’s at the Input and the Output of a Square‐Law Detector.


Figure F.4 Sum of Two Rv’s, Z = X + Y.


Figure F.5 The Pdf of the Rv X Defined By (F.50).

F.5 Multiple Functions of Multiple Rv’s

Let Z and W denote two functions of rv’s X and Y:

(F.54)images

If, for all z and w, z = g(x,y) and w = h(x,y), have a finite number of solutions {xi,yi}, and the determinant of the Jacobian matrix,

is nonzero, then the pdf fZ,W(z, w) is given by


Figure F.6 Transformation Between Rectangular And Polar Coordınates.


Figure F.7 Z = max(X,Y).


Figure F.8 Z = min(X,Y).

F.6 Ordered Statistics

Consider N rv’s Xk, k = 1,2,…, N, each with cdf F(x). These rv’s are ordered as follows:

(F.82)images

The cdf of X(k) is denoted by F(k)(x). The cdf of the largest rv is

(F.83) F(N)(x) = [F(x)]^N

since the largest of the N rv’s is less than or equal to x only if all of them are.

The cdf of the smallest rv is

(F.84) F(1)(x) = 1 − [1 − F(x)]^N

The cdf of the kth rv is

(F.85) F(k)(x) = Σ_{i=k}^{N} C(N, i) [F(x)]^i [1 − F(x)]^{N−i}

where the last expression is obtained by inserting a = 1 − b = F(x) into (D.6). The binomial coefficient

(F.86) C(n, k) = n!/(k!(n − k)!)

denotes the number of combinations of k items out of n. One may easily observe that (F.85) reduces to (F.83) for k = N and to (F.84) for k = 1. The pdf of the kth rv is found as follows:

(F.87)images

Since the last two terms cancel each other, the pdf of the kth rv reduces to

(F.88) f(k)(x) = k C(N, k) [F(x)]^{k−1} [1 − F(x)]^{N−k} f(x)

The pdf of the largest rv (k = N):

(F.89) f(N)(x) = N [F(x)]^{N−1} f(x)

The pdf of the smallest rv (k = 1):

(F.90) f(1)(x) = N [1 − F(x)]^{N−1} f(x)

The pdf’s f(N)(x) and f(1)(x) reduce to (F.72) and (F.75), respectively, for N = 2 when X and Y are i.i.d. Figure F.9 shows the pdf and the cdf of kth rv for N = 5.
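The binomial-sum form of (F.85) can be verified against a direct simulation. The sketch below uses the exponential cdf with unity mean of Figure F.9; N, k, x and the trial count are illustrative:

```python
import math, random

# Cdf of the k-th of N ordered i.i.d. rv's, per (F.85):
# F_(k)(x) = sum_{i=k}^{N} C(N,i) [F(x)]^i [1-F(x)]^(N-i),
# checked by direct simulation for an exponential cdf with unity mean.
def cdf_kth(k, N, F):
    return sum(math.comb(N, i) * F**i * (1 - F)**(N - i) for i in range(k, N + 1))

N, k, x = 5, 3, 1.0
F = 1 - math.exp(-x)                  # exponential cdf with unity mean, at x

random.seed(4)
trials = 100_000
hits = 0
for _ in range(trials):
    ordered = sorted(random.expovariate(1.0) for _ in range(N))
    hits += ordered[k - 1] <= x       # is the k-th smallest sample <= x ?

print(cdf_kth(k, N, F))    # analytical value, about 0.736
print(hits / trials)       # empirical estimate, should agree closely
```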


Figure F.9 Pdf and Cdf of the Ordered Rv’s for N = 5 and k = 1,3 and 5. Exponential Distribution with Unity Mean, images, Is Assumed.

F.7 Probability Distribution Functions

Uniform distribution:

A continuous rv X is said to be uniformly distributed in [a, b] if all values in [a, b] are equally likely:

(F.91) fX(x) = 1/(b − a), a ≤ x ≤ b, and zero elsewhere

The cdf is (see Figure F.10)

(F.92) FX(x) = 0 for x < a, (x − a)/(b − a) for a ≤ x ≤ b, and 1 for x > b

The mean and the variance of a uniformly distributed rv are given by

(F.93) mX = (a + b)/2, σX² = (b − a)²/12

The MGF is given by

(F.94) MX(s) = (e^{bs} − e^{as})/(s(b − a))

Figure F.10 Pdf and cdf of the uniform distribution.

Gaussian (normal) pdf

The pdf of the Gaussian (normal) distribution is defined by

(F.95) fX(x) = (1/(√(2π) σX)) exp(−(x − mX)²/(2σX²))

where mX and σX² denote, respectively, the mean (expected) value and the variance of the rv X. The cdf is found to be

(F.96) FX(x) = 1 − Q((x − mX)/σX)

The Gaussian Q(x) function

(F.97) Q(x) = (1/√(2π)) ∫_{x}^{∞} e^{−t²/2} dt

represents the area under the tail of a zero‐mean and unity variance Gaussian pdf (see (B.1)). Hence, it is a monotonically decreasing function of x.
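Although Q(x) has no closed form, it can be expressed through the complementary error function as Q(x) = (1/2) erfc(x/√2), which gives a simple numerical sketch:

```python
import math

# The Gaussian Q function of (F.97), via the identity Q(x) = 0.5*erfc(x/sqrt(2)).
def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2.0))

print(Q(0.0))           # 0.5: half the unit area lies in the tail at x = 0
print(Q(1.0))           # about 0.1587
print(Q(1.0) > Q(2.0))  # True: Q is monotonically decreasing
```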

The higher order moments of a Gaussian rv are given by

(F.98)images

where the binomial coefficient is defined by (F.86).

The MGF of X is

(F.99) MX(s) = exp(mX s + σX² s²/2)

Log‐normal distribution

In many areas of telecommunications, the signal levels are measured in dB. Consequently, the mean and the variance of signals are also expressed in dB. The log‐normal distribution is a Gaussian distribution for random signals expressed in dB. For example, in a shadowing environment, the received signal power level at a distance r from the transmitter may be written as (see (11.123))

(F.101) P(r) = Pm(r) + χ (dB)

where Pm(r) denotes the mean received power level in dB at distance r and χ in dB denotes the shadow fading with a zero dB mean and a standard deviation σ in dB. Then, the pdf of the received signal power at a given distance r has a normal distribution:

(F.102) f(P(r)) = (1/(√(2π) σ)) exp(−(P(r) − Pm(r))²/(2σ²))

As long as P(r), Pm(r) and σ are in dB, (F.102) obeys all the rules for a normal distribution given by (F.95)-(F.99). Note that the above formula is valid for a given value of the distance r, since (F.101) describes a random process in r (see Chapter 1, Section 1.3.2).
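The shadow-fading model of (F.101) can be sketched by drawing Gaussian samples in dB; the values Pm = −70 dB and σ = 8 dB below are illustrative assumptions, not taken from the text:

```python
import random, statistics

# Shadow-fading samples in dB per (F.101): P(r) = Pm(r) + chi, where chi is
# zero-mean Gaussian with standard deviation sigma (in dB).
# Pm_dB and sigma_dB are illustrative values.
random.seed(5)
Pm_dB, sigma_dB = -70.0, 8.0
P = [Pm_dB + random.gauss(0.0, sigma_dB) for _ in range(100_000)]
print(statistics.mean(P))    # close to -70 dB
print(statistics.stdev(P))   # close to 8 dB
```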

Exponential distribution

The pdf and the cdf of an exponentially distributed rv X with rate parameter λ > 0 are given by

(F.103) fX(x) = λ e^{−λx} u(x), FX(x) = (1 − e^{−λx}) u(x)

The mean and the standard deviation of X are

(F.104) mX = σX = 1/λ

Using (F.34), the MGF is found to be

(F.105) MX(s) = λ/(λ − s), s < λ

Chi‐square distribution

Central chi‐square distribution:

If X is a zero-mean Gaussian rv with variance σX², then the pdf of

(F.106) Y = X²

is given by (F.46) and is said to have a central chi‐square distribution:

(F.107) fY(y) = (1/(σX √(2πy))) e^{−y/(2σX²)}, y ≥ 0

The MGF corresponding to (F.107) is found from (F.34) as

(F.108) MY(s) = (1 − 2σX² s)^{−1/2}

The statistics of the power sum

(F.109) Y = Σ_{i=1}^{n} Xi²

of n i.i.d. Gaussian rv’s with zero mean and variance σX² are found using the MGF approach:

(F.110) MY(s) = (1 − 2σX² s)^{−n/2}

The pdf corresponding to (F.110) is called the central chi‐square pdf with n degrees of freedom and is determined using the Fourier transforms in Appendix C:

(F.111) fY(y) = y^{n/2−1} e^{−y/(2σX²)} / ((2σX²)^{n/2} Γ(n/2)), y ≥ 0

The Gamma function is defined by (D.108):

(F.112) Γ(z) = ∫_{0}^{∞} t^{z−1} e^{−t} dt, with Γ(n) = (n − 1)! for a positive integer n

The moments:

(F.113)images

The cdf corresponding to (F.111) is given by

(F.114) FY(y) = ∫_{0}^{y} fY(u) du

In wireless communications, one often encounters the sum of the powers of m complex rv’s with central chi‐square distribution. The power P of a complex rv, X = Xr + jXi, is given by P = Xr² + Xi², where Xr and Xi denote, respectively, the real and the imaginary parts of X. Therefore, the pdf and the cdf of the sum Z are obtained by inserting m = n/2 into (F.111) and analytical evaluation of (F.114) using (D.49):

(F.115) fZ(z) = z^{m−1} e^{−z/(2σX²)} / ((2σX²)^m Γ(m)), FZ(z) = γ(m, z/(2σX²))/Γ(m)

Here, γ(n, x) denotes the incomplete Gamma function (see (D.106) and (D.108)):

(F.116) γ(n, x) = ∫_{0}^{x} t^{n−1} e^{−t} dt

The pdf and the cdf given by (F.115) are shown in Figure F.11 for various values of n. Both shift towards higher values of z as n increases, implying an increased likelihood of observing higher values of z.
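The first two moments of the chi-square sum (F.109) follow directly from its construction: E[Y] = nσX² and var[Y] = 2nσX⁴. This is checked by simulation below for unit-variance Gaussians (n, seed and trial count are illustrative):

```python
import random, statistics

# Y = X1^2 + ... + Xn^2 for i.i.d. zero-mean, unit-variance Gaussians is
# central chi-square with n degrees of freedom: E[Y] = n, var[Y] = 2n.
# n and the trial count are illustrative.
random.seed(6)
n, trials = 4, 100_000
y = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n)) for _ in range(trials)]
print(statistics.mean(y))       # close to n = 4
print(statistics.variance(y))   # close to 2n = 8
```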


Figure F.11 Pdf and Cdf of the Chi‐Square Distribution (images).

Non‐central chi‐square distribution:

When the Gaussian rv X has a non‐zero mean and variance σX², Y = X² has a non‐central chi‐square distribution, with the pdf obtained by inserting a = 1 and b = 0 into (F.45):

(F.117)images

The MGF is found to be:

(F.118)images

The MGF of the power sum Y = Σ_{i=1}^{n} Xi², as defined by (F.109), of n i.i.d. Gaussian rv’s with non‐zero means mi and identical variances σX² is found from (F.118) as follows:

where χ denotes the non‐centrality parameter. The pdf of the non‐central chi‐square distribution with n degrees of freedom is found using (F.119): [2]

where Iα(x) is defined by (D.95). The moments are given by

(F.121)images

When m = n/2 is an integer, the cdf may be expressed in terms of the Marcum’s Q function as follows:

The Marcum’s Q function is defined by (B.15):

The pdf of the envelope R = √Z of the power sum can be obtained from (F.120) via the variable transformation z = r²:

The moments are given by

(F.125)images

where 1F1(a; b; z) denotes the confluent hypergeometric function defined by (D.88):

(F.126) 1F1(a; b; z) = Σ_{k=0}^{∞} ((a)k z^k)/((b)k k!), where (a)k = a(a + 1)⋯(a + k − 1)

The cdf of the envelope is determined from (F.122) using the variable transformation z = r²:

(F.127)images

If m = n/2 is an integer

Gamma distribution

The Gamma distribution is similar to the central chi‐square distribution and is characterized by the following pdf:

(F.129)images

where α does not have to be an integer. The cdf may be expressed in terms of the incomplete Gamma function (see (F.115)):

(F.130)images

where Γ(α) and γ(α, z) denote, respectively, the Gamma function and the incomplete Gamma function (see (F.112) and (F.116)). The mean and the variance of the Gamma distribution are given by

(F.131)images

The MGF is found from (F.34) to be

(F.132)images

If α is a positive integer, the Gamma distribution reduces to the Erlang distribution. For α = 1, it reduces to the exponential distribution. If α = m for m = 1, 2, … and images, the Gamma distribution reduces to the central chi‐square distribution (see (F.115)) [3].

Rayleigh distribution

The Rayleigh distribution is frequently used to characterize signals propagating through multipath fading channels. It is a special case of the central chi‐square pdf with two degrees of freedom, Z = X1² + X2², where X1 and X2 are zero‐mean, statistically independent Gaussian rv’s, each with variance σ². Z is often used to represent the power of a complex Gaussian rv, X1 + jX2. The pdf and the cdf of Z are found by inserting m = 1 into (F.115):

(F.133) fZ(z) = (1/(2σ²)) e^{−z/(2σ²)} u(z), FZ(z) = (1 − e^{−z/(2σ²)}) u(z)

We define a new rv R, which denotes the envelope of this complex Gaussian rv:

(F.134) R = √Z = √(X1² + X2²)

The envelope R is characterized by the Rayleigh pdf:

(F.135) fR(r) = (r/σ²) e^{−r²/(2σ²)}, r ≥ 0

The moments of the envelope are given by

(F.136) E[R^n] = (2σ²)^{n/2} Γ(1 + n/2)

The cdf of the envelope is found to be

(F.137) FR(r) = 1 − e^{−r²/(2σ²)}, r ≥ 0

The variation of the pdf and the cdf given by (F.135) and (F.137) are plotted in Figure F.12 for σ2 = 1.


Figure F.12 Pdf and Cdf of the Rayleigh Distribution For σ2 = 1.
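The Rayleigh cdf can be checked by generating the envelope of two i.i.d. zero-mean Gaussians directly; sigma, the test point r0 and the trial count below are illustrative:

```python
import math, random

# The envelope R = sqrt(X1^2 + X2^2) of two i.i.d. zero-mean Gaussians with
# variance sigma^2 is Rayleigh distributed: F_R(r) = 1 - exp(-r^2/(2 sigma^2)).
# sigma, r0 and the trial count are illustrative.
random.seed(7)
sigma, r0, trials = 1.0, 1.5, 100_000
env = [math.hypot(random.gauss(0, sigma), random.gauss(0, sigma))
       for _ in range(trials)]
emp_cdf = sum(r <= r0 for r in env) / trials
print(emp_cdf)                                    # empirical F_R(r0)
print(1.0 - math.exp(-r0**2 / (2 * sigma**2)))    # analytical, about 0.675
```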

Rice distribution

The Rician pdf characterizes the statistics of the envelope of a narrow‐band signal with non‐zero mean corrupted by additive narrow‐band Gaussian noise. The Rician distribution is a special case of the non‐central chi‐square pdf with n = 2 degrees of freedom, that is, Z = X1² + X2². If X1 and X2 denote statistically independent Gaussian rv’s with E[X1] = m1, E[X2] = m2 and var[X1] = var[X2] = σ², then the pdf of Z is a non‐central chi‐square pdf with two degrees of freedom (see (F.120) and (F.122)):

(F.140)images

The pdf of the envelope R is Rician and is given by (F.124) with n = 2:

(F.141)images

where In(x) denotes the modified Bessel function of the first kind of order n (see Figure D.2). Note that I0(0) = 1, In(0) = 0 for n > 0 and In(x) is a monotonically increasing function of x.

The moments of R are given by

(F.142)images

where the Rice factor K = (m1² + m2²)/(2σ²) denotes the ratio of the signal power m1² + m2² to the noise power 2σ². In a fading environment, the Rice factor is the ratio of the signal power received via the line‐of‐sight (LOS) path to the signal power received via diffuse scattering.

The cdf of R is obtained by inserting m = 1 into (F.128):

(F.143)images

where the Marcum‐Q function is defined by (F.123) and (B.15).

Nakagami‐m distribution

The Nakagami‐m distribution is a flexible pdf which is widely used to characterize signals propagating through multipath fading channels. The pdf and the cdf of the envelope of a signal with Nakagami‐m distribution are given by

(F.144) fR(r) = (2/Γ(m)) (m/P0)^m r^{2m−1} e^{−m r²/P0}, FR(r) = γ(m, m r²/P0)/Γ(m), r ≥ 0, P0 = E[R²]

where m ≥ 1/2 denotes the so‐called fading figure (parameter). Figure F.13 shows the variation of the pdf and the cdf of the Nakagami‐m distribution for various values of m. The Nakagami‐m distribution reduces to the one‐sided Gaussian distribution for m = 1/2 and to the Rayleigh distribution for m = 1. In the limiting case as m → ∞, (F.144) reduces to a delta function, hence representing a deterministic signal:

(F.145)images

Figure F.13 Pdf and Cdf of a Nakagami‐m Distributed Rv with P0 = 1 For Various Values of the Fading Figure m. Note that the curves for m = 1 correspond to Rayleigh fading.

The moments of the envelope are found to be

(F.146)images

The pdf of Y = R² is obtained by a variable transformation:

The cdf of Y is found using (D.49):

(F.148)images

Comparing with (F.115), one may observe that the power of a Nakagami‐m distributed rv has a central chi‐square distribution. The MGF is given by

(F.149)images

Figure F.14 The Pdf (F.151) and Cdf (F.152) For images.

Poisson distribution

Let the rv k denote the number of events during a time interval τ = t2 − t1, where t1 and t2 are arbitrary times with t2 ≥ t1. The events may represent telephone calls, e‐mails, accidents, earthquakes, departures/arrivals of airplanes, deposits/withdrawals from an account, births/deaths, arrivals of customers at a bank/supermarket, and so on. The events are assumed to be independent. The average number of events per unit time is given by the arrival rate λ in arrivals/s.

The probability of k events during a time interval τ is given by

(F.153) P(k, τ) = ((λτ)^k / k!) e^{−λτ}, k = 0, 1, 2, …

For example, the probability that no customers arrive at a bank during τ = 1 minute is given by P(0, 1) = e^{−0.1} ≈ 0.905 if one customer arrives on the average per 10 minutes, that is, λ = 0.1 arrivals/minute. Also note that Σ_{k=0}^{∞} P(k, τ) = 1.
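The worked example above can be reproduced directly from the Poisson probabilities:

```python
import math

# Poisson probabilities: P(k, tau) = (lam*tau)^k exp(-lam*tau)/k!.
# With lam = 0.1 arrivals/minute and tau = 1 minute, P(0) = exp(-0.1).
def poisson_prob(k, lam, tau):
    return (lam * tau) ** k * math.exp(-lam * tau) / math.factorial(k)

print(poisson_prob(0, 0.1, 1.0))                          # about 0.905
print(sum(poisson_prob(k, 0.1, 1.0) for k in range(50)))  # 1.0 to machine precision
```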

The pdf of the number of events during τ is given by

(F.154) fk(x) = Σ_{k=0}^{∞} P(k, τ) δ(x − k)

The corresponding cdf is found by integrating (F.154):

(F.155) Fk(x) = Σ_{k=0}^{∞} P(k, τ) u(x − k)

If the time intervals of any two events do not overlap, then the corresponding rv’s are independent. [3][4] The mean value and the variance of k in [0, t] are then given by

(F.156) E[k] = var[k] = λt

(F.156)images

Using (F.34), the MGF is found to be

(F.157) Mk(s) = exp(λτ(e^s − 1))

Figure F.15 Random Inter‐Arrival Time and the Random Number of Arrivals.


Figure F.16 Pdf of the Class A Distribution For images and A = 0.5, 1. Note that images corresponds to Gaussian pdf.

Binomial distribution

The probability that k out of n events occur with a probability p while the remaining n − k events occur with a probability 1 − p is given by

(F.170) P(k) = C(n, k) p^k (1 − p)^{n−k}, k = 0, 1, …, n

where 0 < p < 1. The binomial coefficient, defined by (F.86), denotes the number of combinations of k events within n. For example, in an experiment, we flip a coin n = 3 times, where p denotes the probability of head (H) and 1 − p the probability of tail (T). Heads can occur k = 0, 1, 2 or 3 times in n = 3 trials. For k = 2, one has C(3, 2) = 3 combinations of heads, namely, HHT, HTH and THH.
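The coin example above can be cross-checked by enumerating all 2³ outcomes:

```python
import math
from itertools import product

# Binomial probability of k = 2 heads in n = 3 fair flips: C(3,2) p^2 (1-p),
# verified by listing every length-3 sequence over {H, T}.
n, k, p = 3, 2, 0.5
analytic = math.comb(n, k) * p**k * (1 - p)**(n - k)
outcomes = [f for f in product("HT", repeat=n) if f.count("H") == k]
print(len(outcomes))   # 3: HHT, HTH, THH
print(analytic)        # 0.375
```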

The binomial pdf is defined by

(F.171) fk(x) = Σ_{k=0}^{n} P(k) δ(x − k)

Integrating (F.171), one gets the cdf of the binomial distribution:

(F.172) Fk(x) = Σ_{k=0}^{n} P(k) u(x − k)

where u(x) denotes the unit step function. The mean and the variance are given by [3]

(F.173) mk = np, σk² = np(1 − p)

The characteristic function is

(F.174) ψk(v) = E[e^{jvk}] = (1 − p + p e^{jv})^n

References

  [1] A. Papoulis and S. U. Pillai, Probability, Random Variables and Stochastic Processes (4th ed.), McGraw Hill: Boston, 2002.
  [2] J. G. Proakis, Digital Communications (3rd ed.), McGraw Hill: New York, 1995.
  [3] P. Z. Peebles, Jr., Probability, Random Variables, and Random Signal Principles (3rd ed.), McGraw Hill: New York, 1993.
  [4] P. Beckmann, Probability in Communication Engineering, Harcourt, Brace and World, Inc.: New York, 1967.
  [5] D. Middleton, Statistical‐Physical Models of Electromagnetic Interference, IEEE Trans. Electromagnetic Compatibility, vol. 19, no. 3, pp. 106–127, August 1977.
  [6] L. A. Berry, Understanding Middleton’s Canonical Formula for Class A Noise, IEEE Trans. Electromagnetic Compatibility, vol. 23, no. 4, pp. 337–344, November 1981.
  [7] G. Pay and M. Şafak, Performance of DMT systems in dispersive channels with correlated impulsive noise, VTC 2001 Spring, 6–9 May 2001, vol. 1, pp. 697–701.
  [8] Y. Matsumoto and K. Wiklundh, A simple expression for bit error probability of convolutional codes under class‐A interference, IEEE EMC Conf., Kyoto, 22P1‐1, 2009. www.ieice.org/proceedings/EMC09/pdf/22P1‐1.pdf.