
2. $F(1,3)=P(0,1)+P(0,2)+P(0,3)+P(1,1)+P(1,2)+P(1,3)=\frac{1}{24}+\frac{1}{12}+\frac{1}{12}+\frac{1}{12}+\frac{1}{6}+\frac{1}{6}=\frac{15}{24}=\frac{5}{8}$

Example 4.17: For the bivariate distribution given in the following table, find

1. P(X≤1)

2. P(Y≤3)

| X \ Y | 1 | 2 | 3 | 4 | 5 | 6 |
| --- | --- | --- | --- | --- | --- | --- |
| 0 | 0 | 0 | 1/32 | 2/32 | 2/32 | 3/32 |
| 1 | 1/16 | 1/16 | 1/8 | 1/8 | 1/8 | 1/8 |
| 2 | 1/32 | 1/32 | 1/64 | 1/64 | 0 | 2/64 |

Solution:

1. $P(X\le 1)=P(X=0)+P(X=1)=\left(0+0+\frac{1}{32}+\frac{2}{32}+\frac{2}{32}+\frac{3}{32}\right)+\left(\frac{1}{16}+\frac{1}{16}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}\right)=\frac{8}{32}+\frac{10}{16}=\frac{1}{4}+\frac{5}{8}=\frac{7}{8}$

2. $P(Y\le 3)=P(Y=1)+P(Y=2)+P(Y=3)=\left[0+\frac{1}{16}+\frac{1}{32}\right]+\left[0+\frac{1}{16}+\frac{1}{32}\right]+\left[\frac{1}{32}+\frac{1}{8}+\frac{1}{64}\right]=\frac{3}{32}+\frac{3}{32}+\frac{11}{64}=\frac{6+6+11}{64}=\frac{23}{64}$
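Both sums can be checked mechanically. A minimal Python sketch using exact fractions (the table layout is the one given above):

```python
from fractions import Fraction as F

# Joint table of Example 4.17: rows are x = 0, 1, 2; columns are y = 1..6.
p = [
    [F(0), F(0), F(1, 32), F(2, 32), F(2, 32), F(3, 32)],      # x = 0
    [F(1, 16), F(1, 16), F(1, 8), F(1, 8), F(1, 8), F(1, 8)],  # x = 1
    [F(1, 32), F(1, 32), F(1, 64), F(1, 64), F(0), F(2, 64)],  # x = 2
]

# P(X <= 1): sum the rows for x = 0 and x = 1.
print(sum(p[0]) + sum(p[1]))                        # 7/8
# P(Y <= 3): sum the columns for y = 1, 2, 3.
print(sum(row[j] for row in p for j in range(3)))   # 23/64
```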

4.10 Conditional Probability Distribution

If $P(X=x_i,\ Y=y_j)$ is the joint probability function of the discrete random variables X and Y, then the conditional probability function of X given $Y=y_j$ is defined as

$$P(X=x_i\mid Y=y_j)=\frac{P(X=x_i,\ Y=y_j)}{P(Y=y_j)}$$

And the conditional probability function of Y given $X=x_i$ is defined as

$$P(Y=y_j\mid X=x_i)=\frac{P(X=x_i,\ Y=y_j)}{P(X=x_i)}$$

The conditional probability function of X given Y, and the conditional probability function of Y given X are also denoted by f(x/y) and f(y/x), respectively.

The conditional probability distributions are univariate probability distributions.

4.11 Independent Random Variables

If P(X=x, Y=y)=f(x, y) is the joint pdf of the discrete random variables X and Y, with marginal pdfs $f_1(x)$ and $f_2(y)$, and $f(x,y)=f_1(x)\,f_2(y)$ for all x and y, then X and Y are said to be independent random variables.

Example 4.18: The two-dimensional random variable (X, Y) has the joint density function

$$f(x,y)=\frac{2x+y}{27},\quad x=0,1,2;\ y=0,1,2$$

Find the conditional distribution of Y given X=x. Also find the conditional distribution of X given Y=1.

Solution: The joint probability distribution of X and Y in the tabular form is:

| X \ Y | 0 | 1 | 2 |
| --- | --- | --- | --- |
| 0 | 0 | 1/27 | 2/27 |
| 1 | 2/27 | 3/27 | 4/27 |
| 2 | 4/27 | 5/27 | 6/27 |

We have $P(X=x)=f(x)=\sum_y f(x,y)$, the row total of the table.

The conditional probability distribution of Y given X=x is $f(y/x)=\dfrac{f(x,y)}{f(x)}$, where f(x, y) is the joint probability distribution of X and Y. Hence we get

$$f(y=0/x=0)=\frac{f(0,0)}{f(x=0)}=0,\qquad f(y=1/x=0)=\frac{f(0,1)}{f(x=0)}=\frac{1/27}{3/27}=\frac{1}{3},\qquad f(y=2/x=0)=\frac{f(0,2)}{f(x=0)}=\frac{2/27}{3/27}=\frac{2}{3}$$

$$\left(\text{since }f(x=0)=0+\frac{1}{27}+\frac{2}{27}=\frac{3}{27}\right)$$

We have

$$f(x=1)=\frac{2}{27}+\frac{3}{27}+\frac{4}{27}=\frac{9}{27}$$

Hence

$$f(y=0/x=1)=\frac{f(1,0)}{f(x=1)}=\frac{2/27}{9/27}=\frac{2}{9},\qquad f(y=1/x=1)=\frac{f(1,1)}{f(x=1)}=\frac{3/27}{9/27}=\frac{3}{9},\qquad f(y=2/x=1)=\frac{f(1,2)}{f(x=1)}=\frac{4/27}{9/27}=\frac{4}{9}$$

Since

$$f(x=2)=\frac{4}{27}+\frac{5}{27}+\frac{6}{27}=\frac{15}{27}$$

We get

$$f(y=0/x=2)=\frac{f(2,0)}{f(x=2)}=\frac{4/27}{15/27}=\frac{4}{15},\qquad f(y=1/x=2)=\frac{f(2,1)}{f(x=2)}=\frac{5/27}{15/27}=\frac{5}{15},\qquad f(y=2/x=2)=\frac{f(2,2)}{f(x=2)}=\frac{6/27}{15/27}=\frac{6}{15}$$

Therefore the conditional distribution of Y given X=x is

| X \ Y | 0 | 1 | 2 |
| --- | --- | --- | --- |
| 0 | 0 | 1/3 | 2/3 |
| 1 | 2/9 | 3/9 | 4/9 |
| 2 | 4/15 | 5/15 | 6/15 |

From the table, using $f(x/y=1)=\dfrac{f(x,1)}{f(y=1)}$ with $f(y=1)=\frac{1}{27}+\frac{3}{27}+\frac{5}{27}=\frac{9}{27}$, the conditional distribution of X given Y=1 is

| x | f(x/y = 1) |
| --- | --- |
| 0 | 1/9 |
| 1 | 3/9 |
| 2 | 5/9 |
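The marginals and conditionals of this example can be replayed in a few lines. A minimal Python sketch with exact fractions, using the density $(2x+y)/27$ from the example (note that Fraction reduces, e.g., 3/9 to 1/3 and 6/15 to 2/5):

```python
from fractions import Fraction as F

# Joint distribution of Example 4.18: p[x][y] = (2x + y)/27, x, y = 0, 1, 2.
p = [[F(2 * x + y, 27) for y in range(3)] for x in range(3)]

def f_x(x):
    """Marginal f(x): sum of the row for X = x."""
    return sum(p[x])

def f_y_given_x(y, x):
    """Conditional f(y | x) = f(x, y) / f(x)."""
    return p[x][y] / f_x(x)

for x in range(3):
    print(x, [f_y_given_x(y, x) for y in range(3)])
# 0 [0, 1/3, 2/3]
# 1 [2/9, 1/3, 4/9]
# 2 [4/15, 1/3, 2/5]

# Conditional distribution of X given Y = 1.
f_y1 = sum(p[x][1] for x in range(3))       # 9/27 = 1/3
print([p[x][1] / f_y1 for x in range(3)])   # [1/9, 1/3, 5/9]
```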

4.12 Joint Probability Function of Continuous Random Variables

Definition 4.14

If X and Y are two continuous random variables, then the function f(x, y) given by

$$P(a_1\le X\le b_1,\ a_2\le Y\le b_2)=\int_{a_1}^{b_1}\int_{a_2}^{b_2}f(x,y)\,dy\,dx$$

is called the joint probability function of X and Y, if and only if it satisfies the following conditions:

1. $f(x,y)\ge 0$ for $-\infty<x<\infty$, $-\infty<y<\infty$

2. $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}f(x,y)\,dy\,dx=1$

The joint probability function of X and Y is also called the joint pdf of the continuous random variables X and Y.

4.13 Joint Probability Distribution Function of Continuous Random Variables

Definition 4.15

If X and Y are continuous random variables, the function given by

$$F(x,y)=P(X\le x,\ Y\le y)=\int_{-\infty}^{y}\int_{-\infty}^{x}f(s,t)\,ds\,dt\quad\text{for}\ -\infty<x<\infty,\ -\infty<y<\infty$$

where f(s, t) is the value of the joint pdf of X and Y at (s, t), is called the joint probability distribution function or joint distribution function of X and Y.

We have $f(x,y)=\dfrac{\partial^2}{\partial x\,\partial y}F(x,y)$

Hence the joint distribution function is obtained by integrating the joint probability function, i.e.,

$$F(x,y)=\int_{-\infty}^{y}\int_{-\infty}^{x}f(t_1,t_2)\,dt_1\,dt_2$$

4.14 Marginal Distribution Function

Let X and Y be continuous random variables. If f(x, y) is the joint probability function of the random variables X and Y, then the marginal distribution function of X, denoted by F(x), is given by

$$F(x)=\int_{-\infty}^{x}\left(\int_{-\infty}^{\infty}f(x,y)\,dy\right)dx$$

And the marginal distribution function of Y, denoted by F(y), is given by

$$F(y)=\int_{-\infty}^{y}\left(\int_{-\infty}^{\infty}f(x,y)\,dx\right)dy$$

where f(x, y) is the joint probability function of X and Y.

4.14.1 Marginal Density Functions

Let X and Y be continuous random variables and f(x, y) be the value of their joint probability density at (x, y). Then the function

$$f_1(x)=\int_{-\infty}^{\infty}f(x,y)\,dy\quad\text{for}\ -\infty<x<\infty$$

is called the marginal density function of X. Correspondingly, the function given by

$$f_2(y)=\int_{-\infty}^{\infty}f(x,y)\,dx\quad\text{for}\ -\infty<y<\infty$$

is called the marginal density of Y.

Example 4.19: The joint pdf of the continuous random variables X and Y is given by

$$f(x,y)=\begin{cases}8xy, & 0\le y\le x\le 1\\ 0, & \text{otherwise}\end{cases}$$

Find the marginal pdfs f(x) and f(y).

Solution: We have

$$f(x)=\int_{-\infty}^{\infty}f(x,y)\,dy\ \ (\text{by definition})=\int_{0}^{x}8xy\,dy=8x\left[\frac{y^2}{2}\right]_0^x=4x^3,\quad 0\le x\le 1$$

and

$$f(y)=\int_{-\infty}^{\infty}f(x,y)\,dx=\int_{y}^{1}8xy\,dx=8y\left[\frac{x^2}{2}\right]_y^1=4y(1-y^2),\quad 0\le y\le 1$$

(Since the support is the triangle $0\le y\le x\le 1$, x ranges from y to 1 in the second integral.)
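The two marginals can be verified symbolically. A minimal sketch, assuming SymPy is available:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 8 * x * y   # joint pdf on the triangle 0 <= y <= x <= 1

# Marginal of X: integrate out y over 0 <= y <= x.
f_x = sp.integrate(f, (y, 0, x))       # 4*x**3
# Marginal of Y: integrate out x over y <= x <= 1.
f_y = sp.integrate(f, (x, y, 1))       # 4*y - 4*y**3 = 4*y*(1 - y**2)

# Each marginal should integrate to 1 over its own range.
print(f_x, sp.integrate(f_x, (x, 0, 1)))   # 4*x**3  1
print(f_y, sp.integrate(f_y, (y, 0, 1)))   # -4*y**3 + 4*y  1
```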

4.15 Conditional Probability Density Functions

Definition 4.16

If X and Y are continuous random variables and f(x, y) is the value of the joint density function of X and Y at (x, y) then the conditional density of Y given X=x is defined as

$$f(y/x)=\frac{\partial}{\partial y}F(y/x)=\frac{f(x,y)}{f(x)},\quad f(x)\ne 0,\ \text{for}\ -\infty<x<\infty$$

And the conditional density of X given Y=y is defined as

$$f(x/y)=\frac{f(x,y)}{f(y)},\quad f(y)\ne 0$$

Example 4.20: The joint density function of the continuous random variables X and Y is given by f(x, y)=2, 0<x<y<1. Find the marginal and conditional pdfs.

Solution: The marginal density function of X is

$$f(x)=\int_{-\infty}^{\infty}f(x,y)\,dy=\int_{x}^{1}2\,dy=[2y]_x^1=2(1-x),\quad 0<x<1$$

The marginal density function of Y is

$$f(y)=\int_{-\infty}^{\infty}f(x,y)\,dx=\int_{0}^{y}2\,dx=[2x]_0^y=2y,\quad 0<y<1$$

The conditional pdf of Y given x is

$$f(y/x)=\frac{f(x,y)}{f(x)}=\frac{2}{2(1-x)}=\frac{1}{1-x},\quad x<y<1$$

The conditional pdf of X given y is

$$f(x/y)=\frac{f(x,y)}{f(y)}=\frac{2}{2y}=\frac{1}{y},\quad 0<x<y$$
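These results can be checked symbolically as well. A minimal sketch, assuming SymPy is available:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = sp.Integer(2)   # joint pdf on the triangle 0 < x < y < 1

f_x = sp.integrate(f, (y, x, 1))   # 2 - 2*x, i.e., 2*(1 - x)
f_y = sp.integrate(f, (x, 0, y))   # 2*y

# Conditionals on the support: f(y|x) for x < y < 1, f(x|y) for 0 < x < y.
print(sp.simplify(f / f_x))   # 1/(1 - x)  (SymPy may print it as -1/(x - 1))
print(sp.simplify(f / f_y))   # 1/y, constant in x
```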

Remarks

The conditional distribution of a random variable entering into a system is its distribution calculated under the condition that the other random variable has assumed a definite value.

The random variables X and Y are said to be mutually independent if the conditional probability distribution of one does not depend on the value of the other, i.e.,

$$f(x/y)=f(x)\quad\text{or}\quad f(y/x)=f(y)$$

The above mentioned properties of one-dimensional (univariate) and two-dimensional (bivariate) random variables can also be established for multidimensional random variables.
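For a discrete joint table, the independence criterion $f(x,y)=f_1(x)\,f_2(y)$ is a straightforward cell-by-cell check. A minimal Python sketch (the table used for illustration is the one from Exercise 16 below):

```python
from fractions import Fraction as F

def is_independent(p):
    """Check f(x, y) == f1(x) * f2(y) for every cell of a joint table."""
    fx = [sum(row) for row in p]          # row marginals f1(x)
    fy = [sum(col) for col in zip(*p)]    # column marginals f2(y)
    return all(p[i][j] == fx[i] * fy[j]
               for i in range(len(p)) for j in range(len(p[0])))

# Exercise 16: rows X = 1, 2; columns Y = 2, 3, 4.
p = [[F(6, 100), F(15, 100), F(9, 100)],
     [F(14, 100), F(35, 100), F(21, 100)]]
print(is_independent(p))   # True, e.g., 0.06 = 0.3 * 0.2
```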

Exercise 4.3

1. Find the constant k so that the function

$$f(x)=\begin{cases}\dfrac{1}{k}, & a\le x\le b\\[4pt] 0, & \text{otherwise}\end{cases}$$

is a density function. Also find the cumulative distribution function (c.d.f.) of the random variable X.

Ans: $k=b-a$; $F(x)=\dfrac{x-a}{b-a},\ a\le x\le b$

2. The density function of a random variable X is

$$f(x)=\begin{cases}x, & 0<x<1\\ 2-x, & 1\le x\le 2\\ 0, & x\ge 2\end{cases}$$

Compute the cumulative distribution function of X.

Ans: $F(x)=\dfrac{x^2}{2},\ 0<x<1;\quad =2x-\dfrac{x^2}{2}-1,\ 1\le x\le 2;\quad =1,\ x\ge 2$

3. A continuous random variable X that can assume any value between x=2 and x=5 has a density function given by f(x)=k(1+x). Find k and P(X<4).

Ans: $k=\dfrac{2}{27}$; $P(X<4)=\dfrac{16}{27}$

4. Suppose that the duration in minutes of a long-distance telephone conversation follows the exponential density function

$$f(x)=\frac{1}{5}e^{-x/5},\quad\text{for}\ x>0$$

Find the probability that the duration of a conversation

a. will exceed 5 minutes,

b. will be less than 3 minutes.

Ans: (a) $e^{-1}$ (b) $1-e^{-3/5}$

5. If X is a continuous random variable with density

$$f(x)=\begin{cases}kx, & 0\le x\le 5\\ 0, & \text{otherwise}\end{cases}$$

Find

a. k

b. P(1<X<3)

c. P(2≤X≤4)

d. P(X≤3)

Ans: (a) $k=\dfrac{2}{25}$; (b) $\dfrac{8}{25}$; (c) $\dfrac{12}{25}$; (d) $\dfrac{9}{25}$

6. Let X be a continuous random variable with pdf

$$f(x)=\begin{cases}ax, & 0\le x\le 1\\ a, & 1\le x\le 2\\ -ax+3a, & 2\le x\le 3\\ 0, & \text{otherwise}\end{cases}$$

a. Determine the constant a.

b. Compute P(X<1.5).

Ans: (a) $a=\dfrac{1}{2}$; (b) $\dfrac{1}{2}$

7. The distribution function of a random variable X is given by

$$F(x)=\begin{cases}1-(1+x)e^{-x}, & x\ge 0\\ 0, & x<0\end{cases}$$

Find the corresponding density function of X.

Ans: $f(x)=xe^{-x},\ x\ge 0;\ =0,\ x<0$

8. Two random variables X and Y have the probability function

$$f(x,y)=\begin{cases}Ae^{-(2x+y)}, & x,y\ge 0\\ 0, & \text{otherwise}\end{cases}$$

a. Evaluate A

b. Find the marginal pdfs.

Ans: (a) A=2; (b) $f(x)=2e^{-2x},\ x\ge 0;\ =0$ otherwise; $f(y)=e^{-y},\ y\ge 0;\ =0,\ y<0$

9. If $f(x,y)=2-x-y,\ 0\le x\le 1,\ 0\le y\le 1;\ =0$ elsewhere,
find

a. the marginal probability functions,

b. the conditional probability functions.

Ans: (a) $f(x)=\dfrac{3}{2}-x$; $f(y)=\dfrac{3}{2}-y$

   (b) $f(y/x)=\dfrac{2-x-y}{\frac{3}{2}-x}$, $f(x/y)=\dfrac{2-x-y}{\frac{3}{2}-y}$

10. Let X and Y have joint density function

$$f(x,y)=\begin{cases}e^{-(x+y)}, & x>0,\ y>0\\ 0, & \text{otherwise}\end{cases}$$

Find $P(0<X<1\mid Y=2)$.

Ans: $\dfrac{e-1}{e}$

11. The joint density function of a bivariate distribution is given as

$$f(x,y)=\begin{cases}x+y, & 0\le x\le 1,\ 0\le y\le 1\\ 0, & \text{elsewhere}\end{cases}$$

Determine the marginal distributions of X and Y.

Ans: $x+\dfrac{1}{2},\ 0\le x\le 1$; $y+\dfrac{1}{2},\ 0\le y\le 1$

12. Let X and Y be jointly distributed with pdf

$$f(x,y)=\begin{cases}\dfrac{1}{4}(1+xy), & |x|<1,\ |y|<1\\[4pt] 0, & \text{elsewhere}\end{cases}$$


Examine whether X and Y are independent.

13. The joint probability function of two random variables X and Y is

$$f(x,y)=\begin{cases}c(1+xy), & 0\le x\le 6,\ 0\le y\le 5\\ 0, & \text{elsewhere}\end{cases}$$


Find

a. c

b. F(0.1, 0.5)

c. f(x, 3)

Ans: (a) $\dfrac{1}{255}$; (b) 0.0006102; (c) $f(x,3)=\dfrac{1}{255}(1+3x),\ 0\le x\le 6;\ =0$ otherwise

14. Let X and Y be two continuous random variables with joint pdf

$$f(x,y)=\begin{cases}c(x-y), & 0<x<2,\ -x<y<x\\ 0, & \text{elsewhere}\end{cases}$$

a. Evaluate c

b. find f(x)

c. find f(y/x)

Ans: (a) $c=\dfrac{3}{16}$; (b) $f(x)=\dfrac{3x^2}{8},\ 0<x<2$; (c) $f(y/x)=\dfrac{x-y}{2x^2},\ 0<x<2,\ -x<y<x$

15. Joint distribution of X and Y is given by

$$f(x,y)=4xye^{-(x^2+y^2)},\quad x\ge 0,\ y\ge 0$$

Show that X and Y are independent. Find the conditional density of X given Y=y.

Ans: $2xe^{-x^2}$

16. The joint distribution of two random variables X and Y is given by the following table:

| X \ Y | 2 | 3 | 4 |
| --- | --- | --- | --- |
| 1 | 0.06 | 0.15 | 0.09 |
| 2 | 0.14 | 0.35 | 0.21 |


Determine the individual distributions of X and Y. Also verify that X and Y are stochastically independent.

Ans:

| X = x | 1 | 2 |
| --- | --- | --- |
| f(x) | 0.3 | 0.7 |

| Y = y | 2 | 3 | 4 |
| --- | --- | --- | --- |
| f(y) | 0.2 | 0.5 | 0.3 |

17. Determine the value of k for which the function given by

$$f(x,y)=kxy,\quad\text{for}\ x=1,2,3;\ y=1,2,3$$

can serve as a joint probability distribution.

Ans: $k=\dfrac{1}{36}$

18. Given the joint pdf

$$f(x,y)=\begin{cases}\dfrac{3}{5}x(y+x), & 0\le x\le 1,\ 0\le y\le 2\\[4pt] 0, & \text{elsewhere}\end{cases}$$

of two random variables X and Y. Find P{(X, Y)∈A} where A is the region

$$\left\{(x,y):0<x<\tfrac{1}{2},\ 1<y<2\right\}$$

Ans: $\dfrac{11}{80}$

19. Find the joint probability density of the two random variables X and Y whose joint distribution function is given by

$$F(x,y)=\begin{cases}(1-e^{-x})(1-e^{-y}), & x>0,\ y>0\\ 0, & \text{otherwise}\end{cases}$$

Also use the joint probability density to determine P(1<X<3, 1<Y<2).

Ans: $f(x,y)=e^{-(x+y)},\ x>0,\ y>0;\ =0$ elsewhere; $P(1<X<3,\ 1<Y<2)=(e^{-1}-e^{-3})(e^{-1}-e^{-2})$

4.16 Mathematical Expectation and Moments

In this section we introduce special constants, which serve to give a quantitative description of random variables. The constants of particular importance are mathematical expectation, variance, and moments of various orders. We begin our study by defining mathematical expectation, which is the central value of a variable distribution.

Mathematical Expectation: We shall first give a definition of mathematical expectation for discrete random variable.

Definition 4.17

Let X be discrete random variable. Let x1, x2, …, xn denote possible values of X and let p1, p2, …, pn denote the corresponding probabilities. Then the mathematical expectation denoted by E(X) is defined as

$$E(X)=\sum_{i=1}^{n}x_ip_i$$

If X is a continuous random variable having f(x) as its pdf, then the mathematical expectation of X is defined as

$$E(X)=\int_{-\infty}^{\infty}xf(x)\,dx$$

Mathematical expectation is also called the expectation or mean of the probability distribution.

Remarks

The mathematical expectation of a random variable can be found from its probability distribution alone, without performing the underlying experiment.

Mathematical expectation does not exist for all random variables.

Expectation is also denoted by μ (or by $\bar{x}$) and is given by $E(X)=\sum_x xP(X=x)$ in the discrete case, or by $E(X)=\int xf(x)\,dx$ in the continuous case, where f(x) is the pdf of X.

Example: Consider an experiment in which a die is thrown; here S={1, 2, 3, 4, 5, 6}. Let X be the number of points obtained in a single throw. Then X takes each of the values 1, 2, 3, 4, 5, 6 with probability $\frac{1}{6}$.

The expected value of X is

$$E(X)=1\cdot\frac{1}{6}+2\cdot\frac{1}{6}+3\cdot\frac{1}{6}+4\cdot\frac{1}{6}+5\cdot\frac{1}{6}+6\cdot\frac{1}{6}=\frac{21}{6}=\frac{7}{2}$$
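The defining sum is a one-line computation. A minimal Python sketch using exact fractions:

```python
from fractions import Fraction as F

def expectation(values, probs):
    """E(X) = sum of x_i * p_i for a discrete random variable."""
    assert sum(probs) == 1, "probabilities must sum to 1"
    return sum(x * p for x, p in zip(values, probs))

# A fair die: each face 1..6 has probability 1/6.
print(expectation(range(1, 7), [F(1, 6)] * 6))   # 7/2
```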

4.16.1 Properties of Mathematical Expectation

Theorem 1

If X is a random variable and k is a real number, then

1. E(k)=k

2. $E(kX)=kE(X)$

3. $E(X+k)=E(X)+k$

Proof

1. If X is a discrete random variable, then

$$E(k)=\sum_{i=1}^{n}kp_i=k\sum_{i=1}^{n}p_i=k\cdot 1=k$$

If X is a continuous random variable, then

$$E(k)=\int_{-\infty}^{\infty}kf(x)\,dx=k\int_{-\infty}^{\infty}f(x)\,dx=k\cdot 1=k$$

Thus E(k)=k

2. Let X be a discrete random variable. Then

$$E(kX)=\sum_{i=1}^{n}kx_ip_i=k\sum_{i=1}^{n}x_ip_i=kE(X)$$

If X is a continuous random variable, then

$$E(kX)=\int_{-\infty}^{\infty}kxf(x)\,dx=k\int_{-\infty}^{\infty}xf(x)\,dx=kE(X)$$

Remark

In particular, taking k=−1, we have

$$E(-X)=-E(X)$$

3. If X is a discrete random variable then

$$E(X+k)=\sum_{i=1}^{n}(x_i+k)p_i=\sum_{i=1}^{n}x_ip_i+k\sum_{i=1}^{n}p_i=E(X)+k\cdot 1=E(X)+k$$

If X is a continuous random variable then we have

$$E(X+k)=\int_{-\infty}^{\infty}(x+k)f(x)\,dx=\int_{-\infty}^{\infty}xf(x)\,dx+k\int_{-\infty}^{\infty}f(x)\,dx=E(X)+k\cdot 1=E(X)+k$$

Hence proved.

Remark

The mathematical expectation need not be finite. For example, consider the discrete random variable K with probability function

$$p(k)=\frac{e^{-1}}{k!},\quad k=0,1,2,\ldots$$

The expectation E(K!) is

$$E(K!)=\sum_{k=0}^{\infty}k!\cdot\frac{e^{-1}}{k!}=\sum_{k=0}^{\infty}e^{-1}=e^{-1}\sum_{k=0}^{\infty}1,\quad\text{which is not finite.}$$

Theorem 2

If X and Y are random variables, then

E(X+Y)=E(X)+E(Y)

Proof

Let X and Y be two discrete random variables.

Let x1, x2, …, xn be the values assumed by the random variable X. Let p1, p2, …, pn denote the corresponding probabilities.

Let y1, y2, …, yn be the values assumed by the random variable Y and let $p'_1, p'_2, \ldots, p'_n$ denote the corresponding probabilities. Also let $P(X=x_i,\ Y=y_j)=p_{ij}$.

By definition, we have

$$E(X+Y)=\sum_{i=1}^{m}\sum_{j=1}^{n}(x_i+y_j)p_{ij}=\sum_{i=1}^{m}\sum_{j=1}^{n}x_ip_{ij}+\sum_{i=1}^{m}\sum_{j=1}^{n}y_jp_{ij}=\sum_{i=1}^{m}x_i\Big[\sum_{j=1}^{n}p_{ij}\Big]+\sum_{j=1}^{n}y_j\Big[\sum_{i=1}^{m}p_{ij}\Big]=\sum_{i=1}^{m}x_ip_i+\sum_{j=1}^{n}y_jp'_j=E(X)+E(Y)\tag{4.1}$$

Let X and Y be continuous random variables and let f(x, y) be the pdf of X and Y. Then

$$\begin{aligned}E(X+Y)&=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}(x+y)f(x,y)\,dx\,dy\\&=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}xf(x,y)\,dx\,dy+\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}yf(x,y)\,dx\,dy\\&=\int_{-\infty}^{\infty}x\Big[\int_{-\infty}^{\infty}f(x,y)\,dy\Big]dx+\int_{-\infty}^{\infty}y\Big[\int_{-\infty}^{\infty}f(x,y)\,dx\Big]dy\\&=\int_{-\infty}^{\infty}xf_1(x)\,dx+\int_{-\infty}^{\infty}yf_2(y)\,dy\quad(f_1(x),\ f_2(y)\ \text{are the marginal densities of}\ X\ \text{and}\ Y)\\&=E(X)+E(Y)\end{aligned}\tag{4.2}$$

From Eqs. (4.1) and (4.2), we conclude that

$$E(X+Y)=E(X)+E(Y)$$

The above theorem can be generalized and stated as follows:

Cor 1: If X, Y, Z, … are random variables, then

$$E(X+Y+Z+\cdots)=E(X)+E(Y)+E(Z)+\cdots$$

Definition 4.18

Two jointly distributed random variables X and Y are statistically independent of each other if and only if the joint pdf equals the product of the two marginal pdfs,

i.e., $f(x,y)=f_1(x)\,f_2(y)$

where f1(x) and f2(y) are marginal pdfs of X and Y, respectively.

Theorem 3

If X and Y are two independent random variables, then

$$E(XY)=E(X)\,E(Y)$$

Proof

Let X and Y be two independent random variables and let X, Y be discrete.

Let X assume the m values x1, x2, …, xm. Let p1, p2, …, pm denote the corresponding probabilities.

Let Y assume the n values y1, y2, …, yn and let the corresponding probabilities be denoted by $p'_1, p'_2, \ldots, p'_n$.

Since X and Y are independent, $P(X=x_i,\ Y=y_j)=p_ip'_j$. Then

$$E(XY)=\sum_{i=1}^{m}\sum_{j=1}^{n}x_iy_jp_ip'_j=\Big(\sum_{i=1}^{m}x_ip_i\Big)\Big(\sum_{j=1}^{n}y_jp'_j\Big)=E(X)\,E(Y)$$

When X and Y are continuous random variables that are independent, the theorem holds and is left as an exercise to the student.

Generalization: The above theorem can be extended to the case of several variables and stated as follows:

If X, Y, Z, … are independent random variables, then

$$E(XYZ\cdots)=E(X)\,E(Y)\,E(Z)\cdots$$

4.16.2 Variance

Definition 4.19

Let X be a discrete random variable having probability function P(X=x). Then the variance of X is defined as $\operatorname{Var}(X)=E[(X-\mu)^2]$, where μ is the mean of X.

Since μ=E(X), the variance of X can be written as

$$\operatorname{Var}(X)=E\big[(X-E(X))^2\big]$$

The variance of X is denoted by Var[X] (or by V[X]) or by $\sigma^2$.

If x1, x2, …, xn are the values assumed by X, and p1, p2, …, pn are the corresponding probabilities then

$$\sigma^2=\operatorname{Var}[X]=\sum_{i=1}^{n}(x_i-\mu)^2p_i$$

If X is a continuous random variable, then the variance of X is defined as follows:

Definition 4.20

Let X be a continuous random variable, having pdf f(x) then

$$\operatorname{Var}[X]=\sigma^2=\int_{-\infty}^{\infty}(x-\mu)^2f(x)\,dx$$

Remarks

Variance gives an idea of how widely spread the values of the random variable are likely to be: the larger the variance, the more scattered the observations are on average.

4.16.3 Properties of Variance

Theorem 4

If X is a random variable, then $V[X]=E[X^2]-[E(X)]^2$.

Proof

Let X be discrete random variable.

By definition we have

$$\begin{aligned}\operatorname{Var}[X]=V[X]&=E[(X-\mu)^2]=E[X^2-2\mu X+\mu^2]\\&=E[X^2]-2\mu E[X]+\mu^2\quad(\text{since}\ \mu\ \text{is a constant})\\&=E[X^2]-2[E(X)]^2+[E(X)]^2\quad(\text{since}\ \mu=E(X))\\&=E[X^2]-[E(X)]^2\end{aligned}$$

Thus $V[X]=E[X^2]-[E(X)]^2$.

The above property holds good for continuous variables also.

Theorem 5

If X is a random variable and k is a real number, then

1. $V[kX]=k^2\,V[X]$

2. $V[X+k]=V[X]$

Proof

1. $V[kX]=E[(kX)^2]-[E(kX)]^2=k^2E[X^2]-k^2[E(X)]^2=k^2\big(E[X^2]-[E(X)]^2\big)=k^2V[X]$

2. $V[X+k]=E[(X+k)^2]-[E(X+k)]^2=E[X^2]+2kE(X)+k^2-\big([E(X)]^2+2kE(X)+k^2\big)=E[X^2]-[E(X)]^2=V[X]$ (since $E(k^2)=k^2$)

Hence proved.

Remarks

The positive square root of the variance is called the standard deviation (SD) of X.

1. Standard deviation $=\sigma=\sqrt{V(X)}=\sqrt{E(X^2)-[E(X)]^2}$

2. Since $E\big[(X-E(X))^2\big]\ge 0$, we have $V(X)\ge 0$.

3. V[x]=0 if and only if X takes only one value with probability 1.

4. Variance of a constant is zero.

5. V[x] is invariant to the change of origin but variance is not invariant to the change of scale.

6. Let X, Y be two random variables with $Y=\dfrac{X-k}{h}$; then $\sigma_y^2=\dfrac{1}{h^2}\sigma_x^2$, where $\sigma_x^2=V[X]$ and $\sigma_y^2=V[Y]$.

Example 4.21: X and Y are two random variables such that $Y\le X$. If E(X) and E(Y) exist, show that $E[Y]\le E[X]$.

Solution: We are given $Y\le X$,

i.e., $Y-X\le 0$.

Taking expectation on both sides, we get $E[Y-X]\le 0$,

i.e., $E[Y]-E[X]\le 0$,

or $E[Y]\le E[X]$.

Example 4.22: If X is a random variable and E[X] exists, then show that $|E[X]|\le E|X|$.

Solution: Since $X\le|X|$,

We have

$$E[X]\le E|X|\tag{4.3}$$

Also we have

$$-X\le|X|$$

Therefore

$$E[-X]\le E|X|,\quad\text{i.e.,}\quad -E[X]\le E|X|\tag{4.4}$$

From Eqs. (4.3) and (4.4), we get $|E[X]|\le E|X|$.

Example 4.23: If X and Y are independent random variables with density functions

$$f(x)=\frac{8}{x^3}\ \text{for}\ x>2\quad\text{and}\quad f(y)=2y\ \text{for}\ 0<y<1$$

Find E[XY].

Solution: Since X and Y are independent random variables

$$f(x,y)=f(x)\,f(y)$$

Therefore

$$\begin{aligned}E[XY]&=\int_{2}^{\infty}\int_{0}^{1}xy\,f(x,y)\,dy\,dx=\int_{2}^{\infty}\int_{0}^{1}xy\left(\frac{8}{x^3}\right)(2y)\,dy\,dx\\&=16\int_{2}^{\infty}\frac{1}{x^2}\left(\int_{0}^{1}y^2\,dy\right)dx=16\int_{2}^{\infty}\frac{1}{x^2}\left[\frac{y^3}{3}\right]_0^1dx\\&=\frac{16}{3}\left[-\frac{1}{x}\right]_2^{\infty}=\frac{16}{3}\left(0+\frac{1}{2}\right)=\frac{8}{3}\end{aligned}$$

Example 4.24: If X takes the values $x_n=(-1)^n\dfrac{2^n}{n}$ for n=1, 2, … with probabilities $p_n=2^{-n}$, then show that $E[X]=-\log 2$.

Solution: Using the definition $E[X]=\sum_n x_np_n$, we get

$$E[X]=\sum_{n=1}^{\infty}(-1)^n\frac{2^n}{n}\cdot 2^{-n}=\sum_{n=1}^{\infty}\frac{(-1)^n}{n}=-\left[1-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+\cdots\right]=-\log 2$$

Example 4.25: Find the variance of the following probability distribution:

| x | 8 | 12 | 16 | 20 | 24 |
| --- | --- | --- | --- | --- | --- |
| P(X = x) | 1/8 | 1/6 | 3/8 | 1/4 | 1/12 |

Solution: We have

$$E[X]=\sum_i x_ip_i=8\cdot\frac{1}{8}+12\cdot\frac{1}{6}+16\cdot\frac{3}{8}+20\cdot\frac{1}{4}+24\cdot\frac{1}{12}=1+2+6+5+2=16$$

$$\text{Mean}=E[X]=16$$

$$\text{Variance}=E(X^2)-[E(X)]^2=\sum_i x_i^2p_i-[E(X)]^2=8^2\cdot\frac{1}{8}+12^2\cdot\frac{1}{6}+16^2\cdot\frac{3}{8}+20^2\cdot\frac{1}{4}+24^2\cdot\frac{1}{12}-16^2=8+24+96+100+48-256=276-256=20$$

Therefore $V[X]=20$.
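The same computation as a short Python sketch with exact fractions:

```python
from fractions import Fraction as F

xs = [8, 12, 16, 20, 24]
ps = [F(1, 8), F(1, 6), F(3, 8), F(1, 4), F(1, 12)]

mean = sum(x * p for x, p in zip(xs, ps))              # E[X] = 16
second_moment = sum(x**2 * p for x, p in zip(xs, ps))  # E[X^2] = 276
print(mean, second_moment - mean**2)                   # 16 20
```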

Example 4.26: Let the variable X have the distribution P(X=0)=P(X=2)=p, P(X=1)=1−2p, for $0\le p\le\frac{1}{2}$. For what value of p is Var[X] maximum?

Solution: The probability distribution of X is

| X | 0 | 1 | 2 |
| --- | --- | --- | --- |
| P(X = x) | p | 1 − 2p | p |

Therefore

$$\text{Expectation}=E(X)=0\cdot p+1\cdot(1-2p)+2\cdot p=1-2p+2p=1$$

$$\text{Variance}=\operatorname{Var}[X]=E(X^2)-[E(X)]^2=0^2\cdot p+1^2\cdot(1-2p)+2^2\cdot p-1^2=1+2p-1=2p$$

Since Var[X]=2p increases with p, it is maximum at $p=\frac{1}{2}$, the largest value allowed in $0\le p\le\frac{1}{2}$.

Example 4.27: A coin is tossed until a head appears. What is the expectation of the number of tosses required?

Solution: Head can appear in the first toss, or in the second toss, or in the third toss, and so on.

The favorable cases are H,TH,TTH,TTTH,image

The probability that the first head appears on the first toss is $\frac{1}{2}$.

The probability that the first head appears on the second toss (the case TH) is $\left(\frac{1}{2}\right)^2$.

In general, the probability of getting the first head on the nth toss is $\left(\frac{1}{2}\right)^n$.

Let X denote the number of tosses required to get the first head. The probability distribution of X is

| X = x | 1 | 2 | … | n | … |
| --- | --- | --- | --- | --- | --- |
| P(X = x) | 1/2 | (1/2)² | … | (1/2)ⁿ | … |

The expectation of the number of tosses required is $E(X)=\sum_i x_ip_i$,

i.e.,

$$E(X)=1\cdot\frac{1}{2}+2\left(\frac{1}{2}\right)^2+3\left(\frac{1}{2}\right)^3+\cdots+n\left(\frac{1}{2}\right)^n+\cdots=\frac{1}{2}\left[1+2\left(\frac{1}{2}\right)+3\left(\frac{1}{2}\right)^2+\cdots\right]=\frac{1}{2}\left(1-\frac{1}{2}\right)^{-2}=\frac{1}{2}\cdot 4=2$$
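The partial sums of this series can be checked numerically; a minimal Python sketch:

```python
# Partial sums of E(X) = sum over n of n * (1/2)**n approach 2.
total = 0.0
for n in range(1, 60):
    total += n * 0.5**n
print(total)   # ~2.0 (1.9999999999999998)
```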

Example 4.28:

1. What is the expected value of the number of points obtained in a single throw with an ordinary die? Also find the variance.

2. What is the mathematical expectation of the sum of points obtained on n dice?

Solution:

1. Let X be the random variable "the number of points obtained." X assumes the values 1, 2, 3, 4, 5, and 6, with probability $\frac{1}{6}$ in each case.
Hence the expectation of X is

$$E(X)=1\cdot\frac{1}{6}+2\cdot\frac{1}{6}+3\cdot\frac{1}{6}+4\cdot\frac{1}{6}+5\cdot\frac{1}{6}+6\cdot\frac{1}{6}=\frac{1}{6}(1+2+3+4+5+6)=\frac{21}{6}=\frac{7}{2}$$

Further, $E(X^2)=\frac{1}{6}(1^2+2^2+\cdots+6^2)=\frac{91}{6}$, so $\operatorname{Var}(X)=E(X^2)-[E(X)]^2=\frac{91}{6}-\frac{49}{4}=\frac{35}{12}$.

2. Let $x_i$ denote the number of points on the ith die. Then $E(x_i)=\frac{7}{2}$, i=1, 2, …, n.
The sum of points on n dice is $x_1+x_2+\cdots+x_n$.
Therefore

$$E(x_1+x_2+\cdots+x_n)=E(x_1)+E(x_2)+\cdots+E(x_n)=\underbrace{\frac{7}{2}+\frac{7}{2}+\cdots}_{n\ \text{terms}}=\frac{7n}{2}$$


Hence the expectation of the sum of points on n dice is $\frac{7n}{2}$.

Example 4.29: If X and Y are random variables having joint density function

$$f(x,y)=\begin{cases}4xy, & 0\le x\le 1,\ 0\le y\le 1\\ 0, & \text{otherwise}\end{cases}$$

Verify that

1. E(X+Y)=E(X)+E(Y)

2. E(XY)=E(X) E(Y)

Solution: We have

$$E(X)=\int_{0}^{1}\int_{0}^{1}x\,f(x,y)\,dx\,dy=\int_{0}^{1}\int_{0}^{1}4x^2y\,dx\,dy=4\int_{y=0}^{1}y\left(\int_{x=0}^{1}x^2\,dx\right)dy=4\int_{y=0}^{1}y\left[\frac{x^3}{3}\right]_0^1dy=\frac{4}{3}\int_{0}^{1}y\,dy=\frac{4}{3}\left[\frac{y^2}{2}\right]_0^1=\frac{4}{6}=\frac{2}{3}$$

Similarly

$$E(Y)=\int_{0}^{1}\int_{0}^{1}y\,f(x,y)\,dx\,dy=4\int_{x=0}^{1}\int_{y=0}^{1}xy^2\,dy\,dx=\frac{2}{3}$$

Now

1. $$E(X+Y)=\int_{0}^{1}\int_{0}^{1}(x+y)f(x,y)\,dx\,dy=4\int_{0}^{1}\int_{0}^{1}(x^2y+xy^2)\,dx\,dy=4\int_{y=0}^{1}\left[\frac{x^3}{3}y+\frac{x^2}{2}y^2\right]_{x=0}^{1}dy=4\int_{0}^{1}\left(\frac{y}{3}+\frac{y^2}{2}\right)dy=4\left[\frac{y^2}{6}+\frac{y^3}{6}\right]_0^1=4\cdot\frac{2}{6}=\frac{4}{3}$$
Therefore $E(X+Y)=\frac{4}{3}=\frac{2}{3}+\frac{2}{3}=E(X)+E(Y)$.
Also,

2. $$E(XY)=\int_{0}^{1}\int_{0}^{1}xy\,f(x,y)\,dx\,dy=4\int_{0}^{1}\int_{0}^{1}x^2y^2\,dx\,dy=4\int_{y=0}^{1}y^2\left[\frac{x^3}{3}\right]_0^1dy=\frac{4}{3}\left[\frac{y^3}{3}\right]_0^1=\frac{4}{3}\cdot\frac{1}{3}=\frac{4}{9}$$
Thus $E(XY)=\frac{4}{9}=\frac{2}{3}\cdot\frac{2}{3}=E(X)\,E(Y)$.
Hence verified.
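All four expectations can be reproduced symbolically. A minimal sketch, assuming SymPy is available:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = 4 * x * y   # joint pdf on the unit square

def E(g):
    """E[g(X, Y)] = double integral of g * f over 0 <= x, y <= 1."""
    return sp.integrate(g * f, (x, 0, 1), (y, 0, 1))

print(E(x), E(y))              # 2/3  2/3
print(E(x + y), E(x) + E(y))   # 4/3  4/3
print(E(x * y), E(x) * E(y))   # 4/9  4/9
```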

Theorem 6

(Cauchy–Schwarz inequality) If X, Y are random variables taking real values, then

$$[E(XY)]^2\le E[X^2]\,E[Y^2]$$

Proof

Let t be a real variable.

Consider the expression $(X+tY)^2$.

$(X+tY)^2$ is nonnegative for all values of X and Y.

Hence $E[(X+tY)^2]\ge 0$ for all t,

i.e., $E[X^2+2tXY+t^2Y^2]\ge 0$ for all t,

i.e., $E[X^2]+2tE[XY]+t^2E[Y^2]\ge 0$ for all t.

$t^2E[Y^2]+2tE[XY]+E[X^2]$ is of the form $at^2+bt+c$, a quadratic in t.

We have

$$a=E[Y^2],\quad b=2E[XY],\quad c=E[X^2]$$

Since $at^2+bt+c\ge 0$ for all t, the discriminant satisfies $b^2-4ac\le 0$.

Therefore

$$4[E(XY)]^2-4E[X^2]\,E[Y^2]\le 0$$

or

$$[E(XY)]^2-E[X^2]\,E[Y^2]\le 0$$

or

$$[E(XY)]^2\le E[X^2]\,E[Y^2]$$

Hence proved.
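Because the inequality holds for every pair of real random variables, a crude Monte Carlo check on any joint sample should respect it. A minimal Python sketch (the particular joint distribution is an arbitrary choice for illustration):

```python
import random

random.seed(0)
# Any joint sample will do; here X is uniform and Y = X + Gaussian noise.
xs = [random.uniform(-1, 1) for _ in range(100_000)]
ys = [v + random.gauss(0, 0.5) for v in xs]

E = lambda zs: sum(zs) / len(zs)   # sample mean as a stand-in for expectation
lhs = E([a * b for a, b in zip(xs, ys)]) ** 2             # [E(XY)]^2
rhs = E([a * a for a in xs]) * E([b * b for b in ys])     # E(X^2) E(Y^2)
print(lhs <= rhs, lhs, rhs)   # True, lhs ~ 0.11, rhs ~ 0.19
```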

Theorem 7

(Chebyshev's inequality) If X is a random variable with mean μ and variance $\sigma^2$, then

$$P(|X-\mu|\ge k)\le\frac{\sigma^2}{k^2}$$

or equivalently

$$P(|X-\mu|<k)\ge 1-\frac{\sigma^2}{k^2}$$

where k is a positive real number.

Proof

Case 1: Let X be a discrete random variable and f(x) denote the probability function of X.

Then

$$\operatorname{Var}[X]=\sigma^2=E[(X-\mu)^2]=\sum_x(x-\mu)^2f(x)\tag{4.5}$$

Every term of the sum on the right-hand side of Eq. (4.5) is nonnegative, so dropping the terms with $|x-\mu|<k$ can only decrease the sum.

Hence

$$\sigma^2\ge\sum_{|x-\mu|\ge k}(x-\mu)^2f(x)\ge\sum_{|x-\mu|\ge k}k^2f(x)$$

i.e.,

$$\sigma^2\ge k^2\sum_{|x-\mu|\ge k}f(x)\tag{4.6}$$

But

$$\sum_{|x-\mu|\ge k}f(x)=P(|X-\mu|\ge k)$$

Therefore from Eq. (4.6) we get

$$\sigma^2\ge k^2P(|X-\mu|\ge k)$$

i.e.,

$$\frac{\sigma^2}{k^2}\ge P(|X-\mu|\ge k)$$

Thus we get

$$P(|X-\mu|\ge k)\le\frac{\sigma^2}{k^2}$$

Case 2: Let X be a continuous random variable and f(x) be the pdf of X. Then $\sigma^2=\int_{-\infty}^{\infty}(x-\mu)^2f(x)\,dx$.

Clearly the integrand is nonnegative, so when the range of integration is reduced, the value of the integral can only decrease.

Thus we have $\sigma^2\ge\int_{|x-\mu|\ge k}(x-\mu)^2f(x)\,dx\ge\int_{|x-\mu|\ge k}k^2f(x)\,dx$

or

$$\sigma^2\ge k^2\int_{|x-\mu|\ge k}f(x)\,dx$$

But $\int_{|x-\mu|\ge k}f(x)\,dx=P(|X-\mu|\ge k)$. Hence $\sigma^2\ge k^2P(|X-\mu|\ge k)$, i.e.,

$$P(|X-\mu|\ge k)\le\frac{\sigma^2}{k^2}$$

Hence proved.
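Chebyshev's bound can be compared with empirical frequencies. A minimal Python sketch, using an exponential sample with μ=1 and σ²=1 (an arbitrary distribution chosen for illustration):

```python
import random

random.seed(1)
n = 200_000
sample = [random.expovariate(1.0) for _ in range(n)]  # mean 1, variance 1
mu, var = 1.0, 1.0

for k in (1.5, 2.0, 3.0):
    empirical = sum(abs(v - mu) >= k for v in sample) / n
    bound = var / k**2
    print(k, empirical, bound)   # empirical P(|X - mu| >= k) stays below the bound
```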