Chapter 4

Random Variables

Abstract

Random variables are real-valued functions defined on sample spaces associated with experiments in which the outcomes are uncertain. Random variables may be discrete or continuous. Joint distributions and the binomial, Poisson, normal, uniform, gamma, beta, and Weibull distributions are discussed in detail in this chapter. The chapter also discusses mathematical expectation.

Keywords

Joint distributions; joint probability function; marginal density function; mathematical expectation; moments; moment generating function; skewness and kurtosis; discrete distributions and uniform distribution

4.1 Introduction

In this chapter we introduce random variables, which are defined on sample spaces associated with experiments in which the outcomes are uncertain. The random variable is one of the basic and most significant concepts of probability theory: its values are real numbers associated with the outcomes of an experiment. A random variable is a variable whose value is subject to variation due to chance; it varies from trial to trial as the experiment is repeated. It is also called a stochastic variable and is defined as follows:

Definition 4.1

A random variable is a numerically valued variable defined on a sample space of an experiment.

We shall denote random variables by the capital letters X, Y, Z, etc.; the corresponding small (lower case) letters denote the numerical values taken by the random variable.

Example 4.1: Consider an experiment A in which two coins are tossed simultaneously. Let S be the sample space associated with the experiment. We have,

S={HH,HT,TH,TT}


If we define a random variable X, as the number of heads, then the values of X are 0, 1, and 2 corresponding to the outcomes.

TT (0 heads), HT (1 head), TH (1 head), HH (2 heads).

We have the following table:

Sample point    TT   HT   TH   HH
X = xi           0    1    1    2
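
As a quick illustration of this construction, the following Python sketch (an illustrative aid, not part of the original text) enumerates the sample space of two coin tosses and derives the probability of each value of X by counting outcomes:

    from itertools import product
    from collections import Counter
    from fractions import Fraction

    # Enumerate the sample space S = {HH, HT, TH, TT} for two tosses of a fair coin
    # and map every outcome to X = number of heads (an assumed, illustrative model).
    sample_space = ["".join(p) for p in product("HT", repeat=2)]
    X = {outcome: outcome.count("H") for outcome in sample_space}

    # Each outcome has probability 1/4, so the pmf of X follows by counting outcomes.
    counts = Counter(X.values())
    pmf = {x: Fraction(c, len(sample_space)) for x, c in sorted(counts.items())}
    print(pmf)   # {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}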

Example 4.2: The number of calls from subscribers at the telephone exchange during a definite time period is a random variable.

Example 4.3: An experiment is to fire four shots at a target. The random variable is the number of hits.

In the above experiment we have observed that each outcome is a simple event of the sample space S and the corresponding value is a real number.

Thus “A random variable is a rule that assigns one and only one numerical value to each simple event of an experiment” and we have the following definition:

Definition 4.2

Let S be a sample space of a random experiment and R denote the set of real numbers. Then a real-valued function X: S → R is called a random variable.

The set of values which X takes is called the Spectrum of the random variable.

If S is a sample space of an experiment, then we can define more than one random variable on S. The set RX = {X(x) : x ∈ S} is called the range of X.

In general, a real-valued function whose domain is the sample space S of an experiment and whose range is a collection of n-tuples of real numbers is called an n-dimensional random variable.

If X(x) ≠ 0 for all x ∈ S, then 1/X(x) and |X(x)| are also random variables on S.

If X and Y are random variables on the sample space of an experiment, then X + Y, X − Y, and XY are also random variables on S.

For all real numbers a and b, a X+b Y is also a random variable on S.

The mathematical function describing the possible values of a random variable and their associated probabilities is known as a probability distribution.

Random variables can be discrete, i.e., taking any of a specified finite or countable list of values, and hence with a probability mass function (pmf) as probability distribution.

Random variables can be continuous, taking any numerical value in an interval or collection of intervals, with a probability density function (pdf) describing the probability distribution; a random variable may also be a mixture of both types. The realizations of a random variable, i.e., the results of randomly choosing values according to the variable’s probability distribution, are called random variates.

4.2 Discrete Random Variable

In statistics we study variables such as heights of students, number of defective bolts, number of accidents on a road, number of male children in a family, number of printing mistakes in each page of a book, and so on. Some of these quantities can vary only by finite increments, by observable jumps in value. Such variables are called discrete random variables. The sample space S of a discrete random variable contains either a finite number of outcomes or a countably infinite number of outcomes.

Definition 4.3

A random variable that can take only a finite number of values or a countably infinite number of values is called a discrete random variable.

Example 4.1: In an experiment of drawing four cards from a pack of cards, the random variable “The number of kings drawn” is a discrete random variable.

Example 4.2: Consider an experiment in which a coin is tossed four times. If X denotes the number of heads obtained then X is a discrete random variable.

Remarks

The sample space S of a discrete random variable can be discrete, continuous, or may contain both discrete and continuous points.

Discrete random variables are random variables whose range is finite or countably infinite.

4.3 Probability Distribution for a Discrete Random Variable

The simplest way to present the distribution of a discrete random variable X is an ordered series, i.e., a table whose top row contains all the values of the random variable and whose bottom row contains the corresponding probabilities, as shown in the following table:

X = xi        x1    x2    …    xn
P(X = xi)     p1    p2    …    pn

Here x1, x2, …, xn are the values of the random variable X and pi = P(X = xi) are the corresponding probabilities, where Σ_{i=1}^{n} pi = 1. The above distribution is an ordered series. It can also be represented graphically as a frequency polygon or a histogram.

4.3.1 Probability Mass Function

Definition 4.4

Let X be a discrete random variable assuming the values x = x1, x = x2, …, x = xn corresponding to the various outcomes of a random experiment. If the probability of occurrence of X = xi (i = 1, 2, …, n) is P(X = xi) = pi

Such that

1. P(X = xi) = pi ≥ 0 for all i (1 ≤ i ≤ n)

2. Σ_{i=1}^{n} P(X = xi) = Σ_{i=1}^{n} pi = 1

then the function P(X = x) is called the probability function of the random variable X, and the set

{(x1, P(X = x1)), (x2, P(X = x2)), …, (xn, P(X = xn))}, or simply {(x1, p(x1)), (x2, p(x2)), …, (xn, p(xn))},

is called the probability distribution of the random variable X.

The probability function P(X=x) is also denoted by f(x) and is called the pmf of the discrete random variable X.

Remark

Some authors refer to f(x) as the frequency function or the probability function.

We have

1. f(x)=P(X=x)

2. f(x)≥0

3. Σ_x f(x) = 1
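
The two conditions above are easy to verify mechanically. The short Python sketch below (illustrative only; the helper name is_valid_pmf is an assumption of this sketch) checks them for the distribution of Example 4.1:

    from fractions import Fraction

    def is_valid_pmf(pmf):
        # A pmf must satisfy f(x) >= 0 for every x and sum of f(x) equal to 1.
        return all(p >= 0 for p in pmf.values()) and sum(pmf.values()) == 1

    # Distribution of the number of heads in two tosses of a fair coin (Example 4.1).
    pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
    print(is_valid_pmf(pmf))   # True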

Definition 4.5

(Finite equiprobable space) A finite probability distribution where each point X=xi has the same probability for all i, is called a finite equiprobable space or uniform space.

4.3.2 Distribution Function

The probability P(X ≤ x) is the probability of the event (X ≤ x). It is a function of x. This function is denoted by F(x) and is called the cumulative probability distribution function of the random variable X.

Thus F(x) = P(X ≤ x), −∞ < x < ∞.

F(x) is often called the distribution function of X.

F(x) possesses the following properties:

1. F(−∞) = 0

2. F(∞) = 1

3. 0 ≤ F(x) ≤ 1

4. F(x1) ≤ F(x2) if x1 < x2

5. P(x1 < X ≤ x2) = F(x2) − F(x1)

6. F(x+) = F(x), i.e., F is continuous from the right

F(X) can also be defined as follows:

Definition 4.6

Let X be a discrete random variable, then the function

F(x) = P(X ≤ x) = Σ_{t ≤ x} f(t),  −∞ < x < ∞,

where f(t) is the value of the probability distribution of X at t, is called the distribution function of X.
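
The following Python sketch (illustrative; the helper cdf is an assumption of this sketch) computes F(x) directly from a pmf by summing f(t) over t ≤ x, and shows the limiting values F(−∞) = 0 and F(∞) = 1 on a small example:

    from fractions import Fraction

    def cdf(pmf, x):
        # F(x) = P(X <= x) = sum of f(t) over all values t <= x.
        return sum(p for t, p in pmf.items() if t <= x)

    pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
    print(cdf(pmf, 1))    # 3/4
    print(cdf(pmf, -1))   # 0, consistent with F(-infinity) = 0
    print(cdf(pmf, 5))    # 1, consistent with F(+infinity) = 1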

4.3.3 Additional Properties of Distribution Function

Property 1: (Interval property) If X is a random variable and F(x) is the distribution function of X, then P(a < X ≤ b) = F(b) − F(a).

Proof: The events (a < X ≤ b) and (X ≤ a) are disjoint and their union is the event (X ≤ b). Hence

P(a < X ≤ b) + P(X ≤ a) = P(X ≤ b)

or

P(a < X ≤ b) = P(X ≤ b) − P(X ≤ a) = F(b) − F(a).

Property 2: If F(x) is the distribution function of a random variable X, then

P(a ≤ X ≤ b) = P(X = a) + F(b) − F(a).

Proof: We have

P(a ≤ X ≤ b) = P(X = a) + P(a < X ≤ b) = P(X = a) + F(b) − F(a)  (by Property 1).

Property 3: If F(x) is the distribution function of a random variable X, then

P(a < X < b) = F(b) − F(a) − P(X = b).

Proof:

P(a < X < b) = P(a < X ≤ b) − P(X = b) = F(b) − F(a) − P(X = b)  (by Property 1).

Property 4: P(a ≤ X < b) = F(b) − F(a) − P(X = b) + P(X = a)

Proof: We have

P(a ≤ X < b) = P(a < X < b) + P(X = a) = F(b) − F(a) − P(X = b) + P(X = a).

Property 5: (Monotone increasing property) If F(x) is the distribution of a random variable X and a<b, then F(a)≤F(b).

Proof: Since P(a < X ≤ b) ≥ 0, we have F(b) − F(a) ≥ 0, i.e., F(a) ≤ F(b). Hence proved.

If the random variable X takes the values x1, x2, …, xn with probabilities p1, p2, …, pn respectively, then we have

P[X < x1] = 0,   P[X ≤ x1] = P[X < x1] + P[X = x1] = p1

Therefore

P[X < x2] = p1,   P[X ≤ x2] = P[X < x2] + P[X = x2] = p1 + p2,   …,   P[X ≤ xn] = p1 + p2 + ⋯ + pn

Example: Consider an experiment in which four coins are tossed. If we define a random variable as the number of heads obtained, then we have

P(X = 0) = 1/16,  P(X = 1) = 4/16 = 1/4,  P(X = 2) = 6/16 = 3/8,  P(X = 3) = 4/16 = 1/4,  P(X = 4) = 1/16

Therefore we get

F(0) = P(X = 0) = 1/16
F(1) = P(X = 0) + P(X = 1) = 1/16 + 4/16 = 5/16
F(2) = P(X = 0) + P(X = 1) + P(X = 2) = 1/16 + 4/16 + 6/16 = 11/16

and similarly F(3) = 15/16 and F(4) = 1.

Hence the distribution function F(x) is given by

F(x) = 0       for x < 0
     = 1/16    for 0 ≤ x < 1
     = 5/16    for 1 ≤ x < 2
     = 11/16   for 2 ≤ x < 3
     = 15/16   for 3 ≤ x < 4
     = 1       for x ≥ 4
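
The same distribution function can be tabulated mechanically. The Python sketch below (an illustrative aid, not part of the original example) rebuilds the cdf of the four-coin example from its pmf and checks the interval property P(1 < X ≤ 3) = F(3) − F(1):

    from fractions import Fraction
    from math import comb

    # Number of heads in four tosses of a fair coin: P(X = k) = C(4, k) / 16.
    pmf = {k: Fraction(comb(4, k), 16) for k in range(5)}

    def F(x):
        # Cumulative distribution function F(x) = P(X <= x).
        return sum(p for k, p in pmf.items() if k <= x)

    print([F(k) for k in range(5)])   # the values 1/16, 5/16, 11/16, 15/16, 1
    # Interval property (Property 1): P(1 < X <= 3) = F(3) - F(1) = 5/8
    print(F(3) - F(1))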

4.4 Mean and Variance of a Discrete Distribution

If X is a discrete random variable, then the mean and variance of the discrete distribution can be defined as follows:

Mean = μ = Σ_{i=1}^{n} xi P(X = xi) = Σ_{i=1}^{n} xi pi

Variance = Var[X] = Σ_{i=1}^{n} (xi − μ)² P(X = xi)

or

σ² = Σ_{i=1}^{n} (xi − μ)² pi

where σ is the standard deviation (SD).

Since Σ pi = 1, we get

Var[X] = Σ (xi − μ)² pi
       = Σ (xi² + μ² − 2 xi μ) pi
       = Σ xi² pi + μ² Σ pi − 2μ Σ xi pi
       = Σ xi² pi + μ² − 2μ·μ
       = Σ xi² pi − μ²

Thus

σ² = Σ xi² pi − μ²
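
The shortcut σ² = Σ xi² pi − μ² can be checked numerically. The following Python sketch (illustrative; the helper mean_variance is an assumption of this sketch) applies it to the four-coin distribution above, giving mean 2 and variance 1:

    from fractions import Fraction
    from math import comb

    def mean_variance(pmf):
        # mu = sum(x_i * p_i);  sigma^2 = sum(x_i^2 * p_i) - mu^2
        mu = sum(x * p for x, p in pmf.items())
        var = sum(x * x * p for x, p in pmf.items()) - mu ** 2
        return mu, var

    # Number of heads in four tosses of a fair coin (the example above).
    pmf = {k: Fraction(comb(4, k), 16) for k in range(5)}
    print(mean_variance(pmf))   # (Fraction(2, 1), Fraction(1, 1)), i.e., mean 2, variance 1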

Example 4.4: Find the probability distribution of the number of blue balls drawn when 3 balls are drawn without replacement from a bag containing 4 blue and 6 red balls?

Solution: Number of balls in the bag=4 blue+6 red=10

Let X be the random variable “number of blue balls”

Then X can take values 0, 1, 2, and 3.

P(X = 0) = P(no blue ball) = P(3 red balls) = C(6, 3)/C(10, 3) = 20/120 = 1/6
P(X = 1) = P(1 blue + 2 red balls) = C(4, 1) × C(6, 2)/C(10, 3) = 60/120 = 1/2
P(X = 2) = P(2 blue + 1 red ball) = C(4, 2) × C(6, 1)/C(10, 3) = 36/120 = 3/10
P(X = 3) = P(3 blue balls) = C(4, 3)/C(10, 3) = 4/120 = 1/30

The probability distribution is

x            0     1      2      3
P(X = x)    1/6   1/2   3/10   1/30
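
The same probabilities can be generated directly from the combinatorial formula P(X = x) = C(4, x)·C(6, 3 − x)/C(10, 3), as in the following illustrative Python sketch:

    from fractions import Fraction
    from math import comb

    # Example 4.4: 3 balls drawn without replacement from 4 blue and 6 red balls.
    # P(X = x) = C(4, x) * C(6, 3 - x) / C(10, 3), counting favourable selections.
    total = comb(10, 3)
    pmf = {x: Fraction(comb(4, x) * comb(6, 3 - x), total) for x in range(4)}
    print(pmf)   # x = 0, 1, 2, 3 with probabilities 1/6, 1/2, 3/10, 1/30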

Example 4.5: Two cards are drawn successively with replacement from a well shuffled pack of cards. Find the probability distribution of the number of kings that can be drawn?

Solution: Let X denote the random variable “number of kings.”

Since the number of cards drawn is 2, the random variable X can take values 0, 1, and 2

Number of kings in a pack=4

Number of cards in a pack=52

When one card is drawn from the pack, the probability of getting a king is

= C(4, 1)/C(52, 1) = 4/52 = 1/13

Probability of failure, i.e., the probability of not getting a king is

1 − 1/13 = 12/13

Hence we get

P(X = 0) = P(no king) = (12/13)(12/13) = 144/169
P(X = 1) = P(one king) = (1/13)(12/13) + (12/13)(1/13) = 24/169
P(X = 2) = P(2 kings) = (1/13)(1/13) = 1/169

The required probability distribution is

x            0         1        2
P(X = x)   144/169   24/169   1/169

Example 4.6: A random variable X has the following probability distribution:

X = xi        0    1    2    3    4
P(X = xi)    3k   3k    k   2k   6k

Find

1. k

2. Mean

3. P(X>2)

Solution:

1. We have

Σ pi = 1

Therefore

3k + 3k + k + 2k + 6k = 1
or
15k = 1
or
k = 1/15

2. Mean μ = Σ_i xi pi = (0)(3k) + (1)(3k) + (2)(k) + (3)(2k) + (4)(6k) = 35k = 35(1/15) = 7/3

3. P(X > 2) = P(X = 3) + P(X = 4) = 2k + 6k = 8k = 8(1/15) = 8/15
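
The following Python sketch (illustrative only, not part of the original solution) reproduces the three answers of Example 4.6 by first normalising the coefficients of k:

    from fractions import Fraction

    # Example 4.6: P(X = x) is proportional to 3, 3, 1, 2, 6 (the coefficients of k),
    # so the normalising condition 15k = 1 fixes k.
    coeffs = {0: 3, 1: 3, 2: 1, 3: 2, 4: 6}
    k = Fraction(1, sum(coeffs.values()))        # k = 1/15
    pmf = {x: c * k for x, c in coeffs.items()}

    mean = sum(x * p for x, p in pmf.items())
    p_gt_2 = sum(p for x, p in pmf.items() if x > 2)
    print(k, mean, p_gt_2)   # 1/15 7/3 8/15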

Example 4.7: A random variable X has the following probability distribution:

X = xi       −2    −1     0     1     2    3
P(X = xi)    0.1    k    0.2   2k    0.3   k

Find

1. k

2. Mean

3. Variance of X

Solution:

1. We have

Σ pi = 1


i.e., 0.1+k+0.2+2k+0.3+k=1
or 4k+0.6=1
or 4k=1−0.6=0.4
we get k=0.1

2. Mean = μ = Σ_i xi pi = (−2)(0.1) + (−1)(k) + (0)(0.2) + (1)(2k) + (2)(0.3) + (3)(k) = 4k + 0.4 = 4(0.1) + 0.4 = 0.8

3. Variance σ² = Σ_i xi² pi − μ² = (0.1)(−2)² + (k)(−1)² + (0.2)(0)² + (2k)(1)² + (0.3)(2)² + (k)(3)² − (0.8)² = 0.4 + k + 0 + 2k + 1.2 + 9k − 0.64 = 12k + 1.6 − 0.64 = 12(0.1) + 0.96 = 2.16

Exercise 4.1

1. Define

a. Random variable

b. Discrete random variable

c. Density function

d. Probability distribution

e. Spectrum

2. Obtain the probability distribution of the total number of heads in three tosses of a coin.

Ans:

x       0     1     2     3
P(x)   1/8   3/8   3/8   1/8

3. A fair coin is tossed 4 times. Find the probability distribution of the number of heads?

Ans:

x        0      1     2     3      4
P(x)   1/16   1/4   3/8   1/4   1/16

4. A random variable X has the following distribution:

X01234
P(X)λ



Find

a. P(0<x<2)

b. P(x>2)

Ans: (a) 3/14; (b) 4/7

5. A bag contains 2 white, 3 red, and 4 blue balls. Two balls are drawn at random from the bag. If the random variable X denotes the “number of white balls” among the balls drawn, describe the probability distribution of X.

Ans:

x            0      1      2
P(X = x)   7/12   7/18   1/36

6. Three balls are drawn without replacement from a bag containing 5 white and 4 red balls. Find the probability distribution of the number of red balls drawn?

Ans:

x        0       1      2      3
P(x)   5/42   10/21   5/14   1/21

7. The probability distribution of a random variable X is given below:

X        0         1          2
P(X)    3λ³   4λ − 10λ²   5λ − 1

where λ > 0.
Find

a. λ

b. P(X≤1)

c. P(X>0)

Ans: (a) λ = 1/3; (b) P(X ≤ 1) = 1/3; (c) P(X > 0) = 8/9

8. A random variable X has the following probability distribution:

X       1    2    3    4
P(X)    k   2k   3k   4k


Find

a. k

b. P(X<3)

c. P(X≥3)

d. Mean

Ans: (a) k = 1/10; (b) 3/10; (c) 7/10; (d) μ = 3

9. A random variable X has the following probability distribution:

X = xi       −2    −1     0     1     2    3
P(X = xi)    0.1    k    0.2   2k    0.3   k


Find

a. k

b. Mean

c. Variance

Ans: (a) k = 0.1; (b) μ = 0.8; (c) σ² = 2.16

10. A box contains 6 tickets. Two of the tickets carry a prize of Rs. 5/- each, the other four tickets carry a prize of Rs. 1/- each. If one ticket is drawn what is the mean value of the prize?

Ans: 7/3

11. A die is tossed twice. Getting a “number greater than 4” is considered a success. Form the probability distribution of the number of successes, and show that the mean is 2/3 and the variance is 4/9.

12. Obtain the probability distribution of the number of sixes in two tosses of a cubical die.

Ans:

x             0       1       2
P(X = x)    25/36   10/36    1/36

4.5 Continuous Random Variable

Let X be a random variable. If X takes an uncountably infinite number of values, then X is called a continuous random variable. If X is a continuous random variable, then the range of X is an interval on the real line.

Example: The life length of an electric bulb, the detection range of a radar, etc., are examples of continuous random variables.

4.6 Probability Density Function

Definition 4.7

Let X be a continuous random variable. If for every x in the range of X we assign a real number f(x) satisfying the conditions

1. f(x) ≥ 0, for −∞ < x < ∞

2. ∫_{−∞}^{∞} f(x) dx = 1

then the function f(x) is called a pdf of X. It is also referred to as the density function.

If f(x) is a pdf of X then we have

P(a ≤ X ≤ b) = ∫_a^b f(x) dx, for any real constants a and b with a ≤ b.

The set of values obtained from ∫_a^b f(x) dx for the various possible intervals is called a continuous probability distribution for X.
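
The defining conditions of a pdf, and probabilities of intervals, can also be checked numerically. The Python sketch below (illustrative; it uses a simple midpoint rule and a hypothetical density f(x) = 2x on (0, 1)) verifies that the density integrates to 1 and evaluates P(0.5 ≤ X ≤ 0.75):

    def integral(f, a, b, n=100_000):
        # Midpoint-rule approximation of the integral of f over [a, b].
        h = (b - a) / n
        return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

    # A hypothetical density f(x) = 2x on (0, 1), and 0 elsewhere.
    f = lambda x: 2.0 * x if 0.0 < x < 1.0 else 0.0

    print(round(integral(f, 0.0, 1.0), 4))      # about 1.0, so f is a valid pdf
    print(round(integral(f, 0.5, 0.75), 4))     # P(0.5 <= X <= 0.75), about 0.3125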

4.7 Cumulative Distribution Function

Definition 4.8

If X is a continuous random variable having f(x) as its pdf, then the function given by

F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt,  for −∞ < x < ∞

is called cumulative distribution function of X.

F(x) is also referred to as the distribution function or the cumulative distribution of X.

Remark

If the distribution function F(x) of a random variable X is continuous for every x and F′(x) exists everywhere, except perhaps at individual particular points, then the random variable X is said to be continuous.

The pdf of a continuous random variable X is the derivative of the distribution function, i.e., f(x)=F′(x).

If a and b are real numbers with a ≤ b, then we have P(a < X ≤ b) = F(b) − F(a).

Definition 4.9

(Mixed random variable) Let X be a random variable and F(x) be the distribution function of X. If F(x) continuously increases on certain intervals but has discontinuities at particular points then the random variable is said to be mixed.

Examples of Continuous and Mixed Random Variables

Example 4.8: A random variable X has a Simpson distribution on the interval −c to c. Find the expression for the pdf?

Solution: The random variable X obeys “the law of an isosceles triangle.” The pdf is

f(x) = (1/c)(1 − x/c)   for 0 ≤ x < c
     = (1/c)(1 + x/c)   for −c < x < 0
     = 0                otherwise, i.e., for x ≤ −c or x ≥ c

Example 4.9: A random variable X has a Laplace distribution with density f(x) = a e^{−λ|x|}, where λ is a positive parameter. Find a?

Solution: Since the pdf is f(x) = a e^{−λ|x|}, we have

∫_{−∞}^{∞} a e^{−λ|x|} dx = 1

Since e^{−λ|x|} is an even function of x, this gives

2a ∫_0^{∞} e^{−λx} dx = 1,  i.e.,  2a [−e^{−λx}/λ]_0^{∞} = 2a/λ = 1

Hence

a = λ/2

4.8 Mean and Variance of a Continuous Random Variable

Let X be a continuous random variable. If f(x) is the pdf of X, then the mean and variance of X are given as

Arithmetic mean = μ = ∫_{−∞}^{∞} x f(x) dx

Variance = Var[X] = σ² = ∫_{−∞}^{∞} x² f(x) dx − μ²
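
For densities without a convenient antiderivative these integrals can be approximated numerically. The Python sketch below (illustrative; it again uses the hypothetical density f(x) = 2x on (0, 1)) recovers the exact values μ = 2/3 and σ² = 1/18:

    def integral(f, a, b, n=100_000):
        # Midpoint-rule approximation of the integral of f over [a, b].
        h = (b - a) / n
        return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

    # Density f(x) = 2x on (0, 1): exact mean 2/3 and variance 1/18.
    f = lambda x: 2.0 * x
    mu = integral(lambda x: x * f(x), 0.0, 1.0)
    var = integral(lambda x: x * x * f(x), 0.0, 1.0) - mu ** 2
    print(round(mu, 4), round(var, 4))          # about 0.6667 and 0.0556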

4.8.1 Solved Examples

Example 4.10: If

f(x) = x/6 + k,  0 ≤ x ≤ 3
     = 0,        elsewhere

is a pdf, find the value of k. Also find P(1≤x≤2)?

Solution: f(x) is a pdf. Therefore we have

∫_{−∞}^{∞} f(x) dx = 1

or

∫_{−∞}^{0} f(x) dx + ∫_0^3 (x/6 + k) dx + ∫_3^{∞} f(x) dx = 1

or

0 + [x²/12 + kx]_0^3 + 0 = 1

or

9/12 + 3k = 1,  i.e.,  3k = 1 − 3/4 = 1/4

or

k = 1/12

Now

P(1 ≤ x ≤ 2) = ∫_1^2 f(x) dx = ∫_1^2 (x/6 + 1/12) dx = [x²/12 + x/12]_1^2 = (4/12 + 2/12) − (1/12 + 1/12) = 6/12 − 2/12 = 4/12 = 1/3

Example 4.11: Given the cumulative distribution function

F(x) = 0,    for x < 0
     = x²,   for 0 ≤ x ≤ 1
     = 1,    for x > 1

1. Find the pdf?

2. Find P(0.5 < X ≤ 0.75)?

Solution:

1. Since f(x) = d/dx (F(x)), we get
   f(x) = 0 for x < 0, 2x for 0 ≤ x ≤ 1, and 0 for x > 1,
   or simply f(x) = 2x for 0 ≤ x ≤ 1 and 0 otherwise.

2. P(0.5 < X ≤ 0.75) = P(X ≤ 0.75) − P(X ≤ 0.5) = F(0.75) − F(0.5) = (0.75)² − (0.5)² = 0.5625 − 0.25 = 0.3125

Example 4.12: Find k such that f(x) is a pdf of a continuous random variable X where f(x) is defined as follows:

f(x) = k x e^{−x},  0 < x < 1
     = 0,           otherwise

Also find the mean.

Solution:

1. Since f(x) is a pdf, we have ∫_{−∞}^{∞} f(x) dx = 1
   i.e., ∫_0^1 k x e^{−x} dx = 1
   or k [−x e^{−x} − e^{−x}]_0^1 = 1
   or k [(−e^{−1} − e^{−1}) − (0 − 1)] = 1
   or k (1 − 2e^{−1}) = 1
   Hence k = e/(e − 2)

2. Mean μ = ∫_{−∞}^{∞} x f(x) dx = (e/(e − 2)) ∫_0^1 x² e^{−x} dx
   = (e/(e − 2)) [−x² e^{−x} − 2x e^{−x} − 2 e^{−x}]_0^1
   = (e/(e − 2)) [(−e^{−1} − 2e^{−1} − 2e^{−1}) − (−2)]
   = (e/(e − 2)) (2 − 5e^{−1}) = (2e − 5)/(e − 2)

Example 4.13: The distribution function of a random variable X is given by

F(x) = 1 − (1 + x) e^{−x},  for x ≥ 0
     = 0,                   for x < 0
Find the corresponding density function of random variable X?

Solution: We have

f(x) = d/dx (F(x)) = d/dx (1 − (1 + x) e^{−x}) = d/dx (1 − e^{−x} − x e^{−x}) = 0 − (−e^{−x}) − (−x e^{−x} + e^{−x}) = e^{−x} + x e^{−x} − e^{−x} = x e^{−x}

Hence f(x) = x e^{−x} for x ≥ 0, and 0 otherwise, is the required density function.

Example 4.14: Find the cumulative distribution function for the following probability function of a random variable X?

f(x) = 6x(1 − x),  0 ≤ x ≤ 1
     = 0,          otherwise

Solution: Since

f(x) = d/dx (F(x))

We have

F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−∞}^{0} f(t) dt + ∫_0^x f(t) dt = 0 + ∫_0^x 6t(1 − t) dt = 6 [t²/2 − t³/3]_0^x = 3x² − 2x³

Hence

F(x) = 3x² − 2x³,  0 ≤ x ≤ 1

Example 4.15: For the density function

f(x) = a e^{−|x|},  −∞ < x < ∞

of a random variable X, find (1) a, (2) the mean, (3) the variance.

Solution:

1. Since f(x) is a pdf, we have
   ∫_{−∞}^{∞} f(x) dx = 1
   or ∫_{−∞}^{∞} a e^{−|x|} dx = 1
   or 2a ∫_0^{∞} e^{−x} dx = 1  (since e^{−|x|} is an even function)
   or 2a [−e^{−x}]_0^{∞} = 2a (0 − (−1)) = 2a = 1
   Hence a = 1/2

2. Mean μ = ∫_{−∞}^{∞} x f(x) dx = ∫_{−∞}^{∞} x (1/2) e^{−|x|} dx = 0  (since x e^{−|x|} is an odd function)

3. Variance = ∫_{−∞}^{∞} x² f(x) dx − μ² = (1/2) ∫_{−∞}^{∞} x² e^{−|x|} dx − 0 = ∫_0^{∞} x² e^{−x} dx  (since x² e^{−|x|} is an even function)
   = [−x² e^{−x} − 2x e^{−x} − 2 e^{−x}]_0^{∞} = 0 − (−2) = 2

Exercise 4.2

1. A continuous random variable X has a pdf

f(x) = 3x²,  0 < x ≤ 1
     = 0,    otherwise.


Find a and b such that P(X ≤ a) = P(X > a) and P(X > b) = 0.05?

Ans: a = (1/2)^{1/3}, b = (19/20)^{1/3}

2. A random variable X has the pdf

f(x) = 2x,  0 < x < 1
     = 0,   otherwise


Find

a. P(X < 1/2)

b. P(1/4 < X < 1/2)

Ans: (a) 1/4; (b) 3/16

3. Find the cumulative distribution function for the following probability distribution function of a random variable x?

f(x) = (x/4) e^{−x/2},  x > 0
     = 0,               otherwise

Ans: F(x) = 1 − e^{−x/2} − (x/2) e^{−x/2},  0 < x < ∞

4. If the pdf of a random variable is given by

f(x) = k(1 − x²),  0 < x < 1
     = 0,          otherwise


Find

a. k

b. The distribution function of the random variable.

Ans: (a) k = 3/2; (b) F(x) = (3x − x³)/2, 0 ≤ x ≤ 1

5. If X has the pdf

f(x) = k e^{−3x},  x > 0
     = 0,          otherwise


Find k and P(0.5≤x≤1)

Ans: k = 3, P(0.5 ≤ x ≤ 1) ≈ 0.173

6. A continuous random variable X that can assume any value between x=2 and x=5 has a density function given by

f(x)=k(1+x)



Find P(x<4)?

Ans: k = 2/27; P(x < 4) = 16/27

7. Find the pdf for the random variable whose distribution function is given by

F(x) = 0,  for x ≤ 0
     = x,  for 0 < x < 1
     = 1,  for x ≥ 1

Ans: f(x) = 1 for 0 < x < 1, and 0 otherwise

8. A continuous random variable X has the following pdf:

f(x) = 1/2,           −1 ≤ x ≤ 0
     = (1/4)(2 − x),   0 < x < 2
     = 0,              elsewhere


Obtain the distribution of X.

Ans: F(x) = (x + 1)/2,           −1 ≤ x ≤ 0
          = (1/8)(4 + 4x − x²),   0 < x < 2
          = 1,                    if x ≥ 2

9. A random process gives measurements x between 0 and 1 with a pdf

f(x) = 12x³ − 21x² + 10x,  0 ≤ x ≤ 1
     = 0,                  elsewhere


Find

a. P(X ≤ 1/2)

b. P(X > 1/2)

c. a number k such that P(X ≤ k) = 1/2

Ans: (a) 9/16; (b) 7/16; (c) 1/2

10. Let X be a continuous random variable with pdf

f(x) = ax,          0 ≤ x ≤ 1
     = a,           1 ≤ x ≤ 2
     = −ax + 3a,    2 ≤ x ≤ 3
     = 0,           elsewhere

a. Find the value of a

b. Find P(X<1.5)

Ans: a = 1/2, P(X < 1.5) = 1/2

Note: If X is continuous random variable and f(x) is the pdf of X defined in the interval (a, b) then

a. The median M of the distribution is obtained by solving ∫_a^M f(x) dx = 1/2 (equivalently ∫_M^b f(x) dx = 1/2) for M.

b. The mode of the distribution is the value of x (a < x < b) for which f′(x) = 0 and f″(x) < 0 hold, i.e., for which f(x) is maximum.

c. Mean deviation about the mean is given by

Mean deviation = ∫_a^b |x − μ| f(x) dx

d. The harmonic mean H is given by 1/H = ∫_a^b (1/x) f(x) dx.

e. The geometric mean G is given by log G = ∫_a^b (log x) f(x) dx.
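
As an illustration of item (a), the median can be located numerically by bisection on F(M) − 1/2. The Python sketch below (illustrative; it assumes the density f(x) = 2x on (0, 1), whose exact median is 1/√2) shows the idea:

    def integral(f, a, b, n=10_000):
        # Midpoint-rule approximation of the integral of f over [a, b].
        h = (b - a) / n
        return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

    # Median of f(x) = 2x on (0, 1): solve F(M) = 1/2 by bisection.
    f = lambda x: 2.0 * x
    lo, hi = 0.0, 1.0
    for _ in range(50):
        mid = (lo + hi) / 2.0
        if integral(f, 0.0, mid) < 0.5:
            lo = mid
        else:
            hi = mid
    print(round((lo + hi) / 2.0, 4))            # about 0.7071, i.e., 1/sqrt(2)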

4.9 Joint Distributions

In the preceding sections we have discussed the one-dimensional random variable as a real-valued function defined over the sample space of an experiment. The distributions of one-dimensional random variables are known as univariate distributions. In this section we shall be concerned with bivariate distributions. If X and Y are random variables defined over the sample space of an experiment, then (X, Y) is a two-dimensional random variable, or a bivariate random variable. A system of two random variables X and Y can be interpreted geometrically as a random point (X, Y) in the xy-plane.

If the number of possible values of (X, Y) is finite, then (X, Y) is called a two-dimensional discrete random variable. If RX denotes the range space of X and RY the range space of Y, then the Cartesian product of RX and RY is the range space of (X, Y). If (X, Y) takes all the values in a region R of the xy-plane, then (X, Y) is called a two-dimensional continuous random variable.

If X and Y are discrete random variables, then P(X=x, Y=y) is the probability of the intersection of the events X=x and Y=y. Similarly we can define probabilities for continuous random variables. It is preferable to express the probabilities by means of a function with values f(x, y) = P(X=x, Y=y) for any pair of values (x, y) within the range of the random variables X and Y.

4.9.1 Joint Probability Function

Definition 4.10

(Joint probability function) If X and Y are discrete random variables, then the function f(xi, yj) = P(X=xi, Y=yj) = pij is called the joint probability function of the discrete random variables X and Y if and only if f(xi, yj) satisfies the following conditions:

1. f(xi, yj) = pij ≥ 0, for all i, j

2. Σ_i Σ_j pij = 1

The joint probability function of the discrete random variables X and Y is also known as the joint pmf of X and Y.

4.9.2 Joint Probability Distribution of Discrete Random Variables

Definition 4.11

(Joint probability distribution) If X and Y are discrete random variables, then the set of triples {(xi, yj, pij)}, i = 1, 2, …, n, j = 1, 2, …, m, is called the joint probability distribution of X and Y.

The joint probability distribution can be represented in the form of a table as shown below:

X \ Y     y1     y2     …     yj     …     ym     P(xi)
x1        p11    p12    …     p1j    …     p1m    p1.
x2        p21    p22    …     p2j    …     p2m    p2.
⋮
xi        pi1    pi2    …     pij    …     pim    pi.
⋮
xn        pn1    pn2    …     pnj    …     pnm    pn.
P(yj)     p.1    p.2    …     p.j    …     p.m    1

4.9.3 Marginal Probability Function of a Discrete Random Variables

Definition 4.12

(Marginal probability function) If P(X=xi, Y=yj) = pij is the joint probability distribution of two discrete random variables X and Y, then the marginal probability function of X is given by

P(X = xi) = pi. = pi1 + pi2 + ⋯ + pim = Σ_j pij

The marginal probability function of X is also denoted by f(x). The set {xi, pi.}, is called the marginal distribution of X.

Similarly the marginal probability function of Y is given by

P(Y = yj) = p.j = Σ_i pij = p1j + p2j + ⋯ + pnj

The marginal probability function of Y is also denoted by f(y). The set {yj, p.j}, is called the marginal distribution of Y.
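
The marginal distributions are obtained simply by summing the joint probabilities along rows and columns. The Python sketch below (illustrative; the joint pmf used is a hypothetical uniform one, not taken from the text) shows the computation:

    from fractions import Fraction
    from collections import defaultdict

    # A small hypothetical joint pmf {(x_i, y_j): p_ij}; the twelve entries sum to 1.
    joint = {(x, y): Fraction(1, 12) for x in (0, 1, 2) for y in (1, 2, 3, 4)}

    marg_x, marg_y = defaultdict(Fraction), defaultdict(Fraction)
    for (x, y), p in joint.items():
        marg_x[x] += p    # p_i. = sum over j of p_ij (marginal pmf of X)
        marg_y[y] += p    # p_.j = sum over i of p_ij (marginal pmf of Y)

    print(dict(marg_x))   # every x has marginal probability 1/3
    print(dict(marg_y))   # every y has marginal probability 1/4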

4.9.4 Joint Distribution Function of Discrete Random Variables

Definition 4.13

If X and Y are discrete random variables, the function given by

F(x, y) = P(X ≤ x, Y ≤ y) = Σ_{s ≤ x} Σ_{t ≤ y} f(s, t),  for −∞ < x < ∞, −∞ < y < ∞,

where f(s, t) is the value of the joint probability distribution of X and Y at (s, t), is called the joint distribution function of X and Y (Fig. 4.1).

Figure 4.1 Joint distribution function.

F(x, y) can be interpreted geometrically as the probability of a random point (X, Y) falling in a quadrant whose vertex is (x, y).

The joint distribution function F(x, y) possesses the following properties:

1. F(−∞, y) = F(x, −∞) = 0

2. F(−∞, −∞) = 0

3. F(∞, ∞) = 1

4. P(a1 < X ≤ b1, a2 < Y ≤ b2) = F(b1, b2) − F(a1, b2) − F(b1, a2) + F(a1, a2)
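
Property 4 can be verified directly on a small joint pmf. The Python sketch below (illustrative; it reuses the hypothetical uniform joint pmf from the previous sketch) computes F(x, y) by summation and checks the rectangle probability:

    from fractions import Fraction

    def F(joint, x, y):
        # Joint distribution function F(x, y) = P(X <= x, Y <= y).
        return sum(p for (s, t), p in joint.items() if s <= x and t <= y)

    # The same hypothetical uniform joint pmf: twelve equally likely pairs (x, y).
    joint = {(x, y): Fraction(1, 12) for x in (0, 1, 2) for y in (1, 2, 3, 4)}
    print(F(joint, 1, 2))                                                    # 1/3
    # Property 4 with a1 = 0, b1 = 2, a2 = 1, b2 = 3:
    print(F(joint, 2, 3) - F(joint, 0, 3) - F(joint, 2, 1) + F(joint, 0, 1)) # 1/3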

Example 4.16: For the following joint distribution of (X, Y). Find (1) F(1, 1), (2) F(1, 3)?

x \ y     1       2       3       4
0        1/24    1/12    1/12    1/24
1        1/12    1/6     1/6     1/12
2        1/24    1/12    1/12    1/24

Solution:

1. We have

F(1, 1) = P(X ≤ 1, Y ≤ 1) = P(0, 1) + P(1, 1) = 1/24 + 1/12 = 3/24 = 1/8