The study of probability distributions of a random variable is essentially the study of some numerical characteristics associated with them. These so-called parameters of the distribution play a key role in mathematical statistics. In Section 3.2 we introduce some of these parameters, namely, moments and order parameters, and investigate their properties. In Section 3.3 the idea of generating functions is introduced. In particular, we study probability generating functions, moment generating functions, and characteristic functions. Section 3.4 deals with some moment inequalities.
In this section we investigate some numerical characteristics, called parameters, associated with the distribution of an RV X. These parameters are (a) moments and their functions and (b) order parameters. We will concentrate mainly on moments and their properties.
Let X be a random variable of the discrete type with probability mass function p_k = P{X = x_k}, k = 1, 2, …. If
$$\sum_{k=1}^{\infty} |x_k| p_k < \infty,$$
we say that the expected value (or the mean or the mathematical expectation) of X exists and write
$$EX = \sum_{k=1}^{\infty} x_k p_k.$$
Note that the series Σ x_k p_k may converge but the series Σ |x_k| p_k may not. In that case we say that EX does not exist.
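The distinction between convergence and absolute convergence is easy to see numerically. In the following sketch (a hypothetical example, not one from the text), X takes the values x_k = (−1)^{k+1} 2^k/k with probabilities p_k = 2^{−k}, so x_k p_k = (−1)^{k+1}/k: the signed series is the alternating harmonic series (convergent, to ln 2), while the absolute series is the harmonic series (divergent), so EX does not exist.

```python
import math

# Hypothetical RV: x_k = (-1)**(k+1) * 2**k / k, p_k = 2**(-k), k >= 1.
# Then x_k * p_k = (-1)**(k+1) / k, so the signed series converges (to ln 2)
# while sum(|x_k| * p_k) is the divergent harmonic series: EX does not exist.

def partial_sums(n_terms):
    signed = 0.0    # partial sum of x_k * p_k
    absolute = 0.0  # partial sum of |x_k| * p_k
    for k in range(1, n_terms + 1):
        term = 1.0 / k
        signed += (-1) ** (k + 1) * term
        absolute += term
    return signed, absolute

s1, a1 = partial_sums(1000)
_, a2 = partial_sums(100000)
print(f"signed sum (10^3 terms) = {s1:.6f}, ln 2 = {math.log(2):.6f}")
print(f"absolute sum: {a1:.2f} at 10^3 terms, {a2:.2f} at 10^5 terms")
```

The signed partial sums stabilize near ln 2, while the absolute partial sums keep growing without bound, which is exactly why the definition requires absolute convergence.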
Show that E|X|^α < ∞ for α < k. Find the quantile of order p for the RV X.
Show that the moment of order n exists if and only if n < β. Let β > 2. Find the mean and the variance of the distribution.
show that moments of all orders exist. Find the mean and the variance of X.
and
where 0 ≤ p ≤ 1, q = 1 − p.
Show that and
Here α_3 is known as the coefficient of skewness and is sometimes used as a measure of asymmetry, and α_4 is known as kurtosis and is used to measure the peakedness ("flatness of the top") of a distribution. Compute α_3 and α_4 for the PMFs of Problems 8 and 9.
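The displayed formulas for α_3 and α_4 are elided above, so the following sketch assumes the standard definitions α_3 = μ_3/μ_2^{3/2} and α_4 = μ_4/μ_2², where μ_r is the r-th central moment. It computes both for a Bernoulli(p) variable, where the closed forms α_3 = (q − p)/√(pq) and α_4 = (1 − 3pq)/(pq) are known.

```python
import math

# Assumed standard definitions (the text's displayed formulas are elided):
#   alpha_3 = mu_3 / mu_2**1.5   (coefficient of skewness)
#   alpha_4 = mu_4 / mu_2**2     (kurtosis)

def central_moment(values, probs, r):
    mean = sum(x * p for x, p in zip(values, probs))
    return sum((x - mean) ** r * p for x, p in zip(values, probs))

def alpha3(values, probs):
    return central_moment(values, probs, 3) / central_moment(values, probs, 2) ** 1.5

def alpha4(values, probs):
    return central_moment(values, probs, 4) / central_moment(values, probs, 2) ** 2

# Bernoulli(p) as a test case: alpha_3 = (q - p)/sqrt(pq), alpha_4 = (1 - 3pq)/(pq).
p, q = 0.3, 0.7
vals, probs = [0, 1], [q, p]
print(alpha3(vals, probs), (q - p) / math.sqrt(p * q))
print(alpha4(vals, probs), (1 - 3 * p * q) / (p * q))
```

For p = q = 1/2 the distribution is symmetric and α_3 = 0, as a measure of asymmetry should give.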
In this section we consider some functions that generate probabilities or moments of an RV. The simplest type of generating function in probability theory is the one associated with integer-valued RVs. Let X be an RV, and let.
with .
does the MGF exist?
Set P{X > j} = q_j, j = 0, 1, 2, …. Clearly q_j = p_{j+1} + p_{j+2} + ⋯, j ≥ 0. Write Q(s) = Σ_{j=0}^∞ q_j s^j. Then the series for Q(s) converges in |s| < 1. Show that
$$Q(s) = \frac{1 - P(s)}{1 - s},$$
where P(s) is the PGF of X. Find the mean and the variance of X (when they exist) in terms of Q and its derivatives.
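The identity this problem points toward is, in its standard form, Q(s) = (1 − P(s))/(1 − s) (stated here as an assumption, since the displayed equation is elided in the text). The sketch below checks it numerically for a geometric distribution P{X = j} = (1 − θ)θ^j, j = 0, 1, 2, …, for which P{X > j} = θ^{j+1}, and also checks that Q(1) recovers EX = θ/(1 − θ) as a sum of tail probabilities.

```python
# Numeric check of the assumed tail-sum identity Q(s) = (1 - P(s)) / (1 - s),
# using a geometric distribution P{X = j} = (1 - theta) * theta**j, j >= 0.

theta = 0.6
N = 2000  # truncation point; the remaining tails are negligible

def pgf(s):
    return sum((1 - theta) * theta ** j * s ** j for j in range(N))

def tail_gf(s):
    # Q(s) = sum_j P{X > j} * s**j, and here P{X > j} = theta**(j + 1)
    return sum(theta ** (j + 1) * s ** j for j in range(N))

s = 0.5
print(tail_gf(s), (1 - pgf(s)) / (1 - s))  # the two sides agree
print(tail_gf(1.0))  # Q(1) = sum of tail probabilities = EX = theta/(1 - theta)
```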
where and find the PGF and the MGF in terms of f.
show that the MGF exists and equals
$$M(t) = P(e^t),$$
where P is the PGF of X.
for any fixed s, 0 < s < t_0, and for each integer n ≥ 1. Expanding e^{tX} in a power series, show that, for |t| < s,
(Since a power series can be differentiated term by term within the interval of convergence, it follows that for |t| < s,
for each integer k ≥ 1.) (Roy, LePage, and Moore [95])
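The elided conclusion is presumably the standard fact that the k-th derivative of the MGF at 0 recovers the k-th moment, M^{(k)}(0) = EX^k. The sketch below checks this by finite differences for a Bernoulli(p) variable, whose MGF is M(t) = q + pe^t, so that EX^k = p for every k ≥ 1.

```python
import math

# Assumed standard fact: M^(k)(0) = E X^k. Checked here for Bernoulli(p),
# whose MGF is M(t) = q + p * e^t, so every moment equals p.

p, q = 0.3, 0.7

def M(t):
    return q + p * math.exp(t)

h = 1e-4
first = (M(h) - M(-h)) / (2 * h)                # central difference ~ M'(0) = EX
second = (M(h) - 2 * M(0.0) + M(-h)) / h ** 2   # ~ M''(0) = EX^2
print(first, second)  # both approximately p = 0.3
```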
Show that X must be degenerate at n.
[Hint: Prove and use the fact that if EX^k < ∞ for all k, then
Write P(s) as
Let
be the probability generating function of p(n, k). Show that
(P_n is the generating function of Kendall's τ-statistic.)
with u_0(0) = 1, u_0(k) = 0 otherwise, and u_n(k) = 0 for k < 0. Let P_n(s) be the generating function of {u_n}. Show that
If p_n(k) = u_n(k)/2^n, find {p_n(k)} for n = 2, 3, 4. (P_n is the generating function of the one-sample Wilcoxon test statistic.)
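The recursion asked for is elided above; the standard signed-rank recursion (assumed here) is u_n(k) = u_{n−1}(k) + u_{n−1}(k − n), under which u_n(k) counts the subsets of {1, …, n} with sum k, and p_n(k) = u_n(k)/2^n. The sketch below tabulates {p_n(k)} for n = 2, 3, 4 as the problem requests.

```python
# Assumed recursion (the text's displayed equation is elided):
#     u_n(k) = u_{n-1}(k) + u_{n-1}(k - n),  u_0(0) = 1,
# so u_n(k) counts subsets of {1, ..., n} with sum k, and p_n(k) = u_n(k)/2**n.

def u(n):
    counts = {0: 1}  # u_0
    for m in range(1, n + 1):
        top = m * (m + 1) // 2  # largest attainable sum after step m
        counts = {k: counts.get(k, 0) + counts.get(k - m, 0)
                  for k in range(top + 1)}
    return counts

for n in (2, 3, 4):
    total = 2 ** n
    print(n, {k: f"{c}/{total}" for k, c in sorted(u(n).items())})
```

For n = 2, for instance, every value k = 0, 1, 2, 3 gets probability 1/4, since each subset sum of {1, 2} occurs exactly once.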
In this section we derive some inequalities for moments of an RV. The main result of this section is Theorem 1 (and its corollary), which gives a bound for tail probability in terms of some moment of the random variable.
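A bound of this type (Markov's inequality, P{|X| ≥ ε} ≤ E|X|^r/ε^r, which is presumably the shape of Theorem 1's corollary) can be checked directly on a small discrete distribution; the distribution below is a hypothetical example, not one from the text.

```python
# Direct check of the moment tail bound P{|X| >= eps} <= E|X|**r / eps**r
# (a Markov-type inequality) on a small hypothetical discrete distribution.

vals = [0, 1, 2, 5]
probs = [0.4, 0.3, 0.2, 0.1]

def tail(eps):
    return sum(p for x, p in zip(vals, probs) if abs(x) >= eps)

def abs_moment(r):
    return sum(abs(x) ** r * p for x, p in zip(vals, probs))

for eps in (1, 2, 4):
    for r in (1, 2):
        bound = abs_moment(r) / eps ** r
        assert tail(eps) <= bound
        print(f"eps={eps}, r={r}: P(|X|>=eps)={tail(eps):.2f} <= {bound:.3f}")
```

Note that which r gives the sharper bound depends on ε: higher moments help for large ε, lower moments for small ε, which is the point of comparing the bounds in the problems that follow.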
where λ ≥ 0 is an integer, show that
In other words, show that bound (7) is better than bound (3) if and worse if 1 ≤ . Construct an example to show that the last inequalities cannot be improved.
where .