19

When Will the Last One Fail?

19.1 THE PROBLEM

All the stuff we use in our everyday lives eventually stops working. Eventually, we stop working. Who would deny that? To be specific but less personal, let’s consider apparently identical toasters, coming right off the production line one after the other. Suppose we give one of these toasters to each of N people so they can make toast for breakfast each morning. Each of the N toasters therefore receives the same daily use—but they all last different times. One will be the first to fail, and then sometime later a second toaster will fail, and so on until the penultimate one fails, and then, alas, the Nth and last surviving toaster loses its ability to get hot.

It is an observed fact—one that can be explained theoretically if some quite reasonable physical assumptions are made, which I won’t pursue here (see any good undergraduate book on probability theory)—that many different things (including toasters) fail in accordance with an exponential probability law. That is, if we write the lifetime of a toaster as T, then T is a random variable such that

$$\text{Prob}(T > t) = e^{-\lambda t}, \qquad t \ge 0,$$

where λ is some positive constant. In particular, we then have the following two statements: $\text{Prob}(T > 0) = 1$ and $\lim_{t \to \infty} \text{Prob}(T > t) = 0$, both of which make physical sense. The first statement says a toaster won't fail before we first turn it on, and the second one says nothing lasts forever.
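For example, if the average lifetime of a toaster is five years (which, as we'll see in a moment, corresponds to $\lambda = 1/5$ per year), then the probability a given toaster survives ten years is $e^{-10/5} = e^{-2} \approx 0.135$.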

Now, what is λ? Obviously, it has the units of reciprocal time, since the exponent λt must be dimensionless, but that doesn’t tell us what λ is. Notice that

$$\text{Prob}(T > t) = 1 - \text{Prob}(T \le t) = 1 - F_T(t),$$

where $F_T(t)$ is the probability distribution function of T. Since we have

$$F_T(t) = 1 - \text{Prob}(T > t) = 1 - e^{-\lambda t},$$

then the probability density function of T is

$$f_T(t) = \frac{dF_T}{dt} = \lambda e^{-\lambda t}, \qquad t \ge 0.$$

Now, the expected value of T, that is, the average lifetime of a toaster, is given by

$$E(T) = \int_0^\infty t\, f_T(t)\, dt = \int_0^\infty t\, \lambda e^{-\lambda t}\, dt = \frac{1}{\lambda}.$$

In other words, λ is the reciprocal of the average lifetime. If we agree to measure time in units of the average lifetime, then with no loss in generality we can take λ = 1. So, from now on I’ll write

$$F_T(t) = 1 - e^{-t}$$

and

$$f_T(t) = e^{-t}, \qquad t \ge 0.$$

One of the curious things about how a collection of seemingly identical things fails is that it can take a very long time, relative to the average failure time for an individual thing, for the last failure to occur. This is related to what is called the memoryless property of an exponentially distributed random variable. Here's what that means. Toasters, and other things that obey an exponential law, seem to fail not because they gradually wear out but because of some sudden, out-of-the-blue event. That is, the probability a thing will survive for an additional time interval of at least b is independent of how long it has already been in use. Suppose it has been in use for a time interval of at least a; then we can express this last statement mathematically with the conditional probability (look back at Problem 17) equation

$$\text{Prob}(T > a + b \mid T > a) = \text{Prob}(T > b),$$

and this holds for any non-negative values for a and b.

To show that this is the case for an exponential probability law is a matter of simply calculating both sides of the conditional equation. So, on the right we have

$$\text{Prob}(T > b) = e^{-b}.$$

And on the left

$$\text{Prob}(T > a + b \mid T > a) = \frac{\text{Prob}(T > a + b \text{ and } T > a)}{\text{Prob}(T > a)} = \frac{\text{Prob}(T > a + b)}{\text{Prob}(T > a)} = \frac{e^{-(a+b)}}{e^{-a}} = e^{-b},$$

and the conditional equation is confirmed. It can be shown, in fact, that the conditional equation for the lack-of-memory property holds only (in the case of continuous random variables) for the exponential law.
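If you find this property surprising, it's easy to watch it happen numerically. Here's a minimal Python sketch along those lines; the particular values a = 0.7 and b = 1.3 (and the seed) are arbitrary choices, and both printed estimates should land near $e^{-1.3} \approx 0.2725$.

```python
import random

random.seed(1)                       # for reproducibility
a, b = 0.7, 1.3                      # arbitrary illustrative values
n = 1_000_000
samples = [random.expovariate(1.0) for _ in range(n)]  # lambda = 1 lifetimes

survivors = [t for t in samples if t > a]                  # those with T > a
lhs = sum(t > a + b for t in survivors) / len(survivors)   # Prob(T > a+b | T > a)
rhs = sum(t > b for t in samples) / n                      # Prob(T > b)

print(lhs, rhs)  # both approximately e^{-1.3} = 0.2725...
```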

Now, suppose that instead of toasters we imagine our things are computer chips, and that we have a vitally important security system that uses one of these chips in a central way. If the chip fails, the security system goes off-line. Thus, to improve reliability we’ve built N chips into the system, with initially N − 1 of them serving as backups. Thereafter, when the current in-use chip fails, one of the remaining backup chips is automatically switched in as a replacement. We do imagine that, even though only one chip is functional at a time, all are powered up at the same time (that is, the backup chips are “idle but hot”). This architecture motivates the first question of this problem:

(1) What’s the average time until the last chip fails?

But of course we don’t want to wait until the last chip fails before we plug in new backup chips to replace the failed ones; rather, when the penultimate (next-to-last) chip failure occurs we’ll plug in new chips to replace the failed ones. So, a second question is:

(2) What’s the average time until the penultimate chip failure occurs?

Calculate numerical values for both questions for N = 2, 3, 4, and 5. As a partial check on the numbers, for any given value of N the answer to question (2) should obviously be less than the answer to question (1).
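Both questions also yield to brute-force simulation, which gives a handy check on the theory to come. Here's a minimal Python sketch: since the backups are "idle but hot," all N lifetimes run simultaneously from time zero, so we simply draw N unit-mean exponential lifetimes, sort them, and average the largest (question 1) and second-largest (question 2) values over many trials. The trial count is an arbitrary choice.

```python
import random

random.seed(2)
trials = 200_000
for N in (2, 3, 4, 5):
    total_last, total_penult = 0.0, 0.0
    for _ in range(trials):
        lifetimes = sorted(random.expovariate(1.0) for _ in range(N))
        total_last += lifetimes[-1]     # last failure time (question 1)
        total_penult += lifetimes[-2]   # penultimate failure time (question 2)
    print(N, total_last / trials, total_penult / trials)
```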

19.2 THEORETICAL ANALYSES

Back in Problem 16 we derived the distribution functions for both the minimum and the maximum of N independent, identically distributed random variables; if U and V are the minimum and the maximum, respectively, of the N random variables $T_i$, $1 \le i \le N$, where $T_i$ is the lifetime of the ith chip, then

$$F_U(t) = 1 - [1 - F_T(t)]^N$$

and

$$F_V(t) = [F_T(t)]^N.$$

The minimum and the maximum of the N $T_i$ are, of course, the random variables for the first and the last failures, respectively. So, the densities of U and V are

$$f_U(t) = \frac{dF_U}{dt} = N[1 - F_T(t)]^{N-1} f_T(t)$$

and

$$f_V(t) = \frac{dF_V}{dt} = N[F_T(t)]^{N-1} f_T(t).$$

Or, substituting our expressions for $F_T(t)$ and $f_T(t)$ for toasters (now chips),

$$f_U(t) = N e^{-(N-1)t}\, e^{-t} = N e^{-Nt}$$

and

$$f_V(t) = N(1 - e^{-t})^{N-1}\, e^{-t}.$$
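As a quick sanity check, $f_V(t)$ does integrate to one, as any density must: with the substitution $u = 1 - e^{-t}$, so that $du = e^{-t}\, dt$,

$$\int_0^\infty N(1 - e^{-t})^{N-1} e^{-t}\, dt = \int_0^1 N u^{N-1}\, du = 1.$$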

To answer Question (1), it is V that interests us first.

The average time until the last chip failure is

$$E(V) = \int_0^\infty t\, f_V(t)\, dt = N \int_0^\infty t\,(1 - e^{-t})^{N-1} e^{-t}\, dt,$$

an integral that looks harder to do than it is. Here’s one way to evaluate it. The binomial theorem tells us that

$$(a + b)^n = \sum_{k=0}^{n} \binom{n}{k} a^k b^{n-k},$$

and so, with n = N − 1, a = 1, and b = $-e^{-t}$, we have

$$(1 - e^{-t})^{N-1} = \sum_{k=0}^{N-1} \binom{N-1}{k} (-1)^{N-1-k} e^{-(N-1-k)t}.$$

So,

$$E(V) = N \sum_{k=0}^{N-1} \binom{N-1}{k} (-1)^{N-1-k} \int_0^\infty t\, e^{-(N-k)t}\, dt.$$

From integral tables, for c a negative constant, we have

$$\int_0^\infty t\, e^{ct}\, dt = \frac{1}{c^2},$$

and so, with c = −(N − k),

$$\int_0^\infty t\, e^{-(N-k)t}\, dt = \frac{1}{(N-k)^2}.$$

Thus,

$$E(V) = N \sum_{k=0}^{N-1} \binom{N-1}{k} \frac{(-1)^{N-1-k}}{(N-k)^2}.$$

This is easy to evaluate by hand for N = 2, 3, 4, and 5, and the result is (remember, the unit of time is the average lifetime of an individual chip):

N        E(V)
2        3/2 = 1.5
3        11/6 = 1.833
4        25/12 = 2.083
5        137/60 = 2.283
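If evaluating the sum by hand loses its charm as N grows, exact rational arithmetic will do it for us. Here's a short Python sketch (the function name EV is just my own label):

```python
from fractions import Fraction
from math import comb

def EV(N):
    """E(V) = N * sum_{k=0}^{N-1} C(N-1,k) * (-1)^(N-1-k) / (N-k)^2."""
    return N * sum(Fraction(comb(N - 1, k) * (-1) ** (N - 1 - k), (N - k) ** 2)
                   for k in range(N))

for N in (2, 3, 4, 5):
    print(N, EV(N))  # 3/2, 11/6, 25/12, 137/60
```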

For Question (2), let Z be the random variable that represents the lifetime of the penultimate chip survivor. The distribution function of Z is

$$F_Z(t) = [F_T(t)]^N + [F_T(t)]^{N-1}\,[1 - F_T(t)]\,N.$$

The second term on the right may need some explanation. The first factor is the probability that N − 1 of the chips have failed by time t, the second factor is the probability one chip has not failed by time t, and the third factor is present because there are N possibilities for which chip is the last chip surviving.

Simplifying,

$$F_Z(t) = [F_T(t)]^N + N[F_T(t)]^{N-1} - N[F_T(t)]^N = (1 - N)[F_T(t)]^N + N[F_T(t)]^{N-1}.$$

Differentiating gives us the density for Z:

$$f_Z(t) = (1 - N)N[F_T(t)]^{N-1} f_T(t) + N(N - 1)[F_T(t)]^{N-2} f_T(t).$$

Before continuing with these general expressions for $F_Z(t)$ and $f_Z(t)$, notice for the special case of N = 2 that Z is equivalent to the minimum function, U. That is, when there are just two chips to begin with, then the penultimate chip that fails is the first chip to fail. For N = 2, our two general expressions for $F_Z(t)$ and $f_Z(t)$ reduce to

$$F_Z(t) = -[F_T(t)]^2 + 2F_T(t)$$

and

$$f_Z(t) = -2F_T(t) f_T(t) + 2f_T(t).$$

Then, if you look back at the expressions for $F_U(t)$ and $f_U(t)$, for the special case of N = 2, you'll see that they reduce to

$$F_U(t) = 1 - [1 - F_T(t)]^2 = 2F_T(t) - [F_T(t)]^2$$

and

$$f_U(t) = 2[1 - F_T(t)] f_T(t) = -2F_T(t) f_T(t) + 2f_T(t),$$

which agrees with $F_Z(t)$ and $f_Z(t)$ for N = 2. This agreement is reassuring as, without such agreement, we'd know that we had made a mistake in our equations for Z.

Now, as we did before, substitution of the formulas for $F_T(t)$ and $f_T(t)$ into the general formula for $f_Z(t)$ gives

$$f_Z(t) = (1 - N)N(1 - e^{-t})^{N-1} e^{-t} + N(N - 1)(1 - e^{-t})^{N-2} e^{-t}$$

or

$$f_Z(t) = N(N - 1)(1 - e^{-t})^{N-2} e^{-2t}.$$
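Again as a sanity check, this density integrates to one: with the same substitution $u = 1 - e^{-t}$ as before (so $e^{-2t}\, dt = (1-u)\, du$),

$$\int_0^\infty N(N-1)(1 - e^{-t})^{N-2} e^{-2t}\, dt = \int_0^1 N(N-1)\, u^{N-2}(1-u)\, du = N(N-1)\left[\frac{1}{N-1} - \frac{1}{N}\right] = 1.$$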

So, the formal answer to Question (2) is

$$E(Z) = \int_0^\infty t\, f_Z(t)\, dt = N(N - 1) \int_0^\infty t\,(1 - e^{-t})^{N-2} e^{-2t}\, dt.$$

Using the binomial theorem as before,

$$(1 - e^{-t})^{N-2} = \sum_{k=0}^{N-2} \binom{N-2}{k} (-1)^{N-2-k} e^{-(N-2-k)t}.$$

So,

$$E(Z) = N(N - 1) \sum_{k=0}^{N-2} \binom{N-2}{k} (-1)^{N-2-k} \int_0^\infty t\, e^{-(N-k)t}\, dt.$$

But this last integral is the very same integral we got before, and so we immediately have

$$E(Z) = N(N - 1) \sum_{k=0}^{N-2} \binom{N-2}{k} \frac{(-1)^{N-2-k}}{(N-k)^2}.$$

This is just as easy to evaluate by hand as was E(V), and we quickly get:

N        E(Z)
2        1/2 = 0.5
3        5/6 = 0.833
4        13/12 = 1.083
5        77/60 = 1.283
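The same exact-arithmetic approach handles this sum; here's a companion sketch to the EV snippet above (again, the name EZ is just a label):

```python
from fractions import Fraction
from math import comb

def EZ(N):
    """E(Z) = N(N-1) * sum_{k=0}^{N-2} C(N-2,k) * (-1)^(N-2-k) / (N-k)^2."""
    return N * (N - 1) * sum(
        Fraction(comb(N - 2, k) * (-1) ** (N - 2 - k), (N - k) ** 2)
        for k in range(N - 1))

for N in (2, 3, 4, 5):
    print(N, EZ(N))  # 1/2, 5/6, 13/12, 77/60
```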

As we physically expected, E(Z) < E(V) for a given N. But the really striking relation that our two tables show is that E(Z) is less than E(V) by (it seems) exactly 1, always, independent of N. Can you prove this? In view, however, of the earlier discussion of the memoryless property of exponential random variables, do these numerical results now seem obvious?
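A computer check is no proof, of course, but exact arithmetic leaves little room for doubt about the pattern. Reusing the EV and EZ sketches from above:

```python
for N in range(2, 21):
    assert EV(N) - EZ(N) == 1   # exact rational arithmetic, no rounding
print("E(V) - E(Z) = 1 for every N from 2 through 20")
```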
