4

Three Gambling Problems Newton Would “Probably” Have Liked

4.1 THE PROBLEMS

To lay the groundwork for the first problem, suppose you give a fair die to each of a large number of people. Each person tosses his or her die, over and over, until a 6 appears. Once this has occurred for every person, each then reports on how many tosses he or she made. What do you think is the average number of tosses of a fair die to get the first 6? Since the probability of a 6 on a fair die is 1/6, most people immediately reply “six tosses,” whether or not they really understand why. In fact, the answer is easy to calculate, and six is indeed correct. Here’s how to do it (but note, this is not one of the three “Newton” problems we are going to solve in this puzzler, merely a warm-up for the ones we’ll really be interested in, in just a few moments).

When a die is tossed and a 6 appears, let’s call that a success and write it as S. If anything else appears we’ll call that a failure and write it as F. Mathematicians call sequences representing repeated trials with just two possible, independent outcomes on each trial—our S’s and F’s—Bernoulli trials, in honor of the Swiss mathematician James Bernoulli, mentioned in note 9 of the introduction. Now, let’s write Prob(S) = p and Prob(F) = 1 − p. We can write down all the possible sequences of S’s and F’s that represent the appearance of the first S as follows: S, FS, FFS, FFFS, and so on. That is, if n is the number of tosses until the first 6 appears (the first S), then we have Prob(n = 1) = p, Prob(n = 2) = (1 − p)p, Prob(n = 3) = (1 − p)²p, Prob(n = 4) = (1 − p)³p and, in general, Prob(n = k) = (1 − p)^(k−1) p.

Now, what we want to calculate is the average value of n, formally written as

\mu = \sum_{k=1}^{\infty} k\,\mathrm{Prob}(n = k) = \sum_{k=1}^{\infty} k(1-p)^{k-1}p.

We could directly evaluate this sum, but that’s not the approach I’ll use. Instead, I’m going to show you what might at first seem to be just a trick. But, as you’ll see when we get to the first “Newton” problem, the same trick will work on it, too, and as is well known in mathematics, any trick that you can use more than once is actually a method!

We start by observing the obvious: any sequence of S’s and F’s starts either with an S or with an F. If it starts with an S (with probability p), then the number of tosses until the first S is exactly 1. If it starts with an F (with probability 1 − p), then the average number of tosses until the first S is 1 + μ, because after that wasted first toss we are right back where we started. So, weighting these two particular results for the average number of tosses until the first S with their probabilities, we have

μ = 1(p) + (1 + μ) (1 − p) = p + 1 + μ − p − μp = 1 + μ − μp,

or

0 = 1 − μp,

and so, at last (and with surprising suddenness!),

\mu = \frac{1}{p},

which is 6 for a fair die, just as most people intuitively guess.
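If you would like to see this answer emerge from a simulation before tackling the harder problems, here is a minimal MATLAB sketch in the same spirit as the codes that appear later in this chapter (it is not one of those codes, and the variable names ngames and tosses are simply illustrative choices of mine):

check=1/6;                      % probability of a success (a 6) on any one toss
ngames=100000;                  % number of simulated games
tosses=zeros(1,ngames);         % tosses(j) = number of tosses in the jth game
for game=1:ngames
    n=1;
    while rand>=check           % keep tossing until a 6 finally appears
        n=n+1;
    end
    tosses(game)=n;
end
sum(tosses)/ngames              % should be close to 1/p = 6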

Now you are ready for the first Newton puzzle in this chapter. What is the average number of tosses of a fair die until two 6s appear consecutively for the first time? This is a more difficult problem than the warm-up because there now appears to be no obvious pattern to the sequences of S’s and F’s that end, for the first time, in SS. (Try it and see!) If you think intuition might still work here and say 12 tosses (perhaps because 6 + 6 = 12) or 36 tosses (perhaps because 6² = 36)—well, neither guess is correct. The correct answer is larger even than 36. What we need is some more analysis rather than mere guessing.

The title of this chapter says there are three gambling problems Newton would probably have liked. What are the other two? The second one is very easy to state, but maybe not so easy to get your head around. It would, I think, have given even the great Newton pause for thought. Given a coin with probability p for heads on a flip and probability 1 − p for tails, what is the probability that, if we flip the coin over and over, we’ll get the fourth head before the seventh tail? What is the numerical value of this probability if the coin is fair (p = 1/2)? What is the probability if the coin is “almost” fair (p = 0.45)?

Finally, the third Newton gambling problem is just as easy to state and at least as tricky to get a handle on. Three people—call them A, B, and C—take turns tossing a single fair die. A starts, followed by B, then C, then back to A, and so on. This continues until one of them tosses a 6. That player then drops out of the game, and the remaining two continue, still taking turns. Eventually one of them tosses a 6, and the game ends. What is the probability that it is A, and then B, who are the first to toss 6s? What is the probability that it is B, and then A?

4.2 THEORETICAL ANALYSIS 1

For the first Newton problem, let’s write P(n) for the probability our die first shows two 6s in a row at the completion of n tosses. Obviously, P(1) = 0 and P(2) = p². We can even do a couple more cases by hand simply by writing out all possible sequences of S and F of length n and observing which ones end with SS and contain no earlier consecutive S’s. For example, P(3) = (1 − p)p² for FSS, and P(4) = (1 − p)²p² + (1 − p)p³ for FFSS plus SFSS. For longer sequences, however, this enumeration approach becomes increasingly difficult. We need a different approach. Let’s try our trick again!

Now we begin by observing that any sequence of S’s and F’s starts with one of four possibilities: SS, FF, SF, or FS. If it starts with SS (with probability p²), then the average number of tosses until the first double S is 2. If it starts with either FF or SF (with probability (1 − p)² + p(1 − p)), then the average number of tosses until the first double S is 2 + μ, because in either case the last toss so far is an F and we are effectively starting over. And finally, if it starts with FS (with probability (1 − p)p), then with a following S we have the average number of tosses until the first double S as 3 (that is, with probability (1 − p)p²), or with a following F we have the average number of tosses until the first double S as 3 + μ (that is, with probability (1 − p)²p). And so, weighting the individual averages with their probabilities,

μ = (2)p² + (2 + μ)[(1 − p)² + p(1 − p)] + (3)(1 − p)p² + (3 + μ)[(1 − p)²p].

If you go through the algebra you should arrive at

\mu\,p^2(2 - p) = (2 - p)(1 + p),

or, at last,

\mu = \frac{1 + p}{p^2}.

That is, for a fair die, μ = 42 tosses, a number I’m pretty sure nobody (or, at least, hardly anybody) would have guessed!
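To see explicitly where the 42 comes from, just put p = 1/6 into this last formula:

\mu = \frac{1 + 1/6}{(1/6)^2} = \frac{7/6}{1/36} = \frac{7}{6}\cdot 36 = 42.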

4.3 COMPUTER SIMULATION 1

The code ds.m (for double S) simulates the repeated tossing of a fair die until the first double 6 appears. Indeed, if we call a sequence of such tosses a game, then ds.m simulates 100,000 games, all the while keeping track of how many tosses occur in each game until the first double 6 appears. Finally, it averages those 100,000 numbers to estimate the answer to our question. When ds.m was run, it produced an estimate of 41.8865 tosses, which is pretty close to our theoretical result. Here’s how the code works.

The first command calculates the probability of tossing a success (a 6) with a fair die and calls it check. Then, to simulate a game (the current game number is the value of the main loop variable loop, which runs from 1 to 100,000), the variables toss and flip2 are first initialized. The variable toss is the running count of tosses in the current game (it is the number of the toss about to be performed), and so toss is initially set equal to 1. A toss is performed by setting result equal to a random value between 0 and 1: if result is less than check, we have an S (and flip1 is set equal to 1); otherwise we have an F, and flip1 is set equal to 0. Since the code only has to remember the outcomes of the two most recent tosses, flip2, which represents the outcome of the previous toss, is initially set equal to 0.

To simulate a game, a while loop operates as long as keeptossing is equal to 1 (and so keeptossing is initially set equal to 1). Every time a new toss is about to be made, the old value of flip1 is given to flip2, and then flip1 gets the new outcome. To see if a toss has been the second S in a row, the code asks if flip1 + flip2 = 2; if so, keeptossing is set to zero, which then causes the while loop to terminate. The vector person stores the present value of toss: person(j) is the number of tosses in the jth game until the first double S. If flip1 + flip2 ≠ 2, then the code tosses again: toss is incremented by 1 and flip2 “remembers” the value of flip1. Then, the while loop operates again. When all 100,000 games are finally completed, the last command of ds.m sums the 100,000 values in the person vector and prints their average value.

ds.m

check=1/6;                      % probability of tossing a success (a 6)
for loop=1:100000               % simulate 100,000 games
    toss=1;flip2=0;             % toss counter; flip2 = outcome of the previous toss
    keeptossing=1;
    while keeptossing==1
        result=rand;
        if result<check
            flip1=1;            % this toss is a success (a 6)
        else
            flip1=0;            % this toss is a failure
        end
        if flip2+flip1==2       % two 6s in a row: the game is over
            keeptossing=0;
            person(loop)=toss;  % number of tosses in this game
        else
            toss=toss+1;        % otherwise, toss again ...
            flip2=flip1;        % ... remembering the outcome just obtained
        end
    end
end
sum(person)/100000              % average number of tosses per game

4.4 THEORETICAL ANALYSIS 2

What makes the second of our three problems hard is that it is not at all obvious (at first, anyway) where to start. If we make a couple of simple observations, however, and put them together, the solution is surprisingly easy. First, notice that the question is certain to be decided by the end of the tenth flip: by then we must either have gotten the fourth head (after at most six earlier flips gave tails) or have gotten the seventh tail. The event of interest could, of course, occur in fewer flips, perhaps as soon as on the fourth flip (with four straight heads). This gives us a clue as to where to start our analysis.

What we’ll do is calculate the probabilities that we get the fourth H on the tenth flip, that we get the fourth H on the ninth flip, that we get the fourth H on the eighth flip, and so on, down to getting the fourth H on the fourth flip. Our final answer will be the sum of all these probabilities. To do this it will be very helpful to have a systematic way of writing down these individual probabilities. Here’s how to do that. The probability of getting exactly four H’s in exactly n flips means that we get the fourth H on the nth flip and that there were three earlier H’s somewhere in the previous n − 1 flips. That can happen in \binom{n-1}{3} ways, with each way having probability p³(1 − p)^((n−1)−3) = p³(1 − p)^(n−4), because if there were three H’s in n − 1 flips, then there must have been (n − 1) − 3 = n − 4 T’s in those n − 1 flips. The final, nth flip gives us the fourth H (with probability p), and so the probability of exactly four H’s in exactly n flips is \binom{n-1}{3} p⁴(1 − p)^(n−4). The answer to our question is then

\text{Prob(fourth head before seventh tail)} = \sum_{n=4}^{10}\binom{n-1}{3}p^{4}(1-p)^{n-4}.

If p = 1/2, then this all reduces to 848/1,024 = 0.828125. If p = 0.45, the result is that the probability of getting four heads before the seventh tail is 0.733962.
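If you would rather not do that arithmetic by hand, the sum is easy to evaluate numerically. Here is a short MATLAB sketch (not one of this chapter’s simulation codes; the variable name probability is my own, and nchoosek(n,k) is MATLAB’s built-in binomial coefficient):

p=0.5;                          % set p=0.45 for the almost-fair coin
probability=0;
for n=4:10                      % add up Prob(4th H occurs on the nth flip)
    probability=probability+nchoosek(n-1,3)*p^4*(1-p)^(n-4);
end
probability                     % 0.828125 for p=1/2, 0.733962 for p=0.45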

4.5 COMPUTER SIMULATION 2

The code before.m simulates, ten million times, our coin-flipping problem. Here’s how it works. After setting the value of p, the variable total is initialized to zero. At the end of the ten million simulations of our random process, total will be the number of those simulations in which the fourth head occurred before the seventh tail. Each simulation begins with H (the number of heads, so far) and T (the number of tails, so far) set to zero. Then, a while loop executes until either H reaches 4 or T reaches 7, with either condition terminating the loop. If it was the H = 4 condition that ended the loop, then total is incremented by one. Once all the simulations are completed, the last line of the code produces an estimate for our probability.

before.m

p=0.45;total=0;                 % total = number of simulations with 4 heads first
for loop=1:10000000             % ten million simulations
    H=0;T=0;                    % heads and tails counts, so far
    while H<4&&T<7              % flip until the 4th head or the 7th tail appears
        if rand<p
            H=H+1;
        else
            T=T+1;
        end
    end
    if H==4                     % the 4th head came before the 7th tail
        total=total+1;
    end
end
total/10000000                  % estimate of the probability

When run, before.m produced estimates of 0.827948 for p = 1/2 and 0.734308 for p = 0.45; both estimates are close to the theoretical values we calculated earlier.

4.6 THEORETICAL ANALYSIS 3

Let P_A be the probability that A is the first to toss a 6 as A, B, and C take turns tossing a fair die. Then either A tosses his 6 on his first try (with probability 1/6) or he and the other two players all fail to toss a 6 on their first tries (with probability (5/6)³), and we are back to our original starting condition. So,

P_A = \frac{1}{6} + \left(\frac{5}{6}\right)^{3} P_A,

which is easily solved to give

P_A = \frac{1/6}{1 - (5/6)^{3}} = \frac{1/6}{91/216} = \frac{36}{91}.

With this probability A leaves the game, and B and C continue. With exactly the same sort of argument, if P_B is the probability B is the first between B and C to toss a 6, we then have

P_B = \frac{1}{6} + \left(\frac{5}{6}\right)^{2} P_B,

which is easily solved to give

P_B = \frac{1/6}{1 - (5/6)^{2}} = \frac{1/6}{11/36} = \frac{6}{11}.

Thus, the probability it is A, and then B, to first toss 6s is

P_A P_B = \frac{36}{91}\cdot\frac{6}{11} = \frac{216}{1001} \approx 0.216.

For the reverse of the above, that is, for the case in which it is B, and then A, who are the first to toss 6s, let’s take a different approach. Notice that the sequence of play is described by

ABCABCABCABCA …,

and so we see that

\text{Prob}(B \text{ is the first to toss a 6}) = \frac{5}{6}\cdot\frac{1}{6}\left[1+\left(\frac{5}{6}\right)^{3}+\left(\frac{5}{6}\right)^{6}+\left(\frac{5}{6}\right)^{9}+\cdots\right].

The geometric series in the brackets on the right has the sum

\frac{1}{1 - (5/6)^{3}} = \frac{1}{91/216} = \frac{216}{91},

and so

\text{Prob}(B \text{ is the first to toss a 6}) = \frac{5}{36}\cdot\frac{216}{91} = \frac{30}{91}.

With this probability, B leaves the game, and C and A continue, with C having the next toss. The alternating sequence of play is now described by

CACACAC …,

and so we see that

\text{Prob}(A \text{ tosses a 6 before } C) = \frac{5}{6}\cdot\frac{1}{6}\left[1+\left(\frac{5}{6}\right)^{2}+\left(\frac{5}{6}\right)^{4}+\cdots\right] = \frac{5/36}{1-(5/6)^{2}} = \frac{5/36}{11/36} = \frac{5}{11}.

Thus, the probability that B, and then A, first toss 6s is

\frac{30}{91}\cdot\frac{5}{11} = \frac{150}{1001} \approx 0.15.
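No simulation code for this third problem appears above, but it is easy to check both fractions in the same way ds.m and before.m checked the first two problems. Here is a minimal MATLAB sketch (the structure and the names AthenB, BthenA, players, and order are my own choices, not taken from the chapter’s codes):

AthenB=0;BthenA=0;              % counts of the two orderings of interest
games=1000000;
for loop=1:games
    players=[1 2 3];            % 1 = A, 2 = B, 3 = C, in tossing order
    order=[];                   % records who tosses the first and the second 6
    turn=1;                     % index (into players) of whoever tosses next
    while length(order)<2
        if rand<1/6             % the current player tosses a 6 and drops out
            order=[order players(turn)];
            players(turn)=[];
        else
            turn=turn+1;        % the current player failed; move to the next
        end
        if turn>length(players) % wrap around to the start of the rotation
            turn=1;
        end
    end
    if order(1)==1&&order(2)==2
        AthenB=AthenB+1;
    elseif order(1)==2&&order(2)==1
        BthenA=BthenA+1;
    end
end
AthenB/games                    % should be close to 216/1001 (about 0.216)
BthenA/games                    % should be close to 150/1001 (about 0.150)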
