While I was watching my old buddy Dan Henderson get beaten up last Saturday, I got into a beer- and scotch-fueled argument about gambling with my boxing coach (who also happens to be a criminal defense lawyer for people with names like Lucky who have institution-mandated “halos of security” that you have to stay out of). Here is the scenario. Two people: person A is The Buyer and person B is The Bank.

The Buyer agrees to pay a certain sum (to be determined) to The Bank to play the following game.

The Bank tosses a fair coin until it comes up tails. The number of heads before the first tails comes up we define to be N.

If N=0 (the first coin toss comes up tails) The Bank pays The Buyer nothing. If N>0 The Bank pays The Buyer 2^(N-1) dollars.

For example if The Bank throws {heads, heads, heads, tails} then N=3 and The Buyer gets paid 2^(3-1)=4 dollars.

The argument was about what price The Bank should charge to The Buyer to play this game to ensure a “house edge”. What do you think?


Would it be: charge anything over 1 dollar for the house edge? Probability dictates that you would expect one tails for every 2 flips (p = .5 for any flip). That means N = 1, and 2^(1-1) = 1 would be the expected cost to the house, so anything over 1 would give the house the advantage from a statistical perspective.

Sound wrong?

It seems like the answer is infinite. The expected cost is the sum of the cost of all possible outcomes, and I think the cost of {heads, heads, heads, tails}, for example, is 4 x the probability of {heads, heads, heads, tails} occurring. The probability of {heads, heads, heads, tails} occurring is 1/(2^4), so you get 0.25. The probability of {heads, heads, heads, heads, tails} occurring is 1/(2^5) with a payout of 8, so you get 0.25 again. Every term works out to 0.25, and since there is no limit on how many heads might come up, the sum is infinite.
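Dave's term-by-term sum can be checked directly with exact arithmetic (a minimal sketch; `term` is a hypothetical helper, and it assumes the rules above: N heads then a tail has probability (1/2)^(N+1) and pays 2^(N-1) dollars for N ≥ 1):

```python
from fractions import Fraction

def term(n):
    """Contribution of the N = n outcome to the expected payout:
    payout 2^(n-1) times probability (1/2)^(n+1)."""
    return Fraction(2) ** (n - 1) * Fraction(1, 2) ** (n + 1)

terms = [term(n) for n in range(1, 6)]
print(terms)  # every term is 1/4

# The partial sums grow without bound: 100 terms already give $25.
print(sum(term(n) for n in range(1, 101)))
```

Since every term is exactly 1/4 and there are infinitely many of them, the expected payout diverges, which is the classic St. Petersburg result.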

Hi Dave

But think about what that implies in practice: would you be willing to put up everything you own (house, car, life savings, etc) to play the game right now with me? You have a 50% chance of losing everything on the first toss. Let’s say your net worth is a million dollars. To be in the money you need to have at least log_2(10^6) ~ 20 consecutive heads, which means that you only have a one in a million chance of being in the money.

What was your take then Geordie?

In reality I guess the house would charge something reasonable and take out an insurance policy. I just did a simulation, and for ~1 million runs the banker has to charge $7-8 or so to make money. Not sure how many people would pay that or how much an insurance policy would cost in that case. I think a better option is for the banker to set a limit on the number of heads that could be flipped, then set the price. If the banker set the limit to 2 heads, he could charge $1 and make $0.25 per game on average.
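A sketch of that kind of simulation (`play_once` and `average_payout` are hypothetical names). Note how the estimate tends to drift upward as the game count grows: the sample mean is dominated by rare long runs of heads, which is the infinite expectation showing through:

```python
import random

def play_once(rng):
    """Flip until tails; pay 2^(N-1) dollars for N >= 1 heads, else nothing."""
    n = 0
    while rng.random() < 0.5:  # heads
        n += 1
    return 2 ** (n - 1) if n > 0 else 0

def average_payout(games, seed=0):
    """Monte Carlo estimate of the break-even ticket price."""
    rng = random.Random(seed)
    return sum(play_once(rng) for _ in range(games)) / games

for games in (10_000, 100_000, 1_000_000):
    print(games, average_payout(games))
```

Because the true mean is infinite, no number of simulated games gives a stable "fair price"; each larger run just estimates the mean of a truncated version of the game.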

Dave’s argument assumes that all of the terms in the expected cost expansion are the same (0.25 in his example). Is this true?

Would you rather have a one in a billion chance to get a billion dollars or a one in five chance to have five dollars? Would you pay the same amount for both tickets (I know you should, but would you)? I think this is the crux of the pricing issue here.

Yeah, 0.25 for each term is the way I did it.

If the bank only paid 2^(N/2) dollars as a prize, the expected payout would converge, and they’d break even charging (1+sqrt(2))/2 ≈ $1.21 to play. Banks don’t like any non-disappearing chance of losing, though, so they’d probably just pay less to make it a disappearing chance.
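The convergence claim can be checked numerically (a sketch, assuming the payout is 2^(N/2) for N ≥ 1 and P(N = n) = (1/2)^(n+1) as in the original game; with those assumptions the closed form works out to (1+√2)/2, the extra factor of 1/2 coming from the terminating tails flip):

```python
import math

# E = sum over n >= 1 of 2^(n/2) * (1/2)^(n+1), a geometric series
# with ratio r = 1/sqrt(2) < 1, so it converges.
r = 1 / math.sqrt(2)
expected = 0.5 * r / (1 - r)  # closed form: (1 + sqrt(2)) / 2 ~ 1.207
numeric = sum(2 ** (n / 2) * 0.5 ** (n + 1) for n in range(1, 200))
print(expected, numeric)
```

Any payout that grows slower than 2^N keeps the geometric ratio below one and makes the expectation finite, which is what tames the paradox here.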

General Comment

One would think there ought to be more activity on a site whose company is so much in the spotlight!

13 cents

RR: Yes one would think that… I’m to blame as I’ve been reticent in posting D-Wave related stuff. We’re working hard to solve our outstanding problems in scaling the systems as stated in our tech roadmap. We plan to publish many of these results but as you might imagine overcoming these issues is difficult and takes time. If you want to see some interim results, search on author=M.H.S. Amin at arxiv.org.

RR: I was going to say 13 cents (or > 12.5 cents) at first… but I don’t think it’s right.

This problem would make an excellent interview question.

There is no good price, as the risk to the banker is always too high due to the exponential nature of the payout. And if he charges too much to cover the 1:10^6 or 1:10^7 chance payout, nobody will play.

I ran eight 30,000-game simulations, and $5 would be a good price, with a max payout of $131,000 (N=18) and a profit of $0.45 per game. If the bank played lots of games they’d eventually lose their shirts at that price, but who’d play for $20 or $40?

This is why insurance is awesome. If they have to pay that 1:10^8 payout, they just charge more, and we *have* to play.

According to my calculations it’s 25 cents, although a simulation with a pseudorandom number generator (10 million games) came out at a little more than $4.

“We plan to publish many of these results but as you might imagine overcoming these issues is difficult and takes time. If you want to see some interim results, search on author=M.H.S. Amin at arxiv.org.”

I searched that, and the latest I found was from 4/07. Is there something later?

13 cents is wrong.

Even if The Buyer were limited to only 1 flip, 50% of those who play would win $1, making the break-even price 50 cents, so the house has an edge charging anything over that.

If The Buyer was limited to 2 flips, 50% of those who play would win nothing, 25% would win $1, and 25% would win $2, so the average payout would be 0*0.5 + 1*0.25 + 2*0.25 = 0.75.

3 flip limit: 50% win 0, 25% win $1, 12.5% win $2, 12.5% win $4.

For an arbitrary n-flip limit the average payout is:

(n-1)*(1/4) + (1/2). Since n is unbounded, the average payout is infinite.

Would you hire me?
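Van's closed form can be sanity-checked by brute force (a small sketch; `capped_expected` is a hypothetical helper that enumerates every outcome under an n-flip limit, with n straight heads paid as N = n):

```python
from fractions import Fraction

def capped_expected(n):
    """Exact expected payout when the game is capped at n flips: either k heads
    then a tail for some k < n, or n straight heads, paying 2^(k-1) for k >= 1."""
    half = Fraction(1, 2)
    e = sum(2 ** (k - 1) * half ** (k + 1) for k in range(1, n))  # k heads, then tails
    e += 2 ** (n - 1) * half ** n                                 # n straight heads
    return e

for n in range(1, 8):
    print(n, capped_expected(n))  # equals (n-1)/4 + 1/2 each time
```

Each extra allowed flip adds exactly a quarter to the average payout, which is why the uncapped average diverges.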

Van: You are forgetting the extra no payout games that are played when 3 tails are flipped in a row. There is no way that 12.5% of the games pay out $4; it’s 12.5% of paying games, but still 50% of all games will pay out nothing.

And even though n is unbounded, you have to look at the risk. It’s too early for math, but the chance of a billion-dollar payout, while real, is very small. The chance of an “infinite” payout is infinitesimal, zero for all intents and purposes, so the bank needs to ask what they are willing to risk.

Matthew: No games are played with three consecutive tails–the game ends when the first tail is flipped and The Buyer gets paid at that point.

I think this game illustrates the “gambler’s ruin” concept. Even though the risk of the bank going bankrupt is small, eventually, if it keeps playing this game, it will be bankrupted. If you look at some early hedge funds (like Long Term Capital Management and Eifuku) that drove their capital to zero, I think the fundamental reason is related to this game: if you are the bank, you are betting your whole bankroll that you won’t get a lot of heads in a row. Eventually, with that strategy, you bankrupt yourself.

Hi Joseph

Take a look at

http://arxiv.org/abs/0709.0528 (PRL)

http://arxiv.org/abs/0708.0384 (submitted to PRL)

http://arxiv.org/abs/cond-mat/0609332 (PRL)

also there are two more almost ready to go which I’ll link to when sent out (I think one of them is due today).

Van: The only way to know is to apply… on that note see my next post!

Geordie: I meant that three consecutive tails are 3 consecutive games, each with no payout to the Player but a ticket price collected by the Banker, and that happening has the same probability as one game with 3 heads.

If anybody wants to play this game at $2.50 a flip with no flip limit, I’ll play all day, as long as I see the $25,000 to pay out.

So I guess I agree with Aleph Null.

Also, thanks for the links Geordie.

I think that if the bank changed the game slightly, say exponential growth until $10k, then linear increases of $5-10k, then $100k, then $1M until a max of $10M, it could find a good ticket price that would be profitable for the Banker and still be cheap enough/interesting enough to be worth playing. The Bank would need deep pockets (or insurance) to allow for a fluke win at first, but the chances of two major wins early on would be quite small, and if they played long enough they should still do well.

Damn, now I want to model this, and that means I gotta find all my old software (I am *not* doing this in Excel…)
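A capped schedule like that does make the expectation finite. Here is a sketch (the exact tier boundaries in `tiered_payout` are made up for illustration, not the commenter's actual proposal):

```python
def tiered_payout(n):
    """Hypothetical capped schedule: the usual 2^(n-1) up to $10k,
    then +$10k per extra head, hard-capped at $10M. Tiers are made up."""
    if n == 0:
        return 0
    raw = 2 ** (n - 1)
    if raw <= 10_000:
        return raw
    steps = n - 14  # n = 14 is the last head count paying under $10k (2^13 = 8192)
    return min(10_000 + 10_000 * steps, 10_000_000)

# Expected payout: sum payout(n) * P(n heads then a tail) = payout(n) / 2^(n+1).
expected = sum(tiered_payout(n) * 0.5 ** (n + 1) for n in range(0, 2000))
print(round(expected, 2))  # a few dollars, so a single-digit ticket price works
```

Once the payout grows linearly instead of exponentially, the geometric decay of the probabilities wins and the series converges, so the Banker can price the ticket a bit above this expectation.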

I am envious of Dave’s life but he ain’t worth a million dollars yet.

Dave, I would like to point out that your analysis is looking at the mean payout. However, you need to consider the psychology of why people are willing to play the game, which is usually based on the mode or median payout. In this distribution the potential payout is very large based on the mean, but the modal and median payouts are much smaller.

Geordie, asking the question from the Bank’s perspective is a great variant of the St. Petersburg Game. I think you are right re hedge funds. BTW, LTCM was adamant that their investments posed no risk. Indeed, in the long term they were right. However, they lacked the bankroll to weather both the Asian Financial Crisis of late 1997 and the Russian government’s bond default in August of 1998. “Two heads in a row.” They had not considered all risks. In general I agree. I recommend you read the opening story in the following paper: http://arxiv.org/abs/cond-mat/0305150 It relates victory to luck. Indeed, this analysis by the same author appears more on point: http://arxiv.org/abs/physics/0607109 (I’ve read the first paper, not the second.)

Geordie said “Would you rather have a one in a billion chance to get a billion dollars or a one in five chance to have five dollars? Would you pay the same amount for both tickets (I know you should, but would you)?”. But I disagree that you should. That view completely ignores two factors: risk management and the decreasing (some would say logarithmic) value of money with increasing wealth.

10 dollars, to someone who is completely broke and has nothing to eat, is gigantic. It might be the difference between starvation and survival. On the other hand, 10 dollars to a billionaire is lost in the noise and essentially meaningless.

Let’s say your net worth is 1 million dollars. Would any sane person really bet it all on a fair 50-50 bet, with a 50% chance of being utterly destitute, and a 50% chance of having 2 million dollars? How much happier would you be with 2 million rather than 1 million? Compare that with how much unhappier you would be if you were broke.

This is why, even if lotteries were completely fair (they aren’t), paid out 100% of what they take in (they don’t), and paid it all immediately (instead of making you wait 20 years to collect it all), it would *still* be stupid to play the lottery. A dollar in hand is in most cases much more valuable than a one-millionth chance at one million dollars. The only time they become close is if you are operating with a large cash reserve, say a billion dollars or so; in that case a linear approximation can be justified. The poorer you are, the more you should prefer the dollar in hand.
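The preference for the dollar in hand can be made concrete if you assume a logarithmic utility of wealth (one common model of the “decreasing value of money” above; `utility_gap` is a hypothetical helper):

```python
import math

def utility_gap(wealth, prize=1_000_000, p=1e-6, ticket=1.0):
    """How much a log-utility agent prefers keeping the $1 ticket price
    over a fair p-chance at the prize (positive = prefers the dollar)."""
    keep = math.log(wealth)
    play = (1 - p) * math.log(wealth - ticket) + p * math.log(wealth - ticket + prize)
    return keep - play

for w in (100, 10_000, 1_000_000):
    print(w, utility_gap(w))  # the gap shrinks as wealth grows
```

Even for a perfectly fair lottery the gap is always positive (Jensen's inequality for a concave utility), but it shrinks rapidly with wealth, matching the claim that only someone with a large cash reserve can justify the linear, expected-value view.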

This problem is a lot like bankroll management in poker. No matter how well you play, if you go broke, you have to go back to real life to get more money to play again, even if you lost to a 1-in-1000 longshot and can usually make money at the game.

If every player were exactly equal in skill and bankroll, say 10 identical bots playing limit hold’em, over the very long term they would all have about equal winnings – and would all go bankrupt by bleeding out slowly to the house rake.

Winning at poker amounts to 1) beating the other players (playing against people who are worse at the game than you are), 2) beating the ante/blinds (not getting a very long run of bad cards), 3) beating the rake (winning faster than the house skims your winnings).

This requires fractal money management in your betting – for example, don’t put more than a tiny fraction of your bankroll on the table at a time so that you can recover easily from a bad beat (or string of them). Don’t put all your stack on one bet unless the odds of winning that hand are in your favor. Don’t play at all if you can’t beat the rake.

Yes, you could put all your money on the table and go all-in every time you catch a pair of aces, but eventually (assuming there are people dumb enough to call you) you will get drawn out against and lose everything, regardless of how many times you doubled up since starting. You would have a positive expectation on each bet (on average), but an expectation of zero in the long term. This eventually becomes an issue of when to stop playing (when you have made enough money) rather than whether to start playing.
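The “positive expectation on each bet, zero in the long term” point can be checked with a quick simulation (a sketch; the 60/40 double-or-nothing game and the function name are made up for illustration):

```python
import random

def all_in_until_ruin(rng, p_win=0.6, start=100, max_bets=10_000):
    """Go all-in repeatedly on a favorable double-or-nothing bet.
    Returns the number of bets survived before going broke."""
    bankroll, bets = start, 0
    while bankroll > 0 and bets < max_bets:
        bankroll = bankroll * 2 if rng.random() < p_win else 0
        bets += 1
    return bets

rng = random.Random(0)
lifetimes = [all_in_until_ruin(rng) for _ in range(10_000)]
# Each bet is +EV (0.6 * 2x > 1x), yet essentially every run ends broke,
# because a single loss wipes out all previous doublings.
print(max(lifetimes), "bets was the longest run before ruin")
```

This is exactly the bank's position in the coin game: betting the whole bankroll every round means the expected value per round is irrelevant, because ruin is an absorbing state.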

In the question posed, the bank essentially has all its money on the table all the time, and can do no risk management beyond setting the price to play. Note that in real life when a large company takes a bad risk, it may simply refuse to pay, tying up the case in litigation until a typically far-smaller settlement is reached.

Seems like this is a special case of the Black Swan problem posed by Nassim Nicholas Taleb in his book of that name.

In real life, this is like the CEO of Bear Stearns taking too much risk and losing the firm. As long as he cashes out his $100 million in stock before the crash comes, he doesn’t really care what happens to the company.