Let's play a game.....


mordorbund

I came across a puzzle I thought was interesting.

I have 2 cards, each with a number on each side. The first card has 0/1 (0 on one side and 1 on the other) and the other card has 1/2 (1 on one side and 2 on the other). I pair you with someone who's quite rational and logical (you can imagine @Vort or @zil or Vort's friend with the hawt wife). The rules go like this:

  • There is no table talk allowed
  • I select one card randomly and you each look at your side of the card
  • Each of you locks in either "pass" or "play"
  • The pass/play choices are revealed simultaneously
  • If either player passes, we shuffle the cards and start again
  • If both players play, the one with the higher number wins $200 from me. The one with the lower number pays me $100.
  • We can continue playing until someone gets bored

What is your strategy if you see a 0 on your side of the card? A 2? What about a 1?


I'll play this game all day long. The only real uncertainty in the game is whether my opponent will play when she has the 0. But this is easily circumvented by choosing never to play when I receive the 0.

If both players agree to always play, then the expected value of winnings per round is $50. So in ten rounds, I can expect to walk away with $500.

If I choose to be conservative and only play when I have a 1 or 2, then the expected value of winnings per round is $100, but play may happen in only 75% of the rounds. So in ten rounds, I could expect to walk away with $750.

I wouldn't want to be you in this game, however.  The expected winnings for you are -$100 every round.


Guest Mores
47 minutes ago, mordorbund said:

I came across a puzzle I thought was interesting.

I have 2 cards, each with a number on each side. The first card has 0/1 (0 on one side and 1 on the other) and the other card has 1/2 (1 on one side and 2 on the other). I pair you with someone who's quite rational and logical (you can imagine @Vort or @zil or Vort's friend with the hawt wife). The rules go like this:

  • There is no table talk allowed
  • I select one card randomly and you each look at your side of the card
  • Each of you locks in either "pass" or "play"
  • The pass/play choices are revealed simultaneously
  • If either player passes, we shuffle the cards and start again
  • If both players play, the one with the higher number wins $200 from me. The one with the lower number pays me $100.
  • We can continue playing until someone gets bored

What is your strategy if you see a 0 on your side of the card? A 2? What about a 1?

I'm not certain what is going on here. I need a visual. Are you the dealer? And you hold the card between the two players so each of them sees the opposite side of the same card?

If that is so, then both players ought to always play on every card.  Just look at the numbers.  I'm not seeing a downside.

Edited by Mores

Guest Mores
48 minutes ago, MarginOfError said:

If I choose to be conservative and only play when I have a 1 or 2, then the expected value of winnings per round is $100, but play may happen in only 75% of the rounds. So in ten rounds, I could expect to walk away with $750.

Mathematically correct, but logically unsound. If you are paired with someone with similar math skills and training in probability, they will adopt the same tactic (pass only on a zero). Thus the ratio of played to presented rounds is 50%: of 10 shuffles, 5 cards are played by both parties. Of those five, an average of half will go in your favor. So, you will receive only $250 (statistically speaking).

So, whether you're playing from a common pool of money or separately, it is better to have both players play every time.


You're correct, I didn't present the right result, but I wouldn't change my strategy. The risk that comes with playing always is that if your opponent chooses to only play when they don't have a 0, then my winnings get diluted toward 0. My biggest concern, then, is that I do have an opponent who is numerically literate enough to know that occasionally refusing to play when receiving a 0 increases their winnings at the expense of mine. But now we're getting into game theory, and not probability theory.

So my strategy is to play only when I have a 1 or a 2, resulting in $25 per round. Assuming one minute per round to play, distribute winnings, and reshuffle, that amounts to about $1,500 per hour, or the equivalent of a $3,000,000 annual salary as a full-time position.

Expected values given below.

 

[Table: my per-round payoff for each of the four card/side outcomes under each pairing of strategies]

E[Winnings; Both Play Always] = 0.25 * (-100 + 200 - 100 + 200) = 0.25 * 200 = 50

E[Winnings; Me Play > 0, Opp. Play Always] = 0.25 * (0 + 200 - 100 + 200) = 0.25 * 300 = 75

E[Winnings; Both Play > 0] = 0.25 * (0 + 0 - 100 + 200) = 0.25 * 100 = 25

E[Winnings; Me Play Always, Opp. Play > 0] = 0.25 * (0 - 100 - 100 + 200) = 0.25 * 0 = 0
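
If anyone wants to double-check those numbers, here's a quick brute-force sketch in Python (just an illustration I threw together; the strategy and function names are made up for this post, not anything official):

```python
# Brute-force the per-round expected value to me for each pairing of strategies.
# The four equally likely deals, written as (my number, opponent's number):
deals = [(0, 1), (1, 0), (1, 2), (2, 1)]

def play_always(n):
    return True            # lock in "play" no matter what I see

def play_positive(n):
    return n > 0           # pass whenever I see the 0

def expected_value(me, opponent):
    """Average winnings to me per presented round at the $200 / $100 stakes."""
    total = 0
    for mine, theirs in deals:
        if me(mine) and opponent(theirs):
            total += 200 if mine > theirs else -100
    return total / len(deals)

print(expected_value(play_always, play_always))      # 50.0
print(expected_value(play_positive, play_always))    # 75.0
print(expected_value(play_positive, play_positive))  # 25.0
print(expected_value(play_always, play_positive))    # 0.0
```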

 

 

Edited by MarginOfError

45 minutes ago, MarginOfError said:

The risk that comes with playing always is that if your opponent chooses to only play when they don't have a 0, then my winnings get diluted toward 0.

I'm trying to reconcile this. What you say makes sense on a gut level, but the numbers are muddy for me. I'm obviously being too casual in my analysis, and I need more rigor. Maybe you can help guide my thinking.

Over a thousand trials, I will see right about 250 0s and 250 2s. These are sure losses/sure wins, if the other guy plays. The other 500 will be 1s, of which I will win 250 and lose 250.

| I see | Number of times I see it | Win-lose | Total profit |
| ------ | ------ | ------ | ------ |
| 0 | 250 | Always lose | 250 x (-$100) = -$25,000 |
| 1 | 500 | Win half, lose half | 250 x (-$100) + 250 x ($200) = -$25,000 + $50,000 = $25,000 |
| 2 | 250 | Always win | 250 x ($200) = $50,000 |

Grand total: -$25,000 + $25,000 + $50,000 = $50,000

What happens if we both always play or if we both play "selfishly" (ducking out on a 0) are pretty obvious cases. For now, I want to focus on the case where I always play and my opponent ducks out when he sees a 0.

When my opponent ducks out, he must be holding the 0, which means I'm holding the 1 on the other side of that card. Since he's going to duck out right about 250 times, I will lose out on those 250 wins, all of them rounds where I see a 1. So:

| I see | Number of times I see it | Win-lose | Total profit |
| ------ | ------ | ------ | ------ |
| 0 | 250 | Always lose | 250 x (-$100) = -$25,000 |
| 1 | 500 | Lose half, opponent passes on the rest | 250 x (-$100) + 250 x $0 = -$25,000 |
| 2 | 250 | Always win | 250 x ($200) = $50,000 |

Grand total: -$25,000 - $25,000 + $50,000 = $0

Okay, got it. Never mind. Though if you have some handy insights into how to visualize such things immediately instead of having to graph it out, I'm all ears.
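
For anyone who wants to check my bookkeeping, here's a little tally sketch (purely illustrative, and of my own making; it just treats each of the four card/side outcomes as happening exactly 250 times in 1,000 rounds):

```python
# Total my profit over 1,000 rounds, broken out by the number I see,
# assuming I always play and toggling whether the opponent plays on a 0.

def tally(opponent_plays_on_zero):
    deals = [(0, 1), (1, 0), (1, 2), (2, 1)]   # (my number, his number), 250 occurrences each
    by_face = {0: 0, 1: 0, 2: 0}
    for mine, theirs in deals:
        if theirs == 0 and not opponent_plays_on_zero:
            continue                            # he ducks out, no money changes hands
        by_face[mine] += 250 * (200 if mine > theirs else -100)
    return by_face, sum(by_face.values())

print(tally(opponent_plays_on_zero=True))    # ({0: -25000, 1: 25000, 2: 50000}, 50000)
print(tally(opponent_plays_on_zero=False))   # ({0: -25000, 1: -25000, 2: 50000}, 0)
```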


Edited by Vort
Images for tables

Guest Mores
26 minutes ago, MarginOfError said:

Expected values given below.

You seem to have an error in the final column.

Quote

But now we're getting into game theory, and not probability theory.  

While I thought I understood game theory, I'm not sure how that would change the laws of probability.  And game theory doesn't really matter since neither of you are "against" each other.  You're against the dealer.

Corrected income out of 10 rounds:

Both play: $500
You play >0: $750
Opp play >0: $500
Both play >0: $0

Neither of you knows which route the other will take.  But the surefire way for you to "lose" is for both of you to play >0.  Given that, I'd take any of the other options.

The best option is for the two of you to work together.  One plays >0 and the other plays all the time.  Then split.

Edited by Mores

Guest Mores

OK, so the real question then becomes: What can you do on the first round to signal who is going to play >0 and who will play always? Since no talking is allowed at the table, you can't openly collude.  But logic must still win out.

Actually, it isn't the first round. You both play until one of you sees a zero.  Whoever is the first one to see the zero will be the one who plays all the time.  The other plays only when >0.

Then you can split it.  And even if you don't, the other person would come out ahead anyway because of the way the bets would turn out.


6 minutes ago, Mores said:

You seem to have an error in the final column.

While I thought I understood game theory, I'm not sure how that would change the laws of probability.  And game theory doesn't really matter since neither of you are "against" each other.  You're against the dealer.

Corrected income out of 10 rounds:

Both play: $500
You play >0: $750
Opp play >0: $500
Both play >0: $0

Neither of you knows which route the other will take.  But the surefire way for you to "lose" is for both of you to play >0.  Given that, I'd take any of the other options.

The best option is for the two of you to work together.  One plays >0 and the other plays all the time.  Then split.

 

You're right, the last column should be -100, 0, -100, 200.

Even if I were to correct that error, the expected values wouldn't be changed.

And you'll need to show me the work for your expected values.  I'm certain you've done those wrong. (or you can show me where my math has gone wrong)

 

And game theory doesn't change the probability theory. It just mucks up the decision-making process. Since I can't communicate with the opponent, and can't know if they are playing to their own benefit or to mutual benefit, the best single-point decision I can make is to only play when I have a positive value.

If I want to be more creative, I can use a Bayesian process to guide my belief that the opponent is playing to the mutual benefit.  If the process starts to veer away from mutual benefit, I can start to play more defensively and minimize my loss.
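
As a rough sketch of what that might look like (illustrative only; the two candidate strategies and the 1% "slip" probability below are assumptions I'm adding for the example, not part of the game):

```python
# Track my belief about the opponent's strategy from the passes I observe.
# The 1% slip chance under the always-play hypothesis is an added assumption,
# purely so one observed pass doesn't drive the posterior straight to certainty.

P_PASS = {
    "plays always": 0.01,   # assumed near-zero chance of a pass
    "plays > 0":    0.25,   # they hold the 0 (and pass) a quarter of the time
}

def update(belief, opponent_passed):
    """One round of Bayes' rule over the two candidate strategies."""
    posterior = {}
    for strategy, prior in belief.items():
        likelihood = P_PASS[strategy] if opponent_passed else 1 - P_PASS[strategy]
        posterior[strategy] = prior * likelihood
    total = sum(posterior.values())
    return {s: p / total for s, p in posterior.items()}

belief = {"plays always": 0.5, "plays > 0": 0.5}    # start undecided
for passed in [False, False, True, False, True]:     # a made-up sequence of observed choices
    belief = update(belief, passed)
print(belief)   # the weight shifts well toward "plays > 0" after the observed passes
```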


17 minutes ago, Vort said:

I'm trying to reconcile this. What you say makes sense on a gut level, but the numbers are muddy for me. I'm obviously being too casual in my analysis, and I need more rigor. Maybe you can help guide my thinking.

Over a thousand trials, I will see right about 250 0s and 250 2s. These are sure losses/sure wins, if the other guy plays. The other 500 will be 1s, of which I will win 250 and lose 250.

| I see | Number of times I see it | Win-lose | Total profit |
| ------ | ------ | ------ | ------ |
| 0 | 250 | Always lose | 250 x (-$100) = -$25,000 |
| 1 | 500 | Win half, lose half | 250 x (-$100) + 250 x ($200) = -$25,000 + $50,000 = $25,000 |
| 2 | 250 | Always win | 250 x ($200) = $50,000 |

Overall total profit: -$25,000 + $25,000 + $50,000 = $50,000

What happens if we both always play or if we both play "selfishly" (ducking out on a 0) are pretty obvious cases. For now, I want to focus on the case where I always play and my opponent ducks out when he sees a 0.

When my opponent ducks out, he must be holding the 0, which means I'm holding the 1 on the other side of that card. Since he's going to duck out right about 250 times, I will lose out on those 250 wins, all of them rounds where I see a 1. So:

| I see | Number of times I see it | Win-lose | Total profit |
| ------ | ------ | ------ | ------ |
| 0 | 250 | Always lose | 250 x (-$100) = -$25,000 |
| 1 | 500 | Lose half, opponent passes on the rest | 250 x (-$100) + 250 x $0 = -$25,000 |
| 2 | 250 | Always win | 250 x ($200) = $50,000 |

Overall total profit: -$25,000 - $25,000 + $50,000 = $0

Okay, got it. Never mind. Though if you have some handy insights into how to visualize such things immediately instead of having to graph it out, I'm all ears.


Nope.  Generally, with these kinds of problems you either brute-force the expectations or simulate them.

I will say, though, that expectation can be a really useful thing.  When you think about it in terms of expected winnings per round, it gets a lot easier to extend the thinking to long-term results.  It's a lot harder to reason about it if you try to think of the long-term results first.
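
Here's roughly what the simulation route could look like (a quick sketch of my own, nothing rigorous; the strategy names are just labels):

```python
# Monte Carlo estimate of my average winnings per presented round.
import random

def simulate(me, opponent, rounds=100_000):
    """Estimate my average winnings per presented round at the $200 / $100 stakes."""
    total = 0
    for _ in range(rounds):
        card = random.choice([(0, 1), (1, 2)])        # pick one of the two cards
        mine, theirs = random.sample(card, 2)          # random orientation of that card
        if me(mine) and opponent(theirs):              # both must lock in "play"
            total += 200 if mine > theirs else -100
    return total / rounds

play_always = lambda n: True
play_positive = lambda n: n > 0

print(simulate(play_always, play_always))      # about 50
print(simulate(play_positive, play_always))    # about 75
print(simulate(play_positive, play_positive))  # about 25
print(simulate(play_always, play_positive))    # about 0
```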

Edited by MarginOfError

Guest Mores
47 minutes ago, MarginOfError said:

You're right, the last column should be -100, 0, -100, 200.

Even if I were to correct that error, the expected values wouldn't be changed.

That is correct.

Quote

And you'll need to show me the work for your expected values.  I'm certain you've done those wrong. (or you can show me where my math has gone wrong)

You are correct, I had some errors in my math as well.  I had a mix-up because I altered the order of options.

The correct outcome is to take your chart and multiply the subtotal by 2.5 (for 10 rounds).

Thus 

Both play: $500
You play >0: $750
Opp play >0: $0
Both play >0: $250

So, we see which one is the worst outcome.  You do NOT want to play always if your opponent plays >0.

If you want to introduce game theory, then the question is whether you'll pool your money or not.  If pooled, your numbers look like this:

Both play: $1000 ====== each of you gets $500
One plays >0: $750 ====== one gets $750, the other gets $0; split, each gets $375
Both play >0: $500 ====== each gets $250

So, then it becomes a question of what happens when the first person gets a zero.  That person must decide whether they will take the chance at getting $0 with the other guy getting $750.

Then we realize that this becomes more a question of tactics than strategy.  In a 10-round game, it really is largely dependent upon randomness. But in a longer game (say 500 rounds) you can work out additional probability options, like playing only on a 2 (5 more columns in your table).  That is your weapon against the guy who gets his first zero after you.
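
If you want to check the pooled numbers above, here's a quick sketch along the same brute-force lines (again, just an illustration of my own):

```python
# Combined winnings of both players over 10 rounds: every round that both
# actually play nets the pair +$100 from the dealer ($200 won minus $100 paid).
deals = [(0, 1), (1, 0), (1, 2), (2, 1)]   # (player A's number, player B's number)

play_always = lambda n: True
play_positive = lambda n: n > 0

def pooled_total(a, b, rounds=10):
    """Expected combined winnings of both players over the given number of rounds."""
    per_round = sum((200 - 100) / len(deals) for x, y in deals if a(x) and b(y))
    return per_round * rounds

print(pooled_total(play_always, play_always))      # 1000.0
print(pooled_total(play_positive, play_always))    # 750.0
print(pooled_total(play_positive, play_positive))  # 500.0
```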

Edited by Mores

Guest Mores

There is one more part of the rules that we haven't focused on:

Quote

We can continue playing until someone gets bored

If you both play always, or if you both play >0, then it can go on and on and on, and you keep making money.  It is only a question of how fast it comes to you.

If one decides to break the deal and only play >0 while you play always, then at some point one person will quit.  This will limit the other person's winnings.

With that in mind, if @Vort and @zil are both rational and logical people, it is in both contestants' best interest to play always and not faint.

Edited by Mores

44 minutes ago, mordorbund said:

Perhaps it's only half as good as you think.

Oh, yeah. That guy. What a weirdo.

EDIT: But to be fair, his wife is smokin' hot. I wasn't joking about that. Woo-woo! And I think she likes me...

Edited by Vort

2 hours ago, MarginOfError said:

And game theory doesn't change the probability theory. It just mucks up the decision-making process. Since I can't communicate with the opponent, and can't know if they are playing to their own benefit or to mutual benefit, the best single-point decision I can make is to only play when I have a positive value.

If I want to be more creative, I can use a Bayesian process to guide my belief that the opponent is playing to the mutual benefit.  If the process starts to veer away from mutual benefit, I can start to play more defensively and minimize my loss.

Thanks for considering the no-collusion rule. Sure, if you can communicate with your partner you can essentially say, "Look, the best outcome for both of us is if we play every time - win or lose." Shake hands on it and maybe emphasize "The first time you pass, the game's over. I'm outta here." But you can't. So I appreciate seeing your alternatives.

For those who want to play every time or only on non-zeros, have you considered that your partner doesn't know whether you're optimizing for personal or mutual benefit?


Guest Mores
4 hours ago, mordorbund said:

For those who want to play every time or only on non-zeros, have you considered that your partner doesn't know whether you're optimizing for personal or mutual benefit?

You gave two conditions that are of note:

1) The other person with whom I am playing is rational and logical.  I took that to mean that they could figure out all the combinations of strategies just as well as I can or better.
2) The game goes on until one of us quits.

Given those conditions, we would both conclude that playing on all numbers brings the most benefit to both of us -- automatically (assuming we play sufficient rounds to allow the laws of probability to even things out).

