
Friday, January 23, 2015

Game theory post 3 of N: some classic (one-shot, strategic form) games

There are a number of classic textbook games that are highly useful, primarily because if you know them well, you can often recognize real-world situations that have similar payoff structures; once you do, you have a pretty good initial guess at what will happen in those situations. Accordingly, I'll collect some here. (Behind the fold.)

Prisoners' Dilemma

Of course we have to start here. Everyone knows the PD, so I won't belabor it. If you want more on it, a long and deep discussion is here; a shorter summary is here. The key idea is that both players (its standard presentation is 2 players, but it can be extended to more) have strictly dominant strategies of defecting from cooperative play with one another, such that the only Nash equilibrium is mutual defection, but mutual defection is worse for the players than mutual cooperation.

The following image shows a simple example, in abstract form. (This example assumes the game is symmetrical, that is, the players each get the same cooperation/defection payoffs; however, a PD need not be symmetrical. It's just easier to notate if it is. Here, and hereafter, player 1 will choose the row and have their payoff listed first; player 2 will choose the column and have their payoff listed second.*)
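To make the structure concrete, here is a minimal sketch in Python with illustrative payoff numbers of my own choosing (the standard T=5 > R=3 > P=1 > S=0 ordering; the post's diagram may use different values), checking that defection strictly dominates cooperation for both players:

```python
# Illustrative PD payoffs (symmetric). Entries are (row player, column player);
# strategies: "C" = cooperate, "D" = defect.
payoffs = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def strictly_dominates(mine, other, player):
    """Does strategy `mine` beat `other` for `player` against every opposing strategy?"""
    idx = 0 if player == "row" else 1
    def pay(me, opp):
        key = (me, opp) if player == "row" else (opp, me)
        return payoffs[key][idx]
    return all(pay(mine, opp) > pay(other, opp) for opp in ("C", "D"))

print(strictly_dominates("D", "C", "row"))  # True: defecting dominates for player 1
print(strictly_dominates("D", "C", "col"))  # True: and for player 2
```

Since each player's dominant strategy is D, the unique Nash equilibrium is (D, D) with payoffs (1, 1), even though (C, C) would give both players 3.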


The PD is so popular because it's an excellent way of representing many situations where there is an incentive to defect from a mutually beneficial cooperative arrangement. For example, there have been many PD models in international relations literature, because an armed standoff can easily be understood as a PD. Imagine it: two countries are facing one another over a tense border, and can demilitarize or retain military readiness. If both demilitarize, they both do well, because they can spend their resources on less wasteful things. If one country demilitarizes and the other does not, the country that keeps its army can quickly conquer the other, and take all the resources. If neither country demilitarizes, they keep all this wasteful tension. Keeping the army is strictly dominant for both parties, that's the only equilibrium, both would be better off if both demilitarized, and you have a model of the Korean Peninsula.

For more legal implications, consider things like property rights (mutually beneficial cooperative relationships where all parties have an incentive to defect, if not adequately enforced), and even judicial corruption.

Also, people don't seem to agree where the apostrophe goes. I say there are two prisoners, who share the same dilemma, thus, prisoners'.

Battle of the Sexes

Some people like to divide the world into two general kinds of games: commitment games and coordination games. The PD is an example of a commitment game, in that the key issue is that players are unable to commit to a mutually beneficial course of conduct. The "battle of the sexes" is a classic example of a coordination game.

The backstory that leads to the sexist name (sorry, but it's the language game theorists use, and will help with literature searches, etc.) is a classic piece of gender essentialism: a husband and a wife want to go out, but the husband prefers the football game, while the wife prefers the opera; they both, however, prefer to be together to being separate. How can we predict what they will decide?


The key takeaway here is that there are two equilibria: both at the opera and both at the football game. The resources of game theory have trouble explaining which one will be chosen; we often appeal to non-strategic ideas (like "focal points"---social background norms, basically) to make a prediction. Real-world situations that resemble a battle of the sexes are those where the parties have to settle on one of several possible mutually beneficial arrangements, as in contracting or the negotiated drafting of regulations, although such situations can often be better modeled using more complex games in which players have threats to execute against one another, different payoffs from deadlock, etc., in order to shift the surplus from cooperation their way.
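The two-equilibria claim is easy to verify mechanically. Here is a small sketch with illustrative payoffs of my own choosing (3 for getting your preferred venue together, 2 for the other venue together, 0 for being apart) that enumerates the pure-strategy Nash equilibria by brute force:

```python
# Illustrative battle-of-the-sexes payoffs (my numbers, not the post's):
# both prefer being together; each prefers a different venue.
payoffs = {
    ("opera", "opera"): (3, 2),
    ("opera", "football"): (0, 0),
    ("football", "opera"): (0, 0),
    ("football", "football"): (2, 3),
}
strategies = ["opera", "football"]

def pure_nash(payoffs, strategies):
    """Return all strategy pairs where neither player gains by deviating alone."""
    eq = []
    for r in strategies:
        for c in strategies:
            u_r, u_c = payoffs[(r, c)]
            row_best = all(u_r >= payoffs[(r2, c)][0] for r2 in strategies)
            col_best = all(u_c >= payoffs[(r, c2)][1] for c2 in strategies)
            if row_best and col_best:
                eq.append((r, c))
    return eq

print(pure_nash(payoffs, strategies))
# [('opera', 'opera'), ('football', 'football')]
```

Nothing in the payoff matrix itself picks between the two equilibria, which is exactly why we end up reaching outside the game for focal points.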



Chicken

This is another extremely important classic coordination game, which imagines a pair of hotheaded teenaged drivers playing, as the name suggests, a game of chicken on the roads: any player who swerves loses face, but if nobody swerves, they both die. I say this is a coordination game (contra the otherwise good Wikipedia page, which describes it as an anti-coordination game) for reasons that will hopefully become clear on comparing battle of the sexes and chicken to matching pennies, below. While the players do not want to be doing the same thing, they do want to coordinate their strategies to reach compatible behavioral pairs, just as in battle of the sexes. There are two pure-strategy equilibria, in each of which one player swerves and the other does not. There is also one mixed equilibrium, which is problematic, because mixed equilibria (that is, ones in which the players randomize their choices) in the chicken game can lead with positive probability to worse outcomes, namely a crash. The problem of the chicken game, then, for sufficiently bad crash payoffs, can be seen as getting the players to choose one of the pure equilibria---that is, settling on who has to swerve.
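The mixed equilibrium can be computed directly from the indifference condition: each player randomizes so that the other is indifferent between swerving and going straight. A sketch with illustrative payoffs of my own choosing (0 if both swerve, -1 for the lone swerver, 1 for winning, -10 for a crash):

```python
from fractions import Fraction

# Illustrative chicken payoffs (my numbers, not the post's):
S, L, W, C = 0, -1, 1, -10  # both swerve, lose face, win, crash

# In the symmetric mixed equilibrium, the opponent swerves with probability p
# that makes you indifferent between swerving and going straight:
#   S*p + L*(1-p) = W*p + C*(1-p)   =>   p = (L - C) / ((W - S) + (L - C))
p = Fraction(L - C, (W - S) + (L - C))
print(p)             # 9/10: each player swerves 90% of the time
print((1 - p) ** 2)  # 1/100: but a crash still happens with positive probability
```

That 1-in-100 chance of mutual destruction is the sense in which the mixed equilibrium is "problematic": with sufficiently bad crash payoffs, everyone would rather land on one of the pure equilibria.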

Another really interesting thing about chicken is the way it gives players an incentive to find ways to precommit to a choice. If one of the hotheaded teenagers can, for example, yank out the steering wheel and throw it out the window, s/he forces the other player (if rational) to swerve. Precommitment was most famously dramatized in the doomsday machine of Dr. Strangelove, although the underlying game there is probably poorly described as chicken (it really makes more sense as a sequential punishment kind of commitment game, perhaps to be described in a later post).

In the legal context, we might model some kinds of settlement negotiations as a game of chicken, where if both parties refuse to back down, they burn up all their utility on expensive litigation.

Matching pennies


This really is an anti-coordination game. The story is that two players are gambling and choose a side of a penny to display; one player wants them to match while the other wants them to not match. This is also a variant of rock paper scissors, and of the soccer goalie problem noted previously. It's also the way that mixed strategy equilibria are traditionally introduced, for in this game there are no pure strategy equilibria, only a mixed strategy equilibrium. In the simplest case, the players weight each option equally, although when we change around payoffs and such this changes (how to figure out mixed strategy Nash equilibria is a topic for, perhaps, another post, although I'm not sure that it is sufficiently useful for the audience of these posts to be worth doing). Even this seemingly simple game, however, can be made endlessly complex by a motivated economist...
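The same brute-force check used for the coordination games above shows what makes matching pennies different: no cell of the matrix survives as a pure equilibrium. A sketch with the standard win/lose payoffs:

```python
# Matching pennies: row wins on a match, column wins on a mismatch.
payoffs = {
    ("H", "H"): (1, -1),
    ("H", "T"): (-1, 1),
    ("T", "H"): (-1, 1),
    ("T", "T"): (1, -1),
}
strategies = ["H", "T"]

def pure_nash(payoffs, strategies):
    """Return all strategy pairs where neither player gains by deviating alone."""
    eq = []
    for r in strategies:
        for c in strategies:
            u_r, u_c = payoffs[(r, c)]
            row_best = all(u_r >= payoffs[(r2, c)][0] for r2 in strategies)
            col_best = all(u_c >= payoffs[(r, c2)][1] for c2 in strategies)
            if row_best and col_best:
                eq.append((r, c))
    return eq

print(pure_nash(payoffs, strategies))  # []: no pure strategy equilibrium
```

In every cell, exactly one player wants to switch, so play only settles down once both players randomize; with these symmetric payoffs, the unique mixed equilibrium has each playing heads half the time.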

Stag Hunt

Today's last game is kind of a hybrid between the PD and battle of the sexes. The story is that two people are hunting, and they may either choose to hunt stag or hare; it takes them both to successfully hunt stag, while either acting alone may successfully hunt hare. But, of course, one stag is a lot more meat than two hare...


The key is that there are two pure Nash equilibria, in coordination game fashion, and while one is better for both players than the other, each player may get a consolation payoff by choosing the strategy corresponding to the worse equilibrium. So if players trust one another (cooperatively hunt stag), they do well; if they do not trust one another, they do less well (hunt hare). If one person trusts and the other doesn't, then the poor fool who trusts gets a sucker's payoff, and the untrusting one still does a'ight. In lots of cases, stag hunt is a good alternative model to the PD. Brian Skyrms (who taught me all the evolutionary game theory I know) has written a really good book on the subject, although you can also get the heart of it for free in lecture form. This is a really rich game; Skyrms's lecture will give you a taste of it, but this post is already too long, so I won't discuss it beyond noting its existence.
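The trust point can be made precise: hunting stag is only worth it if you are confident enough that the other hunter will too. A sketch with illustrative payoffs of my own choosing (4 for stag together, 0 for the sucker, 3 for hare either way), computing the confidence threshold:

```python
from fractions import Fraction

# Illustrative stag hunt payoffs (my numbers, not the post's): stag together
# beats hare, but hare is safe regardless of what the other hunter does.
STAG_TOGETHER, SUCKER, HARE = 4, 0, 3

# If you believe the other hunter goes for stag with probability q, hunting
# stag pays STAG_TOGETHER*q + SUCKER*(1-q), while hare always pays HARE.
# Stag is worth the risk only above this trust threshold:
threshold = Fraction(HARE - SUCKER, STAG_TOGETHER - SUCKER)
print(threshold)  # 3/4: you need to be quite confident before trusting
```

With these numbers, both (stag, stag) and (hare, hare) are equilibria, but you only hunt stag if you think your partner will do so at least three quarters of the time; this is the sense in which the better equilibrium is payoff-dominant while the safe one is risk-dominant.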

I don't think I'll be able to keep up this post-a-day pace for much longer, but more to come!

* These diagrams are all intuitive simplifications, and elide some edge cases where the diagrams given might not constitute the game in question. For the PD, for example, there is a more complex condition that might be added in repeated play, as described in Robert Axelrod & William D. Hamilton, The Evolution of Cooperation, 211 Science 1390 (1981). Finally, I am lousy at graphics---just have zero visual intuition---so my apologies if there are any errors or weird typos in the pictures...

Posted by Paul Gowder on January 23, 2015 at 08:55 AM in Games | Permalink


Another nice post. Thanks! I think the PD has been offered as a partial explanation for the existence of international organizations, although I can't think of any specific references off hand. The theory goes that states create international organizations (with enforcement regimes) and then commit to membership in those organizations for the purpose of preventing others (and themselves) from defecting so that everyone can reap the mutual benefits of cooperation.

Posted by: Stuart Ford | Jan 23, 2015 11:05:12 AM
