I'm trying to learn about the art of game design (that's why I'm here). I have a background in economics and game theory. The games I have designed to date have been "games" for the sake of economics/game theory experiments, not for entertainment. Anyway, on this thread, pelle pointed me toward the book *Rules of Play* by Salen and Zimmerman. I was able to get a copy from the library right away, and I'm checking it out.

I don't intend this to be a full review by any means. This is a chatty, uncareful discussion of their section on game theory (starting page 232 in my book). There are mistakes, and I'm trying to figure out if those mistakes make it more harmful than helpful to a game designer, or make it an appropriate simplification.

The section got my hackles up right away by saying, "Although it caused quite a sensation when it was introduced, the promises of game theory were never quite fulfilled, and it has largely fallen out of favor as a methodology within economics." As I mentioned in that other thread, this is simply not true. Game theory is hugely important in economics. This might not matter to a game designer, but it was a mistake which made me worry.

The sections on Decision Trees and Strategies are OK, although one could take the wrong message from some of it. First, there's an implication (which I think is just a bit of mis-phrasing, not a serious error) that a strategy is something selected from reasonable/non-stupid play on a game tree. Since they're invoking the technical idea of a strategy: a strategy is anything which fully defines your play. From a programming standpoint, a strategy is the program you give to a robot to tell it how to play the game, no matter what your opponent does, sensible or moronic. I think overall they describe a strategy well. Second, "players make a finite number of clear decisions that have knowable outcomes." Technically, a game *tree* may need a finite number of moves, but continuous choices (say, setting a price) are handled by continuous variations on the tree idea all the time.
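To make the robot-program idea concrete, here's a toy Python sketch of my own (not from the book). In a repeated Prisoner's Dilemma, a strategy is just a function from the opponent's move history to a next move, and it has to answer *every* possible history, sensible or moronic:

```python
# Toy sketch (mine, not the book's): a "strategy" in the game-theory sense
# is a complete rule of play -- a function from any game history to a move,
# covering even histories a sensible opponent would never produce.

def tit_for_tat(opponent_history):
    """Repeated Prisoner's Dilemma strategy: cooperate first,
    then copy whatever the opponent did last round."""
    if not opponent_history:        # no rounds played yet
        return "C"
    return opponent_history[-1]     # opponent's last move, "C" or "D"

def always_defect(opponent_history):
    """A strategy need not be clever -- this one ignores the history."""
    return "D"

# A strategy answers every possible history:
print(tit_for_tat([]))             # "C"
print(tit_for_tat(["C", "D"]))     # "D"
```

The point is that the function is total: it specifies play against moronic opponents just as completely as against sensible ones.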

Probably a minor point: the concept of utility long predates von Neumann and Morgenstern, although they did formalize expected utility theory (which is used heavily whenever randomness comes into play, and may be the single most criticized-but-essential piece of standard economic theory).
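For the curious, expected utility is easy to illustrate; this little example is mine, not the book's. With a risk-averse (concave) utility function, a sure $55 beats a 50/50 gamble between $100 and $10, even though the expected dollar amounts are identical:

```python
# Toy illustration (my own, not from the book): the expected utility of a
# gamble is the probability-weighted average of the utilities of its outcomes.
import math

def expected_utility(lottery, utility):
    """lottery: list of (probability, payoff) pairs."""
    return sum(p * utility(x) for p, x in lottery)

u = math.log  # a concave, i.e. risk-averse, utility function

# 50/50 gamble between $100 and $10 vs. a sure $55 (same expected dollars):
gamble = [(0.5, 100), (0.5, 10)]
sure   = [(1.0, 55)]

# The risk-averse agent prefers the sure thing:
print(expected_utility(gamble, u) < expected_utility(sure, u))  # True
```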

In the "Game Theory Games" section there are a number of simplifications. Some are more misleading than others.

- "Usually, game theory limits itself to games with only two players."
  - This is wrong. Economists and game theorists look at games with many players all the time.
- "simultaneity"
  - This is confusing. In a strict, technical sense it may be correct, but given all the time they spent previously talking about decision trees, "simultaneous" doesn't seem to capture it. There are kinds of game theory for simultaneous moves and kinds of game theory for sequences of moves.
- "utility"
  - I think they've got this OK. Utility is a number which is supposed to capture overall happiness with a situation. There might be quibbles, but nothing big and wrong here.
- "rational players"
  - This is a simplification which may or may not be appropriate. The core of game theory does assume rational (in the economic sense) players; that's always where the analysis starts. Behavioral game theory includes many ways to weaken this assumption and allow "boundedly rational" players. No single way of weakening rationality has won the day yet, and everyone starts with rationality until they see a specific reason to try something else. But there are "something else"s available.
- game theory game
  - There's a bit of uncarefulness in the way they describe things which I wouldn't accept from a student in a game theory course, but I'm not positive how harmful it is in this context. In game theory there are **games** and there are **solution concepts**. A **game** doesn't assume anything about rationality; it describes what the possible choices are and how those choices lead to outcomes. A **solution concept** is a method for looking at a game and trying to figure out what would actually happen if that game were played. The most common solution concepts (Nash equilibrium and its variants) DO assume rationality. This distinction between a game and a solution concept is important to a game theorist; I'm not positive it's important to a game designer.
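Since the book doesn't draw this distinction, here's my own minimal Python sketch of it. The *game* is nothing but data; the *solution concept* (here, pure-strategy Nash equilibrium) is a separate procedure applied to that data:

```python
# Sketch (my example, not the book's) of the game / solution-concept split.
# The *game* is just data: choices and the payoffs they lead to.
# Prisoner's Dilemma payoffs as (row player, column player):
pd = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

# A *solution concept* is a separate method applied to that data.
# Pure-strategy Nash equilibrium: no player gains by deviating alone.
def pure_nash(game):
    rows = {r for r, _ in game}
    cols = {c for _, c in game}
    equilibria = []
    for r in rows:
        for c in cols:
            u_r, u_c = game[(r, c)]
            row_ok = all(game[(r2, c)][0] <= u_r for r2 in rows)
            col_ok = all(game[(r, c2)][1] <= u_c for c2 in cols)
            if row_ok and col_ok:
                equilibria.append((r, c))
    return equilibria

print(pure_nash(pd))  # [('D', 'D')]
```

Notice that `pd` itself says nothing about rationality; the rationality assumption lives entirely in `pure_nash`.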

Getting to some meat: the descriptions of the three classic teaching games (Cake Division, Playing for Pennies, Prisoner's Dilemma) are wrong or misleading in various ways. Best to just ignore them.
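If you want a sense of why Playing for Pennies (usually called Matching Pennies) is interesting, here's my own sketch rather than the book's description: it has no pure-strategy equilibrium at all, which is exactly what motivates randomized (mixed) strategies:

```python
# Sketch (mine, not the book's): Matching Pennies. Each player shows Heads
# or Tails; the row player wins a penny on a match, the column player wins
# on a mismatch. Payoffs as (row player, column player):
mp = {
    ("H", "H"): (1, -1),
    ("H", "T"): (-1, 1),
    ("T", "H"): (-1, 1),
    ("T", "T"): (1, -1),
}

# In every cell, someone wants to switch -- so no pure-strategy Nash
# equilibrium exists, and the interesting answer is the mixed equilibrium
# where both players randomize 50/50.
def has_pure_nash(game):
    rows = {r for r, _ in game}
    cols = {c for _, c in game}
    for r in rows:
        for c in cols:
            u_r, u_c = game[(r, c)]
            if all(game[(r2, c)][0] <= u_r for r2 in rows) and \
               all(game[(r, c2)][1] <= u_c for c2 in cols):
                return True
    return False

print(has_pure_nash(mp))  # False
```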

Overall, this is a bad way for a game designer to be introduced to game theory.