## Monday, May 16, 2011

### Markov Chains

Today we're going to talk about game theory. What we're going to do is try to solve games, or small portions of them. Some games can be solved completely. This isn't true of other games, either because they're computationally too difficult to master, or because they contain elements that can't easily be reduced to equations. By and large, though, I'm not interested in completely solving games; when you have a computer that will play the game for you, then necessarily you're not playing the game, which isn't an outcome I'd like to end with. So I'm only going to attempt to solve bits of games.

We start with RISK. If I take one soldier and attack one soldier, what are my odds of winning? Two dice are rolled, one attacking and one defending. This gives us 36 possible outcomes:

Every (vertical) pair of dice gives you one possible pair of rolls. Each of these pairs has the same chance of occurring, and each one defines a winner on that roll. As ties go to the defender, everything on that center line (and everything below it) constitutes a kill for the defender, and everything above it a kill for the attacker. So your gladiator has a 15/36 (about 42%) chance of winning, and a 21/36 (about 58%) chance of dying gloriously and pointlessly.
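If you'd rather have a computer count the squares for you, here's a quick sketch (my own illustration, nothing official) that enumerates all 36 pairs of rolls and counts the ones where the attacker's die comes up strictly higher:

```python
from itertools import product

# Enumerate every (attacker, defender) pair of d6 rolls.
# The attacker kills only on a strictly higher roll; ties go
# to the defender.
attacker_wins = sum(a > d for a, d in product(range(1, 7), repeat=2))

print(attacker_wins, "/ 36")  # 15 / 36, about 42%
```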

So there you go; pure math has taught you the same thing you knew instinctively about 30 seconds into your first game of RISK. And it only took several times as long. If you're so inclined, you can work out similar dice arrays for 2v1, 3v1, 2v2 and 3v2 battles. It'll take you a while, though. But you'll still be left with problems: what happens if you have seven attacking five? It won't be over in just one round, so you'll be left with a new problem. And you can't exactly map out a dice array for that. Well, you could, but it'd get incredibly large and complicated. So to do things properly, we have to resort to the math.

No no, come back! I promise I'll only describe the math! Please?

The solution is to use Markov chains. It's a way of diagramming out these sequences of probabilistic events. Here, let me show you one:

Russian roulette; a more classic example there isn't. Every time you spin the chamber you've got a one in six chance of finding the bullet with your name on it. As long as you don't, you stay in the alive state; once you do, you transfer to the dead state. Once you're dead, you don't go back to the alive state. Unless you're Hindu. But back on track here: you can set up a matrix representing your states and the probability of transferring from one to the next, and use that matrix to answer questions. Questions like "How long will I keep spinning this damn revolver before something interesting happens?" Or, more germane to the blog, questions like "If I invade Siam, will I win the battle, and how many guys will I have left over afterwards?" That's the sort of answer that can help you win games.
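To make the matrix concrete, here's a tiny sketch (mine, with made-up names) of the Russian roulette chain: two states, one transition matrix, and the standard absorbing-chain trick for getting the expected number of spins:

```python
import numpy as np

# States: 0 = alive, 1 = dead. Each row gives the probabilities of
# moving from that state to each state on the next spin.
P = np.array([
    [5/6, 1/6],  # alive -> alive, alive -> dead
    [0.0, 1.0],  # dead is absorbing: you stay dead
])

# Probability of still being alive after n spins is the
# (alive, alive) entry of P raised to the nth power.
alive_after_6 = np.linalg.matrix_power(P, 6)[0, 0]

# Expected spins before dying: with only one transient state, the
# fundamental matrix N = (I - Q)^-1 collapses to 1 / (1 - 5/6) = 6.
expected_spins = 1 / (1 - P[0, 0])

print(alive_after_6)   # (5/6)^6, about 0.335
print(expected_spins)  # 6.0
```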

So get out your graphing calculators and-- Just kidding. I won't make you do any of the math. The short version is that the numbers work out about the same for most numbers of attackers and defenders: you're favored on the attack when you're rolling all three dice, and you want to hit single-unit defenders as often as possible. In general, you'll lose one unit for every defending unit you kill when they're rolling two dice. Oh, and bring as many armies as you can manage, so you don't get pushed below three dice.
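I promised to only describe the math, so skip this if you like; but for the curious, here's a sketch (my own, not from any rulebook) of the whole computation. It brute-forces the per-round loss distribution by enumerating dice, then chains rounds together recursively, which is exactly the Markov-chain absorption calculation. One labeled assumption: I'm counting `attackers` as troops free to roll, ignoring the one who has to stay home:

```python
from functools import lru_cache
from itertools import product

def round_losses(n_att, n_def):
    """Distribution of (attacker losses, defender losses) for one
    round with n_att attacking dice and n_def defending dice."""
    counts = {}
    total = 0
    for rolls in product(range(1, 7), repeat=n_att + n_def):
        att = sorted(rolls[:n_att], reverse=True)
        dfn = sorted(rolls[n_att:], reverse=True)
        a_loss = d_loss = 0
        for a, d in zip(att, dfn):  # pair off the highest dice
            if a > d:
                d_loss += 1         # attacker kills on a higher roll
            else:
                a_loss += 1         # ties go to the defender
        counts[(a_loss, d_loss)] = counts.get((a_loss, d_loss), 0) + 1
        total += 1
    return {k: v / total for k, v in counts.items()}

@lru_cache(maxsize=None)
def p_attacker_wins(attackers, defenders):
    """Probability the attacking stack wipes out all defenders."""
    if defenders == 0:
        return 1.0
    if attackers == 0:
        return 0.0
    dist = round_losses(min(3, attackers), min(2, defenders))
    return sum(p * p_attacker_wins(attackers - al, defenders - dl)
               for (al, dl), p in dist.items())

print(p_attacker_wins(7, 5))
```

The memoized recursion is the same thing as raising the transition matrix over battle states to a power; the recursion just skips building the matrix explicitly.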

Again, these aren't terribly useful revelations. You probably figured most of them out on your own playing the game. I mean, attack where you are strong and he is weak? That's straight out of frikken Sun Tzu. The really useful one is the losing-one-for-one bit. I get that impression playing the game, but I only trust my impressions so far, so having actual numbers helps me plan that sort of battle accordingly. But the theory ought to give more interesting results in a different context. Say, Axis and Allies.

Talking with Barbarossa tonight, we discussed two hypothetical battles.

1) Russia uses its transport to sneak-attack Germany: battleship bombard, infantry, tank and fighter against three tanks. Who wins? How many units die?
2) Germany attacks a British cruiser and battleship with two subs, a fighter and a tactical bomber. What are the odds that Germany gets out of it without losing any air units?

And on and on. The major conflict of the game is based on stretching your resources as far as they can go without bursting them. (Oh, and storywise, a little altercation called the Second World War.) So if you could build and solve a matrix for these battles, you could figure out your odds much more precisely, and hopefully give yourself an edge in future games.
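While the exact matrix eludes me, you can always cheat and simulate. Here's a Monte Carlo sketch of battle 1 under my own simplified reading of the rules: every unit rolls one d6 per round, hits on a roll at or below its combat value, both sides fire simultaneously, and the attacker loses cheapest units first. The combat values and the one-shot bombardment are my assumptions, not gospel; check them against your edition before trusting the answer:

```python
import random

# Assumed combat values (attack side): infantry 1, tank 3, fighter 3.
# Assumed defense: three tanks defending on 3. Battleship bombardment
# is treated as a single pre-battle volley hitting on 4 or less, with
# its casualties removed before they can fire back -- a simplification.
ATTACKERS = [1, 3, 3]
DEFENDERS = [3, 3, 3]
BOMBARD = 4

def hits(values):
    """One volley: each unit rolls a d6 and hits on <= its value."""
    return sum(random.randint(1, 6) <= v for v in values)

def attacker_wins_once():
    att = sorted(ATTACKERS)        # low-value casualties come off first
    dfn = list(DEFENDERS)
    dfn = dfn[hits([BOMBARD]):]    # one-time shore bombardment
    while att and dfn:
        a_hits, d_hits = hits(att), hits(dfn)  # simultaneous fire
        dfn = dfn[a_hits:]
        att = att[d_hits:]
    return bool(att)               # attacker wins if anyone survives

random.seed(0)
trials = 100_000
p_win = sum(attacker_wins_once() for _ in range(trials)) / trials
print(f"Russia wins about {p_win:.1%} of the time, given these assumptions")
```

A simulation won't give exact fractions the way the Markov matrix would, but with enough trials it'll get you close enough to plan a turn around.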

I say that generally because, so far, my attempts at building an algorithm have come to naught; there are vastly more potential game states for a given number of Axis and Allies units than for the homogeneous RISK armies.

I'll let you know if I figure anything out.