Basic Problem of Rational Choice.

Let's keep things simple and focus on one significant side of rational choice, rational ways to seek one's own welfare. What counts as a rational choice even in this sort of case is controversial.  However, a common approach begins from a simple idea.  If you have two choices, $50 and a kick in the head, unless something weird is going on, you should take the $50.  The reason is simple.  $50 is a good thing to have and a kick in the head is not.  To introduce a bit of jargon, having $50 gives you more utility than does a kick in the head.  

Were all our choices this straightforward, we would not need a theory of rational choice.  But things usually are not so straightforward, for a number of reasons.  One reason is that we generally lack important information.  First, we often do not know all the actions that are available to us.  Second, even when we know a certain option is available, we generally do not know with any certainty what will occur if we take that option.  Even simple actions can have surprising consequences, and those consequences can spread into a largely unknowable future.  Finally, even if I know what will happen when I choose an action, I might have trouble determining whether that action will produce more utility for me than the alternatives.  $50 beats a kick in the head, but it is less clear whether a trip to Paris beats a trip to Rome, or whether the purchase of a new computer beats a trip to Paris or Rome.  Those who have any doubts about this need only reflect on how common it is to think it is sooooo important to get some doodad or other, only to realize, after acquiring it, that it's not so important after all.  

So  we usually make choices in a state of uncertainty.  Unfortunately, much of this uncertainty cannot be avoided.  There is no way to be certain, in the real world, whether one has identified all one's alternatives, or even the most important ones.  It is impossible to know with any certainty the consequences of actions, especially in the long run.  And it is often very difficult to know which things will produce more and which less utility for me.  

The basic problem of rational choice is the problem of how to choose when faced with all this uncertainty.  Is choice nothing but groping in the dark or is there a genuinely rational way to proceed when surrounded by darkness?  

Expected Utility Theory.  

As I said, it is not possible to do away with all the uncertainties involved in a choice.  But it is important to find strategies for dealing with some of this uncertainty.  One fruitful idea comes from gambling theory.  Suppose that we take a relatively simple game, say, Roulette, in which much uncertainty is removed.  I do not actually know much about Roulette, and had to make a quick check on a gambling website to find out the little I know -- or think I know, since I have no idea whether the website is accurate, one of those pesky cases of uncertainty.  When the wheel is spun, there are 38 pockets into which the ball might fall in the American version of the game.  Given that one is going to play, there are a number of choices one might make.  One may bet a sum of money on red or black, on odd or even, on any particular number, on pairs of numbers, and so on.  In Roulette, all your possible choices are known, so that kind of uncertainty is gone.  Given the rules of the game, you also know the outcomes, that is, the payoffs, if you win or lose.  For example, if you bet on a single number and win, the payout is 35 to 1, while if you bet on any two adjoining numbers and win, the payout is 17 to 1.  So the main thing that you do not know is where that little ball is going to land.  But even here you can have some knowledge, knowledge of the probability that any particular bet will win, assuming the wheel is fair.  You know that if you bet on the single number 5, your odds against winning are 37 to 1.  If you bet on adjoining numbers, your odds against winning are 18 to 1.  So what is a gambler to do?  How would a rational gambler decide how to bet?  

A rational gambler probably would not be greatly impressed by the advice embodied in Kenny Rogers's nifty tune.   

 As it turns out, an approach to rationality for games like Roulette has been known for some time.  It is called 'expected utility theory'.  It is, at bottom, the theory of rational gambling.  To use it, certain things need to be known.  First, you must know the alternative actions available to you.  In Roulette, these are the possible bets.  Second, you must know the consequences of each alternative.  In Roulette, if you lose, you are out the sum you bet.  If  you win, you are paid according to a set schedule, say $35 if you bet $1 on a single number.  Third, you need to know the probabilities of each outcome.  In Roulette, this is easy to determine based on the number of slots in the wheel.  Expected utility theory tells you how to combine all this information so as to make a rational choice.   

Let us leave Roulette for the moment.  We will return to it shortly.  Suppose we have two choices, A and B.  We do not know for sure what will happen if we choose A or B, but we know the possible outcomes, the value of each outcome to us, and the probability that each will occur.  To decide which to choose, we do a particular calculation, the expected utility calculation, and then choose the act with the greatest expected utility.  The calculation is simple.  For each of our options, we take the value of each possible outcome, multiply it by its probability, and add the results together.    
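
For readers who like to see the rule written out, here is a minimal sketch of that calculation in Python.  The names (expected_utility, best_act) and the choice to represent an act as a list of value-probability pairs are my own illustrative conventions, not part of the theory itself.

    # Represent each act as a list of (value, probability) pairs,
    # one pair for each of its possible outcomes.
    def expected_utility(outcomes):
        # Multiply each outcome's value by its probability and add the results.
        return sum(value * prob for value, prob in outcomes)

    def best_act(acts):
        # Given a dict mapping act names to outcome lists, pick the act
        # with the greatest expected utility.
        return max(acts, key=lambda name: expected_utility(acts[name]))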

For example, suppose I have two acts open to me, Act A and Act B.  And suppose that there are two possible outcomes for each of the acts.  The values of these outcomes are represented in the chart below.   

The Choice    Outcome 1    Outcome 2
Act A         $5           $20
Act B         $12          $2

If this is all the information I have, I cannot easily decide what to do.  Act A would be best if Outcome 2 occurs, but Act B would be best if Outcome 1 occurs.  To apply expected utility theory to decide which is my best choice, I need one more bit of information, the probabilities that Outcome 1 and Outcome 2 will occur.   

First, assume that whether we have Outcome 1 or 2 is determined by the flip of a fair coin.  So the odds are 50/50.  The expected utility calculation is shown below.  Since the expected utility of A is higher than that of B, A should be performed.  

EU of A = $5 (.5) + $20 (.5) = $12.50

EU of B = $12 (.5) + $2 (.5) = $7 

How does it work if the probabilities are not so simple?  Just to mix things up a bit, suppose that if you choose A, there is a .9 probability of getting $5 and a .1 probability of getting $20.  And if you do B, there is a .6 probability of getting $12 and a .4 probability of getting $2.  As complex as this sounds, expected utility is calculated in exactly the same way, as shown below.  Here we should choose to do B since its expected utility is greater than that of doing A.  

 

EU of A = $5 (.9) + $20 (.1) = $6.50

EU of B = $12(.6) + $2 (.4) = $8 
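
If you would like to check these figures yourself, the arithmetic for both versions of the example can be written out in a few lines of Python (the numbers are simply those from the chart and the probabilities given above):

    # First version: Outcome 1 and Outcome 2 each have probability .5.
    eu_a = 5 * 0.5 + 20 * 0.5     # 12.5
    eu_b = 12 * 0.5 + 2 * 0.5     # 7.0

    # Second version: each act has its own probabilities.
    eu_a2 = 5 * 0.9 + 20 * 0.1    # 6.5
    eu_b2 = 12 * 0.6 + 2 * 0.4    # 8.0

    print(eu_a, eu_b)    # A has the higher expected utility in the first version
    print(eu_a2, eu_b2)  # B has the higher expected utility in the second version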

What Do the Expected Utility Numbers Mean?

This is a serious question.  After all, in our first example, if you choose act A, you will receive either $5 or $20.  You cannot win $12.50.  The expected utility of an action simply isn't one of the possible outcomes.  So what is it?  The idea is this.  Sticking with the first example, if you face this choice repeatedly, and if you repeatedly choose to do A, you will sometimes win $5 and you will sometimes win $20.  But over the long run, your average win will be $12.50.  If this choice is viewed as a game, then over the long run, you will probably be better off playing A rather than  B.  I say 'probably' and not 'certainly' since we are dealing with, well, probabilities.  But the probabilities are good enough to make A the rational choice.   Of course, in the real world, each 'game' may be played only once.  But if, as you go from game to game, you repeatedly make the choice that has the highest expected utility, even though the games change, over the long run, your odds of doing well will be best. 
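
One way to see the long-run point is to simulate it.  The sketch below is plain Python using only the standard random module; the number of trials is an arbitrary choice of mine.  It plays Act A from the first example over and over and reports the average winnings, which should settle close to the expected utility of $12.50.

    import random

    # Play Act A from the first example repeatedly:
    # win $5 or $20, each with probability .5.
    trials = 100_000
    total = sum(5 if random.random() < 0.5 else 20 for _ in range(trials))
    print(total / trials)  # average win per play; drifts toward $12.50 as trials grow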

Back to Roulette!

We began this discussion of rational choice with the example of Roulette.  To recap, in American Roulette, there are 38 slots for the ball to fall into.  There are set bets; for example, you can bet on a single number, on adjoining pairs, and so on.  And there are set payoffs if you win each of these bets.  Let's try an expected utility calculation for a particular bet, say, $1 on the number 5.  A win gives you $35, and your chance of winning is one in 38.  A loss means you are down $1, and the chance of this loss is 37 out of 38.  We calculate the expected utility of placing the bet as follows:

EU of the bet = $35 (1/38) - $1(37/38) = -.053  

Note that rather than adding two numbers, I subtracted the second from the first.  That is because the second is a loss.  And the resulting expected utility of putting $1 on the number 5 is a loss.  That is to say, if you repeatedly put $1 on the number 5, sometimes you will win and sometimes you will lose.  But over the long run, you will average a loss of about 5 cents.  So is there a better bet at the American Roulette table?  If you crunch the numbers, it turns out that given the payoffs from winning, and the odds of winning, virtually all bets have an expected utility of -.053 per dollar bet.  The only exception, as I understand it, is a bet called the 'Top Line', where you bet on the combination of 0, 00, 1, 2, and 3.  Here the expected utility is -.079.  Oops!  Hardly an improvement.  
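
Here is the same arithmetic written out in Python for the two Roulette bets just discussed.  The 6 to 1 payout on the 'Top Line' bet is not stated above; it is the standard American payout as I understand it, and it is the figure that yields the -.079 result.

    # $1 on a single number: win $35 with probability 1/38,
    # lose the $1 with probability 37/38.
    eu_single = 35 * (1 / 38) - 1 * (37 / 38)   # about -0.053

    # $1 on the 'Top Line' (0, 00, 1, 2, 3): assuming a 6 to 1 payout,
    # win $6 with probability 5/38, lose the $1 with probability 33/38.
    eu_top_line = 6 * (5 / 38) - 1 * (33 / 38)  # about -0.079

    print(round(eu_single, 3), round(eu_top_line, 3))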

What lesson is to be learned here?  Since not betting leaves you with your $1 for certain, while betting it leaves you with an expected return of only about 95 cents, a simple application of expected utility theory says you should not play.  As common sense would put it, in the long run, and usually in the short run, the house wins.  For every winner, there are more, or bigger, losers.  Does this mean you should not play?  Hardly.  If your only goal is to make money, expected utility theory says to avoid Roulette.  But many people do not play just to make money.  They play because they enjoy gambling.  Win or lose, it may be a good time.  
