Thursday, November 23, 2006

The Kelly Criterion

When I read William Poundstone’s Fortune’s Formula: The Untold Story of the Scientific Betting System that Beat the Casinos and Wall Street, I was intrigued by his discussion of the Kelly Criterion for gambling and investing. But since the book was written for a general audience, I was not satisfied with the mathematical level of his description of the Kelly Criterion. So I decided to read more about it to see if I could understand the math a bit better. Below is the result of my endeavors. I think I understand the math now, but that still does not answer the question of whether the Kelly Criterion is appropriate for the average person to use for gambling or investing. But at least it helps to understand the math. (To understand the math in this post, you need a first course in calculus.)

The Kelly Criterion

Suppose you are gambling. You are going to play a sequence of games. If you win a game, you win W dollars for each dollar bet. If you lose, you lose your bet. For each game, the probability is p that you will win and q (equal to 1 – p) that you will lose. You start out with a bankroll of B dollars. You bet some percentage, f, of your bankroll on the first game; that is, you bet fB dollars. After the first game, your bankroll is some new amount, B’ dollars, depending on whether you won or lost. You then bet the same percentage, f, of your new bankroll on the second game; that is, you bet fB’ dollars. And so on: for each game, you bet the same percentage, f, of your current bankroll. The problem is: what value of f should you choose? The value of f that maximizes your expected gain per game is called the Kelly Criterion.

To see why this problem is not as easy as it might sound, suppose you can play a game where you flip a perfect coin, with probability of 1/2 for heads and 1/2 for tails.

If you bet on heads and it comes up heads, you win $100 for each dollar bet; if it comes up tails, you lose your bet. What fraction of your bankroll should you bet? Since the odds are overwhelmingly in your favor, you might at first think you should bet your entire bankroll on each game. However, if you do, you will eventually lose a game and thus lose all your money. So what fraction of your money should you bet? If you bet too much, you will go bankrupt. If you bet too little, you will not make as much money as you could from this excellent opportunity. We will give the answer later.
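
Here is a little Python sketch (my own, with arbitrarily chosen fractions, not an answer from the book) that simulates this coin-flip game for a few betting fractions. It shows the dilemma directly.

    import random

    def simulate(f, p=0.5, W=100, games=200, bankroll=1.0, seed=1):
        # Repeatedly bet a fraction f of the current bankroll on a game that
        # pays W per dollar bet with probability p and loses the bet otherwise.
        rng = random.Random(seed)
        for _ in range(games):
            bet = f * bankroll
            if rng.random() < p:
                bankroll += W * bet
            else:
                bankroll -= bet
        return bankroll

    for f in (1.0, 0.9, 0.5, 0.1):
        print(f, simulate(f))

With f = 1.0 the bankroll hits zero at the first loss and stays there; the other fractions all grow, though at very different rates, which suggests there is some best fraction in between.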

Winning W Dollars Or Losing Your Bet

To see how the betting works in the more general case where you win W dollars for each dollar bet with probability p, and lose your bet with probability q, suppose your initial bankroll is B. In the first game you bet fB. If you win, your new bankroll is

B’ = B + WfB

Therefore on the second game you will bet

fB’ = f(1 + fW)B

If you are lucky enough to win again, your new bankroll will be

B” = (1 + fW)B’ = (1 + fW)^2 * B

If you then lose the third game, your bankroll will be

B’’’ = (1 – f)B” = (1 + fW)^2 * (1 – f) * B

and so on. Suppose that after n games, you have won w times and lost l times. Then your bankroll will be

(1 + fW)^w * (1 – f)^l * B

Note that the final value of your bankroll does not depend on the order in which you win and lose games, just on how many games you win and lose.
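
As a quick sanity check, here is a short Python sketch (my own, with arbitrarily chosen numbers f = 0.2 and W = 3) that applies a particular win/lose sequence one game at a time and compares the result with this formula; shuffling the sequence does not change the final bankroll.

    import random

    def play_sequence(outcomes, f, W, B):
        # Apply each win (True) or loss (False) to the bankroll in turn.
        for win in outcomes:
            B = B * (1 + f * W) if win else B * (1 - f)
        return B

    f, W, B = 0.2, 3.0, 100.0
    outcomes = [True, False, True, True, False]
    w, l = outcomes.count(True), outcomes.count(False)

    print(play_sequence(outcomes, f, W, B))       # step by step
    print((1 + f * W) ** w * (1 - f) ** l * B)    # closed form
    random.shuffle(outcomes)
    print(play_sequence(outcomes, f, W, B))       # same result in a different order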

Thus the gain in your bankroll after n games is

(1 + fW)^w * (1 – f)^l

The expected value of this gain per game, G, is the limit as n approaches infinity of the nth root of this gain (the geometric mean),

G = lim (n → ∞) (1 + fW)^(w/n) * (1 – f)^(l/n)

Since the limit as n approaches infinity of w/n is p and the limit of l/n is q, this becomes

G = (1 + fW)^p * (1 – f)^q

Note that in our original coin-flip example (or in any game where a loss costs you your entire bet), if you bet your entire bankroll on each game (that is, if you select f = 1), then G = 0: the expected gain per game is 0, because a single loss wipes out the bankroll.
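
To see the limit at work, the sketch below (again my own, with an arbitrary fraction f = 0.25) computes the empirical per-game gain over a long run of simulated games; it comes out close to the formula for G.

    import random

    def per_game_gain(f, p, W, n, seed=0):
        # The nth root of the gain after n games, i.e.
        # (1 + fW)^(w/n) * (1 - f)^(l/n) for the simulated counts w and l.
        rng = random.Random(seed)
        w = sum(rng.random() < p for _ in range(n))
        l = n - w
        return (1 + f * W) ** (w / n) * (1 - f) ** (l / n)

    p, W, f = 0.5, 100.0, 0.25
    print(per_game_gain(f, p, W, 10000))
    print((1 + f * W) ** p * (1 - f) ** (1 - p))   # G, the limiting value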

The expected value of your bankroll after n games is

G^n * B

Because this gain is exponential, the expected rate of gain per game, R (sometimes called the growth rate), is the logarithm of the expected gain per game.

R = log G = p * log(1 + fW) + q * log (1 – f)

(All logarithms in this paper are natural logarithms to the base e.)

We want to find the value of f that maximizes this expected rate of gain per game.

(Note that maximizing the expected rate of gain also maximizes the expected final value of your bankroll.)
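
Before doing the calculus, it is easy to find this maximum numerically. The Python sketch below (my own, using the coin-flip example from above) simply evaluates R on a grid of fractions and picks the best one; the answer agrees with the formula derived next.

    import math

    def growth_rate(f, p, W):
        # R = p * log(1 + f W) + q * log(1 - f)
        return p * math.log(1 + f * W) + (1 - p) * math.log(1 - f)

    p, W = 0.5, 100.0                        # the coin-flip example
    fs = [i / 1000 for i in range(1, 1000)]  # candidate fractions 0.001 .. 0.999
    best = max(fs, key=lambda f: growth_rate(f, p, W))
    print(best, growth_rate(best, p, W))     # prints 0.495 and about 1.62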


Taking the derivative with respect to f and setting that derivative equal to 0 gives

pW / (1 + fW) – q / (1 – f) = 0

Multiplying through by (1 + fW)(1 – f) gives pW(1 – f) = q(1 + fW), and solving for f gives

f = (pW – q) / W

This value of f is called the Kelly Criterion. People sometimes say that this equation for the Kelly Criterion can be interpreted as Edge/Odds: (pW – q) is your edge (the amount you expect to win per dollar bet on each game), and W is the odds.

Note that

f = (pW – q) / W = p – q / W

so that f is never more than p, no matter how favorable the odds.

In our first example, where W = 100 and p = 1/2, the value of f is

f = (pW – q) / W = (50 - .5) / 100 = .495

Your expected gain per game is then

G = (1 + .495 * 100)^(1/2) * (1 – .495)^(1/2) = sqrt(50.5 * .505) = 5.05
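
The same numbers are easy to check in a couple of lines of Python:

    p, q, W = 0.5, 0.5, 100.0
    f = (p * W - q) / W                   # the Kelly fraction
    G = (1 + f * W) ** p * (1 - f) ** q   # expected gain per game
    print(f, G)                           # 0.495 and about 5.05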

One other interesting special case is when W = 1, that is, when you either win 1 dollar or lose 1 dollar for each dollar bet. In that case

f = p – q = 2p - 1

For example, if p = 1/2, then f = 0. You have no advantage and you shouldn’t bet anything.

Winning W Dollars or Losing L Dollars on Each Bet (More Like Investing)

Now consider a more general case, closer to investing, in which on each bet (investment) you might win W dollars or lose L dollars for each dollar bet. (If you assume you cannot lose more than you bet, then L is at most 1.) As before, the probability that you will win is p and the probability that you will lose is q (equal to 1 – p). Also as before, you always bet some percentage f of your current bankroll.

Now the expected gain per game is

G = (1 + fW)^p * (1 – fL)^q

Again, to find the value of f that maximizes the expected rate of gain per game, we first take the log of G and then find the value of f that maximizes that log

R = log G = p * log(1 + fW) + q * log (1 – f L)

Taking the derivative with respect to f and setting that derivative equal to 0 gives

pW / (1 + fW) – qL / (1 – fL) = 0

Solving for f gives

f = (pW – qL) / (WL)

which is the Kelly Criterion for this case.
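
A small helper function (my own sketch; the function name is arbitrary) makes this formula easy to play with, and it reproduces the example worked out next.

    def kelly_fraction(p, W, L):
        # f = (pW - qL) / (WL): the fraction to bet when a win pays W per
        # dollar bet with probability p and a loss costs L per dollar bet.
        q = 1 - p
        return (p * W - q * L) / (W * L)

    print(kelly_fraction(0.5, 1.0, 0.6))   # the example below: 1/3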

As a simple example, suppose you play a game in which you have a probability of 1/2 of doubling your bet and a probability of 1/2 of losing 60% of your bet. In this case

f = (.5 * 1 – .5 * .6) / (1 * .6) = .2 / .6 = 1/3

For this value of f, the expected gain per game is

G = (1 + .33)^(1/2) * (1 – (.33 * .60))^(1/2) = sqrt(1.33 * .80) = 1.0328

To see how this works out in a specific situation, suppose your bankroll is 100. You play two games using this value of f, winning the first and losing the second. On the first game you bet 33.33, and after winning, your bankroll is 133.33. On the second game you bet 1/3 of your bankroll, which is 44.44. You lose 60% of that, which is 26.67. Your final bankroll is

133.33 – 26.67 = 106.67

which could have been calculated directly as

(1 + .33) * (1 – (.33 * .60)) * 100 = 106.67

If you had lost the first game and won the second, the final result would have been the same.

By contrast, if you had bet your entire bankroll on each game, your bankroll would be 200 after the first (winning) game and 80 after the second (losing) game; your expected rate of gain per game would be negative.
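
Here is the same two-game calculation spelled out in Python (my own sketch, using the bet sizes from the example above):

    f, W, L = 1/3, 1.0, 0.6
    bankroll = 100.0

    bet = f * bankroll          # 33.33
    bankroll += W * bet         # win the first game -> 133.33
    bet = f * bankroll          # 44.44
    bankroll -= L * bet         # lose 60% of the second bet -> 106.67
    print(bankroll)

    # The closed form gives the same answer, in either order of win and loss.
    print((1 + f * W) * (1 - f * L) * 100.0)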

N Possible Outcomes

Now consider a still more general case, in which you make a bet and there are n possible outcomes, where outcome xi occurs with probability pi. (For example, you might buy a stock and keep it for some period of time, and at the end of that period there might be n possible values that the stock could have: x1, x2, …, xn.) In this case

G = Prod((1 + f xi)^pi)

Some values of xi might be positive and other values might be negative.

Taking the log of G gives

R = log G = Sum(pi * log(1 + f xi))

The math now gets a bit complicated. However, if f xi << 1 for every outcome, we can use the first two terms of the series expansion

log(1 + z) = z – z^2 / 2 + z^3 / 3 – z^4 / 4 + ….

to obtain

R = log G ≈ f * Sum(pi xi) – (f^2 / 2) * Sum(pi xi^2)

Taking the derivative with respect to f and setting that derivative equal to 0 gives

Sum(pi xi) – f * Sum(pi xi^2) = 0

Solving for f gives

f = Sum(pi xi) / Sum (pi xi^2)

The numerator is the mean of the outcomes, and the denominator is close to the variance of the outcomes (actually slightly larger than the variance), which is

Sum(pi xi^2) – (Sum(pi xi))^2
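
Here is a small Python sketch of this approximation (my own; the two-outcome numbers are invented so the result can be compared with the exact formula from the previous section, with W = 0.20, L = 0.15, and p = 1/2). The approximation is only rough when f times the outcomes is not small.

    def kelly_approx(outcomes):
        # outcomes: list of (probability, gain per dollar bet) pairs,
        # with losses negative. Approximation: f = Sum(pi xi) / Sum(pi xi^2).
        mean = sum(p * x for p, x in outcomes)
        second_moment = sum(p * x * x for p, x in outcomes)
        return mean / second_moment

    outcomes = [(0.5, 0.20), (0.5, -0.15)]
    print(kelly_approx(outcomes))                      # about 0.80
    print((0.5 * 0.20 - 0.5 * 0.15) / (0.20 * 0.15))   # exact value: about 0.83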

Varying the Kelly Criterion

The value of f corresponding to the Kelly Criterion often leads to a large amount of volatility in the bankroll. For example, the probability that the bankroll will at some point drop to 1/n of its initial value, over an infinite sequence of bets, is 1/n; thus there is a 50% chance that the bankroll will at some point drop to half of its initial value. As another example, there is a 1/3 chance that the bankroll will halve before it doubles.

Therefore, many people propose using a value of f equal to one half of the Kelly Criterion (sometimes called a half Kelly), obtaining somewhat slower growth but with less volatility. Half Kelly betting yields about 75% of the growth rate of full Kelly betting. Another reason for using a half Kelly is that people often overestimate their edge.

You should also note that if you bet more than the Kelly Criterion, the growth rate is lower and the volatility is even greater. If you bet twice the Kelly Criterion, the growth rate is (approximately) zero, and if you bet more than twice the Kelly Criterion, the growth rate is negative.
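
The Python sketch below (my own, using an invented game with p = 1/2 and W = 1.2) computes the growth rate R at various multiples of the Kelly fraction. Half Kelly gives about 75% of the full-Kelly growth rate, and twice Kelly gives a growth rate of zero (exactly zero in this even-probability example, approximately zero in general).

    import math

    def growth_rate(f, p, W):
        # R = p * log(1 + f W) + q * log(1 - f)
        return p * math.log(1 + f * W) + (1 - p) * math.log(1 - f)

    p, W = 0.5, 1.2
    k = (p * W - (1 - p)) / W          # the full Kelly fraction
    for m in (0.5, 1.0, 1.5, 2.0):     # multiple of Kelly actually bet
        print(m, growth_rate(m * k, p, W))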

Some Final Comments

The Kelly Criterion is actually quite controversial, as you can see just by Googling the words “Kelly Criterion.” You should realize, however, that the mathematics in this paper is not controversial. What is controversial is whether you should actually use the Kelly Criterion when you gamble (or invest): in practice you will make only a relatively short sequence of bets, compared with the infinite sequence used in the math, and over such a short sequence there is wide variability in the final bankroll, and therefore significant risk. We do not make any comments on that controversy.

References

http://www.geocities.com/gbosmis/MM1.htm

http://en.wikipedia.org/wiki/Kelly_criterion

http://www-stat.wharton.upenn.edu/~steele/Courses/434F2005/Context/Kelly%20Resources/KellyIndex.htm

Poundstone, William, “Fortune’s Formula: The Untold Story of the Scientific Betting System that Beat the Casinos and Wall Street,” Hill and Wang, New York, NY, 2005.

Kelly, John L., Jr., “A New Interpretation of Information Rate,” Bell System Technical Journal, Vol. 35, pp. 917–926, 1956.



