How To Pick a Bet Size

There are a number of practical concerns when it comes to gambling. The first is finding a place to gamble. This is becoming less of a problem every year as more jurisdictions allow casinos, but it is still a real obstacle if you're interested in sports betting. Once you've figured that out, you need to decide what you're betting on. Sports betting is one of the few forms of casino gambling that can be beaten. Slot machines and roulette, for example, cannot be beaten unless you find a faulty machine or a casino running a poorly-planned promotion (perhaps Wesley Snipes Day, when they pay 3 to 1 if you bet on black), which will happen very rarely. Sports betting can be beaten if you know more than the general public whose money shapes the lines, and can thus beat the odds that the sports book places on the outcome you choose. I try to beat the odds using my model (there's a link at the bottom of the banner). And once you've picked your bet, you have to decide how much to wager.

I use the Kelly criterion when choosing a bet size. This isn't the only way to go, but (under its assumptions) the Kelly criterion maximizes long-term bankroll growth. For this method, you need to know three things: your bankroll, the odds on your bet, and the probability of winning that bet. You should always know your bankroll; it's the amount of money you're willing to wager, and it should always be money you can afford to lose (that's your public service announcement for the day). Knowing the probability of winning your bet is sometimes difficult. If you have no idea what you're doing, the probability is simply whatever chance gives you: 50% if you're picking between two options, 33% among three, and so on. With my model, I can use its historical accuracy against the spread. But I imagine that many bettors don't know their probability of winning, say, a horse race bet; in that case you have to estimate it, hopefully realistically.

Finally, the odds on your bet come from the sports book. For example, right now Bodog offers the Packers -14 at even odds against the Bills, or you can take the Bills +14 at -120. 'Even' and '-120' aren't odds, so we need to transform them. Even means what it sounds like: you stand to win the amount you bet, so a $100 bet would win $100. A negative value tells you how much you'd have to bet to win $100, so -120 means you have to bet $120 to win $100. A positive value tells you how much you would win if you bet $100, so +130 means you would win $130 on a $100 bet. These numbers can be transformed into odds by dividing the amount to be won by the amount bet: even is $100/$100 = 1, -120 becomes $100/$120 = .8333, and +130 becomes $130/$100 = 1.3 (technically each should be expressed as '__ to 1', as in even odds is 1 to 1).
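
If you'd rather not do that conversion by hand, here is a minimal Python sketch of it (the function name and the example calls are mine, just for illustration):

    def american_to_b(line):
        """Convert an American line ('even', -120, +130) to net odds b,
        the amount won per $1 wagered."""
        if isinstance(line, str) and line.lower() == "even":
            return 1.0              # win $100 on a $100 bet
        if line < 0:
            return 100.0 / -line    # -120 -> 100/120 = 0.8333
        return line / 100.0         # +130 -> 130/100 = 1.3

    print(american_to_b("even"))    # 1.0
    print(american_to_b(-120))      # 0.8333...
    print(american_to_b(130))       # 1.3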

Once you have all three numbers, you can calculate your bet size. The Kelly criterion says that you should bet a fraction of your bankroll on each of a number of successive, independent bets. That fraction is f = (b*p - q)/b, where b is the odds, p is the probability of winning, and q is the probability of losing (which is the same as 1 - p). For example, let's say you're flipping a coin and betting on it coming up heads. This should happen 50% of the time, so p = .5 and q = .5. If you were getting even odds, b = 1 and the fraction f = (1*.5 - .5)/1 = 0. Kelly would say you should not bet, basically because the bet gives you no edge. But let's say you found someone who offered you 1.2 to 1 on heads. Then f = (1.2*.5 - .5)/1.2 = .08333. You should bet 8.3% of your bankroll on each coin flip, so if you had $1000 for this game, you would bet $83.33 every time. Alternatively, you can solve the equation for p to find out how accurate you must be, at given odds, before betting makes sense. If the odds are -110, which converts to b = .909, the Kelly criterion says you need to be correct more than 1/(1+b) = 52.4% of the time.
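
The same calculation in Python, as a small sketch (the function name is mine, just for illustration):

    def kelly_fraction(b, p):
        """Kelly fraction f = (b*p - q)/b with q = 1 - p.
        Anything <= 0 means the bet isn't worth making."""
        q = 1.0 - p
        return (b * p - q) / b

    print(kelly_fraction(1.0, 0.5))    # 0.0    -> fair coin at even odds: don't bet
    print(kelly_fraction(1.2, 0.5))    # ~0.0833 -> bet about 8.3% of your bankroll
    print(1 / (1 + 100 / 110))         # ~0.5238 -> break-even accuracy at -110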

So how would I choose my football bet size? First, I have my bankroll. Then I figure out how many games I'm betting on in a week and divide my bankroll by that number; if I had $1000 and was betting on 10 games, I would have $100 per game. Then I would look at the odds for each game and use the Kelly criterion to determine the percentage of that $100 to bet on the game. This ensures that I never bet more money than I have, because I scale my total bankroll down to a game-by-game bankroll, and the Kelly fraction is never greater than 1. However, I then tend to overestimate the probability that I'm correct, because otherwise the bet amounts would be too small to be worthwhile unless you had a huge bankroll. For example, with $100 to bet on each of 10 games, a win probability of 55%, and odds of -110 on each, you would bet only $5.50 per game. Betting more than this increases your risk of going broke at some point; on the other hand, you would have to be supremely unlucky to go broke risking $55 a week when you have $1000 to use. So I would view the Kelly criterion as a benchmark or guide as to whether you should be betting at all (a fraction f of 0 or less means don't bet), but it can be adjusted based on how much risk you're willing to take on.
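
Putting the week together, here's a short Python sketch of that per-game sizing (the numbers simply mirror the example above):

    # Split the weekly bankroll evenly across games, then apply Kelly to each share.
    bankroll = 1000.0
    games = 10
    per_game = bankroll / games       # $100 per game

    b = 100.0 / 110.0                 # -110 converts to b = 0.909
    p = 0.55                          # assumed probability of covering
    f = (b * p - (1 - p)) / b         # Kelly fraction = 0.055
    bet = f * per_game

    print(round(bet, 2))              # 5.5 -> $5.50 per game, $55 for the week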


5 Responses to How To Pick a Bet Size

  1. william tsai says:

    I’m not sure the Kelly criterion works this way. Your bankroll is $1000, and you would be able to apply the calculation if the bets were successive rather than simultaneous. You would have to adjust the next bet based on your bankroll after the last bet. I’m guessing you’re underestimating the optimal bet size by a factor of 10. However, betting the full Kelly percentage makes the bankroll very volatile and vulnerable to any mis-estimate of win probability, so I’d be careful about using the full Kelly.

    • Alex says:

      Thanks William. I figured the simultaneous bets changed things, but I don’t know how much. Do you have a reference or anything I could look at for that?

  2. Ransom says:

    I believe that for simultaneous bets, the following applies:

    Take bankroll B. Proportion your total bet between the bets in any manner you choose. For two bets, set proportions x1, x2 such that x1 + x2 = 1; in your case this would be x1 = f1/(f1+f2) and x2 = f2/(f1+f2), betting an amount on each game proportional to its Kelly factor. Then calculate the combined Kelly factor as f = (x1*(p1*b1-q1) + x2*(p2*b2-q2)) / (x1*b1 + x2*b2). That is, you weight each game’s expected return by its proportion of the betting amount, and you weight the payoff factor in the denominator by your chosen proportions and the individual games’ payoffs. Then you assign a bet to game n via this formula: bet on game n = f * B * x_n.

    This produces slightly different values than betting in the manner you described. E.g., with a game that has a .76 chance to win at odds of 1.1 and a second game with a .92 chance to win at odds of .36, you would have individual Kelly factors of .54 and .69, and bets of $54 and $69 (assigning $100 to each game like you said), whereas the combination described above yields a Kelly factor of .58, for a pool of $116, giving the first game $51 and the second $65 (there’s a worked sketch of this calculation after the comments).

    Now, I believe the optimal strategy (assuming correct estimates of probabilities, or at least that you’re equally likely to be wrong in either direction about a given probability) is to bet only on the game with the best individual Kelly factor, as it will bring the highest possible return. In the example above that would be betting $118 on the second game. I would love to see the results if you use the “bet on the best Kelly game only” approach, or the proportioning strategy I described above, which I believe is slightly better than your current one.

  3. Ransom says:

    Oops, meant $138 on the second game as the sole bet.

  4. Ransom says:

    So your intuition is correct in that you are proportioning more money towards games that have higher Kelly values; I just think your strategy bets a bit too much compared to the optimal Kelly strategy with that proportioning scheme. And beyond that, any movement towards higher Kelly values would produce better results, I would imagine. I would like to see something like betting 10 units (x_1 = 10/55) on the highest-Kelly bet (moneyline, spread, OU, whichever it works out to be), 9 units (x_2 = 9/55) on the next best, and so on down to 1 unit (x_10 = 1/55) on the 10th-best Kelly factor bet. I think that would combine some aspects of a more optimal betting strategy, and you also get to have some fun by betting on more than just the very best Kelly factor game.
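
For anyone who wants to reproduce Ransom’s two-game example, here is a minimal Python sketch of the combined weighting he describes (variable names are mine, and the exact figures differ from his by a dollar or two of rounding):

    def kelly(b, p):
        """Individual Kelly factor f = (b*p - q)/b with q = 1 - p."""
        return (b * p - (1 - p)) / b

    B = 200.0                               # total bankroll for the two games
    p1, b1 = 0.76, 1.1                      # game 1: win probability and odds
    p2, b2 = 0.92, 0.36                     # game 2: win probability and odds

    f1, f2 = kelly(b1, p1), kelly(b2, p2)   # ~0.54 and ~0.70
    x1, x2 = f1 / (f1 + f2), f2 / (f1 + f2) # proportions of the total bet

    # Combined Kelly factor: each game's edge and payoff weighted by x_n.
    f = (x1 * (p1 * b1 - (1 - p1)) + x2 * (p2 * b2 - (1 - p2))) / (x1 * b1 + x2 * b2)
    pool = f * B                            # ~$118 staked in total

    print(round(f, 3))                      # ~0.588
    print(round(pool * x1))                 # ~$51 on game 1
    print(round(pool * x2))                 # ~$66 on game 2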
