Gambling mathematics

The mathematics of gambling is a collection of probability applications encountered in games of chance and can be included in game theory. From a mathematical point of view, games of chance are experiments generating various types of aleatory events, whose probabilities can be calculated by using the properties of probability on a finite space of events.

Experiments, events, and probability spaces
The technical processes of a game stand for experiments that generate aleatory events; several examples are worked through below.

Events can be defined in many ways, but when formulating a probability problem they must be defined with great care. From a mathematical point of view, events are nothing more than subsets of the sample space, and the space of events is a Boolean algebra. We find elementary and compound events, exclusive and nonexclusive events, and independent and non-independent events.

In the experiment of rolling a die:
 * Event {3, 5} (whose literal definition is the occurrence of 3 or 5) is compound because {3, 5} = {3} ∪ {5};
 * Events {1}, {2}, {3}, {4}, {5}, {6} are elementary;
 * Events {3, 5} and {4} are incompatible or exclusive because their intersection is empty; that is, they cannot occur simultaneously;
 * Events {1, 2, 5} and {2, 5} are nonexclusive, because their intersection is not empty;
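These set relations can be checked directly; the following minimal sketch models the die-roll events as Python sets:

```python
# Die-roll events modeled as Python sets (a minimal illustrative sketch).
sample_space = {1, 2, 3, 4, 5, 6}

compound = {3, 5}
assert compound == {3} | {5}           # compound: union of elementary events

assert {3, 5} & {4} == set()           # exclusive: empty intersection
assert {1, 2, 5} & {2, 5} == {2, 5}    # nonexclusive: shared outcomes

print("all event relations hold")
```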

In the experiment of rolling two dice one after another, the events "obtaining 3 on the first die" and "obtaining 5 on the second die" are independent, because the occurrence of one does not influence the occurrence of the other.

In the experiment of dealing the pocket cards in Texas Hold'em Poker, the following events illustrate compoundness, exclusiveness, and independence, properties that are fundamental in practical probability calculus:
 * The event of dealing (3♣, 3♦) to a player is an elementary event;
 * The event of dealing two 3's to a player is compound because it is the union of events (3♣, 3♠), (3♣, 3♥), (3♣, 3♦), (3♠, 3♥), (3♠, 3♦) and (3♥, 3♦);
 * The events "player 1 is dealt a pair of kings" and "player 2 is dealt a pair of kings" are nonexclusive (they can both occur);
 * The events "player 1 is dealt two connectors of hearts higher than J" and "player 2 is dealt two connectors of hearts higher than J" are exclusive (only one can occur);
 * The events "player 1 is dealt (7, K)" and "player 2 is dealt (4, Q)" are non-independent (the occurrence of the second depends on the occurrence of the first, since the same deck is in use).

Combinations
Games of chance are also good examples of combinations, permutations, and arrangements, which are met at every step: combinations of cards in a player's hand, on the table, or expected in any card game; combinations of numbers when rolling several dice at once; combinations of numbers in lottery and Bingo; combinations of symbols in slots; permutations and arrangements of competitors in a race to be bet on, and the like. Combinatorial calculus is an integral part of gambling probability applications. In games of chance, most of the gambling probability calculus that uses the classical definition of probability reverts to counting combinations. Gaming events can be identified with sets, which often are sets of combinations; thus, we can identify an event with a combination.

For example, in a five-draw poker game, the event "at least one player holds a four-of-a-kind formation" can be identified with the set of all combinations of (xxxxy) type, where x and y are distinct values of cards. This set has 13 · C(4,4) · (52 − 4) = 624 combinations. Possible combinations are (3♠ 3♣ 3♥ 3♦ J♣) or (7♠ 7♣ 7♥ 7♦ 2♣). These can be identified with elementary events that the event to be measured consists of.
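The count above can be reproduced with a short calculation, sketched here using Python's standard `math.comb`:

```python
from math import comb

# Count the (xxxxy) combinations: 13 choices for the quad rank x,
# C(4,4) = 1 way to take all four suits of it, and 52 - 4 = 48 kickers y.
four_of_a_kind_hands = 13 * comb(4, 4) * (52 - 4)
print(four_of_a_kind_hands)  # 624

# As a share of all C(52, 5) = 2,598,960 five-card hands:
print(four_of_a_kind_hands / comb(52, 5))  # ≈ 0.00024
```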

Law of large numbers
When random events occur a large number of times, chance fluctuations cancel each other out, so that the arithmetic mean of the results is, in a probabilistic sense, very close to the expected value. For example, each coin toss lands heads or tails at random, but over enough tosses the proportions of heads and tails each approach one-half. This is known as the law of large numbers.
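A small simulation illustrates this convergence (a sketch; the seed and toss counts are arbitrary choices):

```python
import random

# Sketch: the heads fraction drifts toward 1/2 as the number of tosses grows.
random.seed(42)

for n in (100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(n, heads / n)
```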

For an individual over a short period, winning and losing at gambling also behave as random events, but in the long run, as long as the gambler's rate of return is negative, losing will happen sooner or later as play continues. For the casino, as well as for advantage players, as long as the rate of return of the gambling play is positive, a win is assured in the long run.

Principle of positive rate of return
The key to winning or losing is the rate of return, as determined by the gambling rules and the strategy employed; the rate of return reflects the true nature of gambling. Gambling rules are usually designed to give the casino a win rate slightly above 50%, which is reflected in a casino rate of return slightly greater than zero. Gambling is thus not luck but a contest of intellect, strategy, and yield. The ultimate outcome of long-term gambling depends on the gambler's rate of return: if it is positive, the expected return is greater than zero and one can win; if it is negative, the expected return is less than zero and one cannot win. With a negative rate of return, the law of large numbers makes the maxim "long gambling will lose" increasingly apparent. Professional gamblers adhere to the principle of a positive rate of return: they do not play for long in games they are bound to lose, and bet only when the game is a sure win. In this sense, they are non-gamblers.

Law of small numbers bias
The law of large numbers means that as a sample approaches the whole population, its observed frequencies approach the population probabilities. The "law of small numbers bias" refers to treating the probability distribution of an event in a small sample as the overall distribution, thus exaggerating how representative the small sample is of the population. A related error is the so-called "gambler's fallacy". For example, when flipping a coin, if it comes up heads 10 times in a row, one may think that tails is now very likely; in fact, the probability of heads or tails is 0.5 on each toss, regardless of how many heads have already occurred.
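The independence of successive tosses can be checked by simulation. In this sketch (parameters are assumed choices), every toss that follows a run of at least 10 consecutive heads is tallied separately; its heads rate still comes out near one-half:

```python
import random

# Sketch: record each toss that follows a run of >= 10 consecutive heads;
# if tosses are independent, its heads rate should still be ~0.5.
random.seed(1)

follow_ups = []
streak = 0  # current run of consecutive heads before this toss
for _ in range(1_000_000):
    toss = random.random() < 0.5   # True = heads
    if streak >= 10:
        follow_ups.append(toss)
    streak = streak + 1 if toss else 0

print(len(follow_ups), sum(follow_ups) / len(follow_ups))  # rate ≈ 0.5
```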

Ignoring the effect of sample size, believing that small and large samples have the same expected value, and replacing the correct probabilistic law of large numbers with the false psychological law of small numbers is a major driver of the gambling mentality. Casinos rely on the law of large numbers, while gamblers unconsciously apply the law of small numbers. The law of large numbers lets casinos make money; the law of small numbers makes gamblers give money to casinos. This is the logic of a casino's existence.

Casino advantage
The casino advantage is the advantage that the casino has over the gamblers for each type of gambling game in the casino.

Take the coin toss for example: the chances of heads and tails are equal, 50% each. If a player bets $10 on heads and wins, the casino pays them $10; if they lose, the casino keeps the $10. In this case the casino advantage is zero (and the casino is certainly not foolish enough to offer this game). But if, on a win, the casino pays only $9 while still keeping the full $10 on a loss, the average payout per bet is (+$9 − $10)/2 = −$0.50. The casino advantage in this case is $0.50 / $10 = 5%.
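The arithmetic of this example can be written out directly (a minimal sketch):

```python
# House advantage for the skewed coin-toss payout: a win pays $9,
# a loss costs the full $10 stake, each with probability 0.5.
p_win = 0.5
stake = 10
payout = 9

expected = p_win * payout + (1 - p_win) * (-stake)  # -$0.50 per bet
house_advantage = -expected / stake                 # 0.05 = 5%
print(expected, house_advantage)
```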

In any kind of game in a casino, the casino has a certain advantage over the gamblers, and only in this way can the casino ensure that it will continue to open in the long run. The casino advantage varies greatly from game to game, with some games having a low casino advantage and others having a high casino advantage. People who gamble a lot try not to play games with a high casino advantage.

Expectation and strategy
Games of chance are not merely pure applications of probability calculus, and gaming situations are not just isolated events whose numerical probability is well established through mathematical methods; they are also games whose progress is influenced by human action. In gambling, the human element has a striking character: the player is interested not only in the mathematical probability of the various gaming events but also has expectations from the games, and a major interaction exists between the two. To obtain favorable results from this interaction, gamblers take into account all available information, including statistics, to build gaming strategies.

Even though the randomness inherent in games of chance would seem to ensure their fairness (at least with respect to the players around a table—shuffling a deck or spinning a wheel does not favor any player unless it is fraudulent), gamblers search and wait for irregularities in this randomness that will allow them to win. It has been mathematically proved that, in ideal conditions of randomness, and with negative expectation, no long-run regular winning is possible for players of games of chance. Most gamblers accept this premise, but still work on strategies to make them win either in the short term or over the long run.

House advantage or edge
Casino games provide a predictable long-term advantage to the casino, or "house" while offering the player the possibility of a large short-term payout. Some casino games have a skill element, where the player makes decisions; such games are called "random with a tactical element." While it is possible through skillful play to minimize the house advantage, a player rarely has sufficient skill to eliminate his inherent long-term disadvantage (the house edge or house vigorish) in a casino game. The common belief is that such a skill set would involve years of training, extraordinary memory, and numeracy, and/or acute visual or even aural observation, as in the case of wheel clocking in Roulette. For more examples see advantage gambling.

The player's disadvantage is a result of the casino not paying winning wagers according to the game's "true odds", which are the payouts that would be expected considering the odds of a wager either winning or losing. For example, if a game is played by wagering on the number that would result from the roll of one die, true odds would be 5 times the amount wagered since there is a 1/6 probability of any single number appearing. However, the casino may only pay 4 times the amount wagered for a winning wager.
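Using exact fractions, the gap between true odds and the casino payout in the die example looks like this (an illustrative sketch):

```python
from fractions import Fraction

# Single-die number bet: win probability 1/6. At true odds (5x the stake)
# the bet breaks even; a 4x payout creates the player's disadvantage.
p = Fraction(1, 6)

ev_true = p * 5 + (1 - p) * -1    # 0: fair bet at true odds
ev_casino = p * 4 + (1 - p) * -1  # -1/6 of the stake ≈ -16.7%
print(ev_true, ev_casino)
```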

The house edge (HE) or vigorish is defined as the casino profit expressed as a percentage of the player's original bet. In games such as Blackjack or Spanish 21, the final bet may be several times the original bet, if the player doubles or splits.

Example: In American Roulette, there are two zeroes and 36 non-zero numbers (18 red and 18 black). If a player bets $1 on red, his chance of winning $1 is therefore 18/38 and his chance of losing $1 (or winning -$1) is 20/38.

The player's expected value is EV = (18/38 × 1) + (20/38 × −1) = 18/38 − 20/38 = −2/38 ≈ −5.26%. Therefore, the house edge is 5.26%. After 10 rounds at $1 per round, the average house profit will be 10 × $1 × 5.26% = $0.53. Of course, the casino can't win exactly 53 cents; this figure is the average casino profit per player if it had millions of players, each betting 10 rounds at $1 per round.
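The same expectation can be computed exactly with fractions (a sketch):

```python
from fractions import Fraction

# American Roulette even-money bet: 18 winning and 20 losing numbers of 38.
ev = Fraction(18, 38) * 1 + Fraction(20, 38) * -1  # -1/19 ≈ -5.26%
house_edge = -ev

avg_profit_10_rounds = 10 * 1 * house_edge  # ≈ $0.53 at $1 per round
print(float(ev), float(avg_profit_10_rounds))
```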

The house edge of casino games varies greatly with the game. Keno can have house edges up to 25% and slot machines can have up to 15%, while most Australian Pontoon games have house edges between 0.3% and 0.4%.

The calculation of the Roulette house edge was a trivial exercise; for other games, this is not usually the case. Combinatorial analysis and/or computer simulation are necessary to complete the task.

In games that have a skill element, such as Blackjack or Spanish 21, the house edge is defined as the house advantage from optimal play (without the use of advanced techniques such as card counting or shuffle tracking), on the first hand of the shoe (the container that holds the cards). The set of the optimal plays for all possible hands is known as "basic strategy" and is highly dependent on the specific rules, and even the number of decks used. Good Blackjack and Spanish 21 games have house edges below 0.5%.

Online slot games often have a published return to player (RTP) percentage that determines the theoretical house edge. Some software developers choose to publish the RTP of their slot games while others do not. Despite the published theoretical RTP, almost any outcome is possible in the short term.

Standard deviation
The luck factor in a casino game is quantified using standard deviation (SD). The standard deviation of a simple game like Roulette can be calculated easily because of the binomial distribution of successes (assuming a result of 1 unit for a win, and 0 units for a loss). For the binomial distribution, SD is equal to $$\sqrt{npq}$$, where $$n$$ is the number of rounds played, $$p$$ is the probability of winning, and $$q$$ is the probability of losing. For an even-money bet the result per round is $$\pm1$$ unit rather than 1 or 0, which doubles the spread; and if we flat bet at 10 units per round instead of 1 unit, the range of possible outcomes increases 10-fold. Therefore, the SD for the even-money Roulette bet is equal to $$2b\sqrt{npq}$$, where $$b$$ is the flat bet per round, $$n$$ is the number of rounds, $$p=18/38$$, and $$q=20/38$$.

After a sufficiently large number of rounds, the theoretical distribution of the total win converges to the normal distribution, making it possible to forecast the likely win or loss. For example, after 100 rounds at $1 per round, the standard deviation of the win (equally, of the loss) will be $$2\cdot\$1\cdot\sqrt{100\cdot18/38\cdot20/38}\approx\$9.99$$. After 100 rounds, the expected loss will be $$100\cdot\$1\cdot2/38\approx\$5.26$$.
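These two figures follow directly from the formulas above (a sketch):

```python
from math import sqrt

# SD and expected loss after n rounds of the $b even-money Roulette bet.
b, n = 1, 100
p, q = 18 / 38, 20 / 38

sd = 2 * b * sqrt(n * p * q)     # ≈ $9.99
expected_loss = n * b * (q - p)  # ≈ $5.26
print(round(sd, 2), round(expected_loss, 2))
```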

The 3-sigma range spans six standard deviations: three above the mean, and three below. Therefore, after 100 rounds betting $1 per round, the result will very probably lie between $$-\$5.26-3\cdot\$9.99$$ and $$-\$5.26+3\cdot\$9.99$$, i.e., between about −$35 and $25. There is still a roughly 1-in-370 chance that the result will not be in this range, i.e. either the win will exceed $25, or the loss will exceed $35.

The standard deviation for the even-money Roulette bet is one of the lowest of all casino games. Most games, particularly slots, have extremely high standard deviations. As the size of the potential payouts increases, so does the standard deviation.

Unfortunately, the above considerations are inaccurate for small numbers of rounds, because the distribution is then far from normal. Moreover, the results of more volatile games usually converge to the normal distribution much more slowly, so a far larger number of rounds is required.

As the number of rounds increases, eventually, the expected loss will exceed the standard deviation, many times over. From the formula, we can see the standard deviation is proportional to the square root of the number of rounds played, while the expected loss is proportional to the number of rounds played. As the number of rounds increases, the expected loss increases at a much faster rate. This is why it is practically impossible for a gambler to win in the long term (if they don't have an edge). It is the high ratio of short-term standard deviation to expected loss that fools gamblers into thinking that they can win.
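The diverging growth rates can be tabulated (a sketch; the round counts are arbitrary):

```python
from math import sqrt

# Expected loss grows like n, the SD only like sqrt(n), so their ratio
# grows like sqrt(n): multiplying the rounds by 100 multiplies it by 10.
p, q = 18 / 38, 20 / 38
ratios = []
for n in (100, 10_000, 1_000_000):
    sd = 2 * sqrt(n * p * q)
    expected_loss = n * (q - p)
    ratios.append(expected_loss / sd)

print([round(r, 2) for r in ratios])  # steadily increasing
```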

The volatility index (VI) is defined as the standard deviation for one round, betting one unit. Therefore, the VI for the even-money American Roulette bet is $$2\sqrt{18/38\cdot20/38}\approx0.9986$$.

The variance $$v$$ is defined as the square of the VI. Therefore, the variance of the even-money American Roulette bet is ca. 0.9972, which is low for a casino game. The variance for Blackjack is ca. 1.2, which is still low compared to the variances of electronic gaming machines (EGMs).
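Both figures follow from the definitions above (a sketch):

```python
from math import sqrt

# Volatility index and variance for the even-money American Roulette bet:
# one round, one unit, p = 18/38, q = 20/38.
p, q = 18 / 38, 20 / 38

vi = 2 * sqrt(p * q)  # ≈ 0.9986
variance = vi ** 2    # ≈ 0.9972
print(round(vi, 4), round(variance, 4))
```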

Additionally, volatility indices based on specific confidence intervals are used; usually they are based on the 90% confidence interval. The volatility index for the 90% confidence interval is ca. 1.645 times the "usual" volatility index, which relates to the ca. 68.27% confidence interval.

It is important for a casino to know both the house edge and the volatility index for all of its games. The house edge tells them what kind of profit they will make as a percentage of turnover, and the volatility index tells them how much they need in the way of cash reserves. The mathematicians and computer programmers who do this kind of work are called gaming mathematicians and gaming analysts. Casinos do not have in-house expertise in this field, so they outsource their requirements to experts in the gaming analysis field.

Bingo probability
The probability of winning a game of Bingo (ignoring simultaneous winners, making wins mutually exclusive) may be calculated as:

$$P(Win)=1-P(Loss)$$ since winning and losing are mutually exclusive. The probability of losing is the same as the probability of another player winning (for now assuming each player has only one Bingo card). With $$n$$ players taking part and our player designated $$P_1$$: $$P(Loss)=P(P_2\mbox{ or }P_3 \mbox{ or } ... P_{n-1}\mbox{ or }P_n )$$. This is also stated (for mutually exclusive events) as $$P(Loss)=P(P_2)+P(P_3)+...+P(P_n)$$.

If the probability of winning for each player is equal (as would be expected in a fair game of chance), then $$P(P_1)=P(P_2)=...=P(P_n)$$ and thus $$P(Loss)=(n-1) P(P_1)$$ and therefore $$P(Win)=P(P_1)=1-(n-1) P(P_1)$$. Simplifying yields
 * $$P(P_1)=1/n$$

For the case where more than one card is bought, each card can be seen as being equivalent to the above players, having an equal chance of winning. $$P(C_1)=1/n_C$$ where $$n_C$$ is the number of cards in the game and $$C_1$$ is the card we are interested in.

A player ($$P_1$$) holding $$m$$ cards will therefore win if any of these cards wins (still ignoring simultaneous wins):
 * $$P(P_1)=P(C_1)+P(C_2)+...+P(C_m)=m/n_C$$

A simple way for a player to increase his odds of winning is therefore to buy more cards in a game (increase $$m$$).

Simultaneous wins may occur in certain game types (such as online Bingo, where the winner is determined automatically rather than by shouting "Bingo", for example), with the winnings being split between all simultaneous winners. The probability of our card, $$C_1$$, winning when there are one or more simultaneous winners is expressed by:

$$P(C_1)=P(w) \frac{w}{n_C}$$ where $$P(w)$$ is the probability of there being $$w$$ simultaneous winners (a function of the game type and number of players) and $$\frac{w}{n_C}$$ is the (fair) probability that $$C_1$$ is one of the winning cards. The overall expected value for the payout (1 representing the full winning pot) is therefore:


 * $$E=\frac{1}{1}\frac{1}{n_C}P(1)+\frac{1}{2}\frac{2}{n_C}P(2)+...+\frac{1}{n_C}\frac{n_C}{n_C}P(n_C)$$
 * $$E=\frac{1}{n_C}(P(1)+P(2)+...+P(n_C))$$

Since a normal bingo game is played until there is a winner, exactly one of the outcomes counted by $$P(1)$$, $$P(2)$$, ..., $$P(n_C)$$ must occur, and since these are mutually exclusive, it can be stated that


 * $$P(1)+P(2)+...+P(n_C)=1$$

and therefore that
 * $$E=\frac{1}{n_C}$$

The expected outcome of the game is therefore not changed by simultaneous winners, as long as the pot is split evenly between all simultaneous winners. This has been confirmed numerically.
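As a rough numerical check, consider a toy model (an assumption for illustration, not the source's own method): each card gets a random "completion time" from a small range so ties are common, and tied winners split the pot evenly. Card 0's mean payout still comes out near $$1/n_C$$:

```python
import random

# Toy model (assumed): each of n_cards cards finishes at a random draw
# number in 1..10, so simultaneous winners are common; tied winners split
# the pot evenly. Card 0's mean payout should still be 1/n_cards.
random.seed(7)

n_cards, trials = 5, 200_000
total_payout = 0.0
for _ in range(trials):
    times = [random.randint(1, 10) for _ in range(n_cards)]
    best = min(times)
    winners = [i for i, t in enumerate(times) if t == best]
    if 0 in winners:
        total_payout += 1 / len(winners)

print(round(total_payout / trials, 3))  # ≈ 0.2 = 1/5
```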

To investigate whether it is better to play multiple cards in a single game or to play multiple games, the probability of winning is calculated for each scenario, where $$m$$ cards are bought.
 * $$P(win)_{multiplecards}=\frac{m}{m+n-1}$$

where $$n$$ is the number of players (assuming each opposing player plays only one card). The probability of losing any single game, where only a single card is played, is expressed as:
 * $$P(loss)_{game}=1-P(win)=1-\frac{1}{n}$$

The probability of losing $$m$$ games is expressed as:
 * $$P(loss)_{multiplegames}=(1-\frac{1}{n})^m$$

The probability of winning at least one game out of $$m$$ games is the same as the probability of not losing all $$m$$ games:
 * $$P(win)_{multiplegames}=1-(1-\frac{1}{n})^m$$

When $$m=1$$, these values are equal:
 * $$P(win)_{multiplecards}=P(win)_{multiplegames}$$

but it has been shown that $$P(win)_{multiplegames}>P(win)_{multiplecards}$$ for $$m>1$$. The advantage of $$P(win)_{multiplegames}$$ grows both as $$m$$ grows and $$n$$ decreases. It is therefore always better to play multiple games rather than multiple cards in a single game, although the advantage diminishes when there are more players in the game.
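The comparison can be verified numerically (a sketch; the chosen values of $$m$$ and $$n$$ are arbitrary):

```python
def p_multiple_cards(m, n):
    """Win probability with m cards in one game vs. n-1 single-card opponents."""
    return m / (m + n - 1)

def p_multiple_games(m, n):
    """Probability of winning at least one of m single-card games of n players."""
    return 1 - (1 - 1 / n) ** m

# Playing multiple games beats playing multiple cards whenever m > 1.
for n in (5, 20, 100):
    for m in (2, 5, 10):
        assert p_multiple_games(m, n) > p_multiple_cards(m, n)

print(round(p_multiple_cards(5, 20), 4), round(p_multiple_games(5, 20), 4))
# 0.2083 0.2262
```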