Expected Value
Probability × Payoff
Definition
Expected value (EV) is the sum of all possible values of a random variable, each multiplied by its probability of occurrence. It represents the average outcome if an action were repeated many times.
EV = (Probability of Outcome 1 × Value of Outcome 1) +
(Probability of Outcome 2 × Value of Outcome 2) +
...
“Take the probability of loss times the amount of possible loss from the probability of gain times the amount of possible gain.” — Warren Buffett
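The definition is a one-line sum over (probability, value) pairs; a minimal Python sketch (the function name is my own):

```python
def expected_value(outcomes):
    """Sum of probability × value over (probability, value) pairs."""
    return sum(p * v for p, v in outcomes)

# Two outcomes: 50% chance of +$10, 50% chance of -$5
print(expected_value([(0.5, 10), (0.5, -5)]))  # → 2.5
```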
Simple Examples
Coin Flip Game
- Heads: Win $10 (probability 0.5)
- Tails: Lose $5 (probability 0.5)
EV = (0.5 × $10) + (0.5 × -$5) = $5 - $2.50 = $2.50
Interpretation: Play this game 100 times and you should expect to gain roughly $250.
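The ~$250 figure is easy to sanity-check by simulation; a quick sketch (the seed and session count are arbitrary choices):

```python
import random

random.seed(0)

def play_once():
    """One flip: heads wins $10, tails loses $5, each with probability 0.5."""
    return 10 if random.random() < 0.5 else -5

# Average total over many 100-flip sessions converges to 100 × $2.50 = $250
sessions = 10_000
avg = sum(sum(play_once() for _ in range(100)) for _ in range(sessions)) / sessions
print(avg)  # close to 250
```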
Lottery Ticket
- Cost: $2
- Win $1 million: Probability 1 in 10 million
- Win nothing: Probability 9,999,999 in 10 million
EV = (0.0000001 × $999,998) + (0.9999999 × -$2)
EV = $0.10 - $2.00 = -$1.90
Interpretation: For every $2 ticket purchased, you lose $1.90 on average.
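The same arithmetic in code (the net prize is the $1M minus the $2 ticket, since you pay for the ticket either way):

```python
p_win = 1 / 10_000_000
ticket = 2.0
net_prize = 1_000_000 - ticket  # $999,998

ev = p_win * net_prize + (1 - p_win) * (-ticket)
print(round(ev, 2))  # → -1.9
```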
Business Decision
Launch a product:
- 30% chance of $1M profit
- 40% chance of $100k profit
- 30% chance of $200k loss
EV = (0.3 × $1M) + (0.4 × $100k) + (0.3 × -$200k)
EV = $300k + $40k - $60k = $280k
Interpretation: Average expected profit is $280k.
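Checking the arithmetic:

```python
# (probability, value) pairs for the product launch
launch = [(0.30, 1_000_000), (0.40, 100_000), (0.30, -200_000)]
ev = sum(p * v for p, v in launch)
print(f"${ev:,.0f}")  # → $280,000
```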
When to Use Expected Value
Good Uses
- Repeated decisions (investing, business)
- Portfolio decisions (many bets)
- Risk quantification
- Comparing options mathematically
Poor Uses
- One-time irreversible decisions
- Outcomes where magnitude matters more than frequency
- Situations with catastrophic tail risks
- When utility isn’t linear with money
Beyond Money: Utility
Diminishing Marginal Utility
- $1,000 means more to someone with $0 than to a billionaire
- Losing $10,000 feels worse than winning $10,000 feels good (loss aversion)
Expected Utility
Replace dollar values with “utility” (subjective value):
- $1M might = 500 utils for a poor person
- $1M might = 10 utils for a billionaire
Same EV in dollars, different expected utility.
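One common choice (assumed here, not prescribed by the text) is log-wealth utility. With it, the identical dollar gamble yields very different expected-utility gains at different wealth levels:

```python
import math

def utility_gain(wealth, gamble):
    """Expected gain in log-wealth utility from a gamble of (p, dollar_change) pairs."""
    return sum(p * math.log(wealth + dv) for p, dv in gamble) - math.log(wealth)

gamble = [(0.5, 1_000_000), (0.5, 0)]  # coin flip for $1M — same dollar EV for everyone

print(utility_gain(10_000, gamble))         # large utility gain for a poor person
print(utility_gain(1_000_000_000, gamble))  # tiny utility gain for a billionaire
```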
The Kelly Criterion
How much to bet when you have an edge?
f* = (bp - q) / b
Where:
f* = fraction of bankroll to bet
b = odds received (e.g., bet $1 to win $b)
p = probability of winning
q = probability of losing (1 - p)
Key insight: Even with positive EV, bet too much and you go broke (gambler’s ruin).
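A sketch of both the formula and the overbetting warning, using expected log-growth of the bankroll (the quantity Kelly betting maximizes); the 55% edge is a made-up example:

```python
import math

def kelly_fraction(b, p):
    """f* = (bp - q) / b for a bet paying b-to-1 with win probability p."""
    q = 1 - p
    return (b * p - q) / b

def log_growth(f, b, p):
    """Expected log-growth of bankroll per bet when staking fraction f."""
    return p * math.log(1 + b * f) + (1 - p) * math.log(1 - f)

f_star = kelly_fraction(1, 0.55)        # even-money bet, 55% win rate
print(round(f_star, 2))                 # → 0.1 (bet 10% of bankroll)
print(log_growth(f_star, 1, 0.55) > 0)  # → True: bankroll grows
print(log_growth(0.50, 1, 0.55) > 0)    # → False: overbetting a winning game still shrinks it
```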
Common Mistakes
Ignoring Variance
Two investments with same EV:
- A: 50% chance of $100, 50% chance of $0 (EV = $50)
- B: 100% chance of $50 (EV = $50)
A has higher variance. Risk-averse people prefer B.
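Computing both moments makes the difference explicit:

```python
import math

def ev_and_std(outcomes):
    """EV and standard deviation over (probability, value) pairs."""
    ev = sum(p * v for p, v in outcomes)
    var = sum(p * (v - ev) ** 2 for p, v in outcomes)
    return ev, math.sqrt(var)

print(ev_and_std([(0.5, 100), (0.5, 0)]))  # → (50.0, 50.0)  same EV, high variance
print(ev_and_std([(1.0, 50)]))             # → (50.0, 0.0)   same EV, no variance
```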
Overconfidence in Probabilities
EV is only as good as probability estimates. We often:
- Overestimate our knowledge
- Ignore unknown unknowns
- Use point estimates when ranges are appropriate
Tail Risks
Low probability × catastrophic outcome = negative EV we ignore
Example: “It probably won’t happen” × “Company goes bankrupt” = Don’t do it
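Hypothetical numbers make the point concrete: a 1% chance of ruin can outweigh a near-certain gain.

```python
# Made-up bet: 99% chance of a $50k gain vs. 1% chance of a $10M bankruptcy loss
ev = 0.99 * 50_000 + 0.01 * (-10_000_000)
print(f"${ev:,.0f}")  # → $-50,500
```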
In Real Decisions
When EV Is Positive
You should generally take positive EV bets, but consider:
- Can you afford the loss?
- Is it repeatable?
- Are you calibrated on probabilities?
When EV Is Negative
Avoid negative EV in repeated games:
- Casinos (house edge)
- Lottery (state takes 40%+)
- Insurance (company must profit)
Exception: Insurance has negative EV in dollars but positive EV in utility (avoiding ruin).
Related Concepts
- Asymmetric Risk — When upside/downside aren’t balanced
- Margin of Safety — Engineering in error margin
- Optimism Bias — Inflating probabilities of success
- Loss Aversion — Asymmetric utility
- Information Theory — Quantifying uncertainty
References
- von Neumann, J. & Morgenstern, O. (1944). Theory of Games and Economic Behavior
- MacLean, L.C., Thorp, E.O., & Ziemba, W.T. (2011). The Kelly Capital Growth Investment Criterion
- Mauboussin, M. (2012). The Success Equation
Calculate carefully, bet wisely, repeat often. 🎲