Introduction: A Problem 300+ Years Old
Imagine you're checking coats at a fancy restaurant. You have 10 coats and 10 coat check numbers. At the end of the night, you randomly return the coats to customers. What's the probability that no one gets their own coat back?
This is Montmort's famous matching problem, first studied by Pierre Rémond de Montmort in 1708. It's one of the oldest problems in probability theory, and the answer is surprisingly elegant.
In this post, we'll explore the problem, solve it, and see why it matters today.
Historical Context: Pierre RΓ©mond de Montmort
Pierre RΓ©mond de Montmort (1678-1719) was a French mathematician who published Essay d'analyse sur les jeux de hazard (Essay on the Analysis of Games of Chance) in 1708. This work is considered one of the founding texts of probability theory.
Montmort was part of the early wave of mathematicians (alongside Pascal, Fermat, Huygens) who were formalizing probability. His essay contained many problems about card games and probability, including the matching problem, also known as the problème des rencontres (problem of coincidences).
The Problem Statement
The Matching Problem: A person shuffles n letters into n envelopes randomly. What is the probability that none of the letters are in the correct envelope?
This is equivalent to asking: What's the probability of a complete derangement?
A derangement is a permutation where no element appears in its original position. The number of derangements of n objects is denoted D(n) or !n (subfactorial).
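As a concrete check of the definition, here's a tiny helper (the function name is mine) that tests whether a permutation is a derangement:

```python
def is_derangement(perm):
    """True if no element of the permutation sits at its original index."""
    return all(value != index for index, value in enumerate(perm))

print(is_derangement((1, 2, 0)))  # (0,1,2) -> (1,2,0): no fixed point, True
print(is_derangement((0, 2, 1)))  # element 0 stays put: False
```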
Small Examples
Case: n = 2 (2 letters, 2 envelopes)
Possible arrangements:
- Letter 1 in Envelope 1, Letter 2 in Envelope 2 → NOT a derangement
- Letter 1 in Envelope 2, Letter 2 in Envelope 1 → IS a derangement ✓
P(derangement) = 1/2 = 50%
Case: n = 3 (3 letters, 3 envelopes)
Total arrangements: 3! = 6
Derangements:
- (1,2,3) → (2,3,1) ✓
- (1,2,3) → (3,1,2) ✓
P(derangement) = 2/6 = 1/3 ≈ 33.3%
Case: n = 4 (4 letters, 4 envelopes)
Total arrangements: 4! = 24
Derangements: 9
P(derangement) = 9/24 = 3/8 = 37.5%
The Formula: Subfactorial
The number of derangements of n objects is:

D(n) = n! × (1 − 1/1! + 1/2! − 1/3! + ... + (−1)^n/n!)

Or more compactly:

D(n) = n! × Σ_{k=0}^{n} (−1)^k / k!

This is called the subfactorial and is denoted !n.
Why This Formula?
This formula comes from the inclusion-exclusion principle:
- Start with n! total permutations
- Subtract permutations where at least one letter is correct
- Add back permutations where at least two letters are correct (to avoid over-subtracting)
- Continue alternating until all n letters are considered
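The steps above can be checked directly for small n by comparing the inclusion-exclusion sum against brute-force enumeration (both function names are mine):

```python
from itertools import permutations
from math import factorial

def count_by_brute_force(n):
    """Enumerate all n! permutations of 0..n-1, count those with no fixed point."""
    return sum(
        all(p[i] != i for i in range(n))
        for p in permutations(range(n))
    )

def count_by_inclusion_exclusion(n):
    """Sum_{k=0}^{n} (-1)^k * C(n,k) * (n-k)! = sum_{k=0}^{n} (-1)^k * n!/k!."""
    return sum((-1) ** k * (factorial(n) // factorial(k)) for k in range(n + 1))

for n in range(1, 8):
    assert count_by_brute_force(n) == count_by_inclusion_exclusion(n)
print("Inclusion-exclusion matches brute force for n = 1..7")
```

Note that n!/k! is always an integer, so the inclusion-exclusion version stays in exact integer arithmetic with no floating-point error.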
The Probability Formula
The probability that all n letters are in wrong envelopes:

P(n) = D(n)/n! = Σ_{k=0}^{n} (−1)^k / k!
Surprising Result: As n → ∞
Here's the amazing part. As n gets large:

P(n) = Σ_{k=0}^{n} (−1)^k / k! → e^(−1) = 1/e ≈ 0.367879...

Where e ≈ 2.71828 is Euler's number (the base of natural logarithms).
This means that no matter how many letters and envelopes you have, the probability that none match approaches roughly **36.8%**, and it converges very quickly!
Table: Probabilities by n
| n (letters) | D(n) (derangements) | n! (total) | P(derangement) | % Probability |
|---|---|---|---|---|
| 1 | 0 | 1 | 0 | 0% |
| 2 | 1 | 2 | 0.5 | 50.0% |
| 3 | 2 | 6 | 0.333 | 33.3% |
| 4 | 9 | 24 | 0.375 | 37.5% |
| 5 | 44 | 120 | 0.367 | 36.7% |
| 10 | 1,334,961 | 3,628,800 | 0.367879 | 36.7879% |
| 20 | ≈ 8.95 × 10^17 | ≈ 2.43 × 10^18 | 0.367879 | 36.7879% |
| ∞ | — | — | 1/e | 36.7879% |
Notice how quickly it converges to 1/e!
Why Does It Converge to 1/e?
The Taylor series expansion of e^(−1) is:

e^(−1) = 1 − 1/1! + 1/2! − 1/3! + ... = Σ_{k=0}^{∞} (−1)^k / k!

But our derangement probability is:

P(n) = Σ_{k=0}^{n} (−1)^k / k!

For large n, we're computing the first n+1 terms of the Taylor series for e^(−1). As n → ∞, we get the full series, which equals 1/e exactly!
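To see the convergence concretely, here's a small sketch printing the partial sums of the alternating series next to 1/e:

```python
import math

target = 1 / math.e
partial_sum = 0.0
for n in range(11):
    partial_sum += (-1) ** n / math.factorial(n)
    print(f"n={n:2d}: partial sum = {partial_sum:.8f}  (error {abs(partial_sum - target):.1e})")
```

Because the series alternates, the error after the n-th term is bounded by the next term, 1/(n+1)!, which is why the table above locks onto 36.7879% so quickly.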
Python Implementation
```python
import math
from math import factorial

def derangements(n):
    """Count derangements of n objects via inclusion-exclusion.

    Uses exact integer arithmetic: n!/k! is always an integer, so the
    alternating sum never loses precision, even for large n.
    """
    return sum((-1) ** k * (factorial(n) // factorial(k)) for k in range(n + 1))

def derangement_probability(n):
    """Probability that a uniform random permutation of n objects is a derangement."""
    return derangements(n) / factorial(n)

# Test
for n in [2, 3, 4, 5, 10, 20, 100]:
    prob = derangement_probability(n)
    print(f"n={n:3d}: P(derangement) = {prob:.6f} ({prob*100:.4f}%)")

# Compare with 1/e
one_over_e = 1 / math.e
print(f"\n1/e = {one_over_e:.6f}")

# Output:
# n=  2: P(derangement) = 0.500000 (50.0000%)
# n=  3: P(derangement) = 0.333333 (33.3333%)
# n=  4: P(derangement) = 0.375000 (37.5000%)
# n=  5: P(derangement) = 0.366667 (36.6667%)
# n= 10: P(derangement) = 0.367879 (36.7879%)
# n= 20: P(derangement) = 0.367879 (36.7879%)
# n=100: P(derangement) = 0.367879 (36.7879%)
#
# 1/e = 0.367879
```
Real-World Applications
1. Coat Check Problem (Original)
Restaurant coats randomly returned to customers. Probability no one gets their own coat: ~36.8%.
2. Secret Santa
In a Secret Santa gift exchange, you want everyone to get someone else's gift. Probability of success with random assignment: ~36.8%.
3. Cryptography
Derangements are used in analyzing the security of permutation ciphers and analyzing randomness in cryptographic systems.
4. Quality Control
If n inspection labels are reattached to n items at random, the probability that every single item ends up with the wrong label is ~36.8%.
5. DNA Sequencing
Biologists use derangements when analyzing random mutations where no position retains its original genetic marker.
6. Job Assignment
A company has n employees and n jobs. With a random assignment, the probability that no one ends up in their own current job is ~36.8%.
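Any of these scenarios can be sanity-checked with a quick Monte Carlo simulation (the helper name and seed are mine):

```python
import random

def no_one_matches(n):
    """Shuffle n coats among n customers; True if nobody gets their own coat back."""
    coats = list(range(n))
    random.shuffle(coats)
    return all(coats[i] != i for i in range(n))

random.seed(42)
trials = 100_000
hits = sum(no_one_matches(10) for _ in range(trials))
print(f"Estimated P(no match, n=10): {hits / trials:.4f}")  # should land near 0.3679
```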
The Beautiful Connection to e
Montmort's problem ties together three branches of mathematics:
- Combinatorics: Counting permutations with constraints
- Probability: Finding the probability of a specific outcome
- Analysis: The Taylor series expansion of e^(-1)
Montmort's Legacy
Montmort's work on this problem:
- Pioneered the use of inclusion-exclusion principle
- Connected combinatorics to probability rigorously
- Influenced the development of probability theory in the 18th century
- Inspired later work by Euler, who further generalized the results
Fun Fact: Quick Approximation
For every n ≥ 1, D(n) is simply the nearest integer to n!/e:

D(n) = round(n!/e)

So if you have 100 letters:

D(100) ≈ 100! / e ≈ 100! / 2.71828
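A minimal check of this shortcut, comparing round(n!/e) with the exact inclusion-exclusion count (the function name is mine; floating-point precision in n!/e is fine for small n but degrades once n! gets large, so this sticks to n ≤ 12):

```python
from math import e, factorial

def derangements_exact(n):
    """Exact derangement count via inclusion-exclusion, in integer arithmetic."""
    return sum((-1) ** k * (factorial(n) // factorial(k)) for k in range(n + 1))

# For every n >= 1, D(n) is the nearest integer to n!/e.
for n in range(1, 13):
    assert round(factorial(n) / e) == derangements_exact(n)
print("round(n!/e) == D(n) for n = 1..12")
```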
Conclusion
Montmort's matching problem is a beautiful example of classical probability. It's simple to state, elegant to solve, and connects to deep mathematics (the number e!). The fact that the probability converges to 1/e β 36.8% regardless of the size shows the power of mathematical patterns.
Over 300 years later, derangements still appear in cryptography, algorithms, and computer science. Montmort's contribution has proven timeless.
Next time you're at a coat check, remember: There's about a 36.8% chance no one gets their own coat back! Want to discuss probability history? Hit me up on X.