Montmort's Problem: The Classic Matching Problem (1708)

πŸ“… March 17, 2026 πŸ‘€ syscraft906 πŸ“– 15 min read

Introduction: A Problem 300+ Years Old

Imagine you're checking coats at a fancy restaurant. You have 10 coats and 10 coat check numbers. At the end of the night, you randomly return the coats to customers. What's the probability that no one gets their own coat back?

This is Montmort's famous matching problem, first studied by Pierre RΓ©mond de Montmort in 1708. It's one of the oldest and most elegant problems in probability theory, and the answer is surprisingly simple.

In this post, we'll explore the problem, solve it, and see why it matters today.

Historical Context: Pierre RΓ©mond de Montmort

Pierre RΓ©mond de Montmort (1678-1719) was a French mathematician who published Essay d'analyse sur les jeux de hazard (Essay on the Analysis of Games of Chance) in 1708. This work is considered one of the founding texts of probability theory.

Montmort was part of the early wave of mathematicians (alongside Pascal, Fermat, Huygens) who were formalizing probability. His essay contained many problems about card games and probability, including the matching problem, also known as the problème des rencontres (problem of coincidences).

The Problem Statement

The Matching Problem: A person shuffles n letters into n envelopes randomly. What is the probability that none of the letters are in the correct envelope?

This is equivalent to asking: What's the probability of a complete derangement?

A derangement is a permutation where no element appears in its original position. The number of derangements of n objects is denoted D(n) or !n (subfactorial).

Small Examples

Case: n = 2 (2 letters, 2 envelopes)

Possible arrangements: (1, 2) and (2, 1). Only (2, 1) puts both letters in the wrong envelopes.

P(derangement) = 1/2 = 50%

Case: n = 3 (3 letters, 3 envelopes)

Total arrangements: 3! = 6

Derangements: (2, 3, 1) and (3, 1, 2), the only two arrangements that leave no letter in its own envelope.

P(derangement) = 2/6 = 1/3 β‰ˆ 33.3%

Case: n = 4 (4 letters, 4 envelopes)

Total arrangements: 4! = 24

Derangements: 9

P(derangement) = 9/24 = 3/8 = 37.5%
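These small counts can also be checked with the classic derangement recurrence D(n) = (n - 1) Γ— (D(n-1) + D(n-2)): the nth letter swaps places with one of the other n - 1 letters, or it doesn't. A minimal sketch (the function name is mine, not from the original):

```python
def derangements_rec(n):
    """Count derangements via the recurrence D(n) = (n - 1) * (D(n-1) + D(n-2))."""
    if n == 0:
        return 1  # the empty permutation vacuously fixes nothing
    if n == 1:
        return 0
    d_prev2, d_prev1 = 1, 0  # D(0), D(1)
    for k in range(2, n + 1):
        d_prev2, d_prev1 = d_prev1, (k - 1) * (d_prev1 + d_prev2)
    return d_prev1

print([derangements_rec(n) for n in [2, 3, 4, 5]])  # [1, 2, 9, 44]
```

The values 1, 2, 9, 44 match the n = 2, 3, 4 (and upcoming n = 5) cases exactly.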

The Formula: Subfactorial

The number of derangements of n objects is:

D(n) = n! Γ— Ξ£((-1)^k / k!) for k = 0 to n

Or more compactly:

D(n) = n! Γ— (1 - 1/1! + 1/2! - 1/3! + ... + (-1)^n/n!)

This is called the subfactorial and is denoted !n.

Why This Formula?

This formula comes from the inclusion-exclusion principle: start with all n! permutations, subtract those that fix at least one letter, add back those that fix at least two, and so on, alternating signs.
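Written out, with A_i denoting the set of permutations that fix position i, the inclusion-exclusion count is:

```latex
D(n) = n! - \left|\bigcup_{i=1}^{n} A_i\right|
     = \sum_{k=0}^{n} (-1)^k \binom{n}{k}\,(n-k)!
     = n! \sum_{k=0}^{n} \frac{(-1)^k}{k!}
```

Here the middle sum counts the ways to choose k positions to fix, times the (n-k)! arrangements of the rest; the last step uses \binom{n}{k}(n-k)! = n!/k!.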

The Probability Formula

The probability that all n letters are in wrong envelopes:

P(derangement) = D(n) / n! = 1 - 1/1! + 1/2! - 1/3! + ... + (-1)^n/n!
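As a quick sanity check, the partial sum can be evaluated exactly with Python's `fractions` module (a small sketch; the function name is mine):

```python
from fractions import Fraction
from math import factorial

def derangement_prob_exact(n):
    """Exact derangement probability: sum of (-1)^k / k! for k = 0..n."""
    return sum(Fraction((-1) ** k, factorial(k)) for k in range(n + 1))

print(derangement_prob_exact(4))  # 3/8, matching the n = 4 case above
```

For n = 4 this gives 1 - 1 + 1/2 - 1/6 + 1/24 = 9/24 = 3/8, exactly the value from the small-examples section.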

Surprising Result: As n β†’ ∞

Here's the amazing part. As n gets large:

P(derangement) β†’ 1/e β‰ˆ 0.36788

Where e β‰ˆ 2.71828 is Euler's number (the base of natural logarithms).

This means that no matter how many letters and envelopes you have, the probability that none match approaches roughly **36.8%** β€” and it converges very quickly!

Table: Probabilities by n

| n (letters) | D(n) (derangements) | n! (total) | P(derangement) | % Probability |
|---|---|---|---|---|
| 1 | 0 | 1 | 0 | 0% |
| 2 | 1 | 2 | 0.5 | 50.0% |
| 3 | 2 | 6 | 0.3333 | 33.33% |
| 4 | 9 | 24 | 0.375 | 37.5% |
| 5 | 44 | 120 | 0.3667 | 36.67% |
| 10 | 1,334,961 | 3,628,800 | 0.3679 | 36.79% |
| 20 | β‰ˆ 8.95 Γ— 10^17 | β‰ˆ 2.43 Γ— 10^18 | 0.367879 | 36.7879% |
| ∞ | β€” | β€” | 1/e | 36.7879% |

Notice how quickly it converges to 1/e!

Why Does It Converge to 1/e?

The Taylor series expansion of e^(-1) is:

e^(-1) = 1/e = 1 - 1/1! + 1/2! - 1/3! + 1/4! - ...

But our derangement probability is:

P(derangement) = 1 - 1/1! + 1/2! - 1/3! + ... + (-1)^n/n!

For large n, we're computing the first n+1 terms of the Taylor series for e^(-1). As n β†’ ∞, we get the full series, which equals 1/e exactly!
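Because the series alternates with decreasing terms, the truncation error is at most the first omitted term, 1/(n+1)!. This tiny sketch checks that bound numerically:

```python
from math import e, factorial

# For an alternating series with decreasing terms, the error after
# truncating at the k = n term is at most the next term, 1/(n+1)!.
for n in [2, 4, 6, 8, 10]:
    p = sum((-1) ** k / factorial(k) for k in range(n + 1))
    err = abs(p - 1 / e)
    bound = 1 / factorial(n + 1)
    print(f"n={n:2d}: error={err:.2e}  bound={bound:.2e}")
    assert err <= bound
```

This is why the convergence in the table is so fast: the error shrinks factorially, not just geometrically.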

Python Implementation

import math
from math import factorial

def derangements(n):
    """Count derangements of n objects via inclusion-exclusion."""
    # n! // k! is an exact integer, so this sum avoids the floating-point
    # rounding error a float-based sum would accumulate for large n.
    return sum((-1) ** k * (factorial(n) // factorial(k)) for k in range(n + 1))

def derangement_probability(n):
    """Calculate probability of complete derangement"""
    d = derangements(n)
    return d / factorial(n)

# Test
for n in [2, 3, 4, 5, 10, 20, 100]:
    prob = derangement_probability(n)
    print(f"n={n:3d}: P(derangement) = {prob:.6f} ({prob*100:.4f}%)")

# Compare with 1/e
one_over_e = 1 / math.e
print(f"\n1/e = {one_over_e:.6f}")

# Output:
# n=  2: P(derangement) = 0.500000 (50.0000%)
# n=  3: P(derangement) = 0.333333 (33.3333%)
# n=  4: P(derangement) = 0.375000 (37.5000%)
# n=  5: P(derangement) = 0.366667 (36.6667%)
# n= 10: P(derangement) = 0.367857 (36.7857%)
# n= 20: P(derangement) = 0.367879 (36.7879%)
# n=100: P(derangement) = 0.367879 (36.7879%)
# 
# 1/e = 0.367879

Real-World Applications

1. Coat Check Problem (Original)

Restaurant coats randomly returned to customers. Probability no one gets their own coat: ~36.8%.

2. Secret Santa

In a Secret Santa gift exchange, you want everyone to get someone else's gift. Probability of success with random assignment: ~36.8%.
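The Secret Santa claim is easy to verify empirically. This Monte Carlo sketch (function name and trial count are my choices) shuffles random assignments and counts how often nobody draws themselves:

```python
import random

def simulate_secret_santa(n, trials=100_000, seed=42):
    """Estimate the probability that a random assignment is a derangement."""
    rng = random.Random(seed)
    people = list(range(n))
    hits = 0
    for _ in range(trials):
        assignment = people[:]
        rng.shuffle(assignment)
        # A valid Secret Santa round: no one is assigned to themselves
        if all(giver != receiver for giver, receiver in zip(people, assignment)):
            hits += 1
    return hits / trials

print(simulate_secret_santa(10))  # close to 1/e, about 0.368
```

With 100,000 trials the estimate typically lands within a few tenths of a percent of 1/e β‰ˆ 0.3679.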

3. Cryptography

Derangements are used in analyzing the security of permutation ciphers and analyzing randomness in cryptographic systems.

4. Quality Control

When n labeled items are relabeled at random, what is the probability that every item ends up with the wrong label?

5. DNA Sequencing

Biologists use derangements when analyzing random mutations where no position retains its original genetic marker.

6. Job Assignment

A company has n employees and n jobs. The probability that a random assignment gives no one their originally assigned job: ~36.8%.

The Beautiful Connection to e

Key insight: The derangement problem connects three seemingly unrelated mathematical concepts:
  • Combinatorics: Counting permutations with constraints
  • Probability: Finding the probability of a specific outcome
  • Analysis: The Taylor series expansion of e^(-1)
This is an example of how deep mathematical truths connect across different fields.

Montmort's Legacy

Montmort's work on this problem helped bring the inclusion-exclusion principle into probability theory, and his correspondence with Nicolas Bernoulli about its solution was published in the expanded 1713 edition of his Essay.

Fun Fact: Quick Approximation

For large n: The number of derangements is approximately n!/e.

So if you have 100 letters:

D(100) β‰ˆ 100! / e β‰ˆ 100! / 2.71828
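In fact, because the truncation error is smaller than 1/2 for every n β‰₯ 1, D(n) is exactly the integer nearest to n!/e. A small check (function name is mine; the range stays modest so floating-point division of n!/e remains accurate):

```python
from math import e, factorial

def derangements_exact(n):
    """Exact derangement count via the inclusion-exclusion formula."""
    return sum((-1) ** k * (factorial(n) // factorial(k)) for k in range(n + 1))

# D(n) is the nearest integer to n!/e for every n >= 1
for n in range(1, 13):
    assert round(factorial(n) / e) == derangements_exact(n)
print("round(n!/e) matches D(n) for n = 1..12")
```

So rather than summing the series, you can recover D(n) with a single division and a rounding step.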

Conclusion

Montmort's matching problem is a beautiful example of classical probability. It's simple to state, elegant to solve, and connects to deep mathematics (the number e!). The fact that the probability converges to 1/e β‰ˆ 36.8% regardless of the size shows the power of mathematical patterns.

Over 300 years later, derangements still appear in cryptography, algorithms, and computer science. Montmort's contribution has proven timeless.

Next time you're at a coat check, remember: There's about a 36.8% chance no one gets their own coat back! Want to discuss probability history? Hit me up on X.