Introduction
Imagine you need the value of a function like sin(x) or e^x. No finite sequence of additions and multiplications computes these exactly. But what if you could approximate them using simple polynomials?
That's the power of Taylor series — a revolutionary technique in calculus that lets you express almost any smooth function as an infinite polynomial. This simple idea has changed mathematics, physics, engineering, and computer science.
In this post, we'll explore what Taylor series are, how to derive them, and why they matter.
The Core Idea
The main insight: If you know a function's value and all its derivatives at a single point, you can reconstruct the entire function using a polynomial.
This is the foundation of Taylor series.
The Taylor Series Formula
For a function f(x) that's infinitely differentiable at a point a, the Taylor series expansion around a is:

f(x) = f(a) + f'(a)(x-a) + f''(a)(x-a)²/2! + f'''(a)(x-a)³/3! + ...

Or more compactly:

f(x) = Σ (n=0 to ∞) f^(n)(a)·(x-a)^n / n!

Where:
- f^(n)(a) is the n-th derivative of f evaluated at x = a
- n! is n factorial
- (x-a)^n is the power term
Special Case: Maclaurin Series
When a = 0 (expanding around the origin), the Taylor series becomes the Maclaurin series:

f(x) = f(0) + f'(0)x + f''(0)x²/2! + f'''(0)x³/3! + ... = Σ (n=0 to ∞) f^(n)(0)·x^n / n!
Maclaurin series are simpler because you only need derivatives at x = 0.
Intuition: Why Does It Work?
The idea is brilliant in its simplicity:
- Constant term: f(a) — the value at the expansion point
- Linear term: f'(a)(x-a) — approximates how the function changes near a
- Quadratic term: f''(a)(x-a)²/2! — captures curvature
- Higher terms: Capture increasingly fine details
By including more and more terms, you get better and better approximations.
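To make this concrete, here is a small sketch (the point x = 0.5 and the number of terms are arbitrary choices for illustration) showing how the constant, linear, and quadratic terms of e^x around a = 0 successively tighten the approximation:

```python
import math

x = 0.5
# Partial sums of the Maclaurin series of e^x around a = 0
approximations = [
    1.0,                 # constant term: f(0)
    1.0 + x,             # + linear term: f'(0)·x
    1.0 + x + x**2 / 2,  # + quadratic term: f''(0)·x²/2!
]
exact = math.exp(x)
for n, approx in enumerate(approximations, start=1):
    print(f"{n} term(s): {approx:.6f}, error: {abs(approx - exact):.6f}")
```

Each added term shrinks the error, exactly as the list above suggests.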
Examples: Common Taylor Series
1. Exponential Function: e^x

e^x = 1 + x + x²/2! + x³/3! + x⁴/4! + ...

2. Sine Function: sin(x)

sin(x) = x - x³/3! + x⁵/5! - x⁷/7! + ...

3. Cosine Function: cos(x)

cos(x) = 1 - x²/2! + x⁴/4! - x⁶/6! + ...

4. Natural Logarithm: ln(1+x)

ln(1+x) = x - x²/2 + x³/3 - x⁴/4 + ... (for -1 < x ≤ 1)

5. Geometric Series: 1/(1-x)

1/(1-x) = 1 + x + x² + x³ + ... (for |x| < 1)
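As a quick sanity check, we can compare a few terms of each of these series against Python's math library (a sketch; the point x = 0.3 and the term counts are arbitrary):

```python
import math

x = 0.3  # well inside every radius of convergence

# sin(x) = x - x³/3! + x⁵/5! - ...
sin_approx = x - x**3 / 6 + x**5 / 120
# cos(x) = 1 - x²/2! + x⁴/4! - ...
cos_approx = 1 - x**2 / 2 + x**4 / 24
# ln(1+x) = x - x²/2 + x³/3 - x⁴/4 + ...
log_approx = x - x**2 / 2 + x**3 / 3 - x**4 / 4
# 1/(1-x) = 1 + x + x² + ...
geom_approx = sum(x**n for n in range(10))

print(f"sin:     {sin_approx:.6f} vs {math.sin(x):.6f}")
print(f"cos:     {cos_approx:.6f} vs {math.cos(x):.6f}")
print(f"ln(1+x): {log_approx:.6f} vs {math.log(1 + x):.6f}")
print(f"1/(1-x): {geom_approx:.6f} vs {1 / (1 - x):.6f}")
```

With only a handful of terms, every approximation already agrees with the library value to several decimal places.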
Computing a Taylor Series: Step by Step
Let's compute the Taylor series for f(x) = e^x around a = 0:

Step 1: Compute derivatives
f(x) = e^x → f(0) = 1
f'(x) = e^x → f'(0) = 1
f''(x) = e^x → f''(0) = 1
f'''(x) = e^x → f'''(0) = 1
...
Step 2: Apply formula
e^x = 1 + 1·x + 1·x²/2! + 1·x³/3! + ...
= 1 + x + x²/2! + x³/3! + ...
Key insight: All derivatives of e^x are e^x itself!
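The pattern is easy to confirm numerically; setting x = 1 turns the partial sums into successive approximations of e itself (a quick illustrative check):

```python
from math import exp, factorial

x = 1.0
partial = 0.0
# Accumulate terms x^n / n! of the Maclaurin series for e^x
for n in range(8):
    partial += x**n / factorial(n)
    print(f"{n + 1} terms: {partial:.6f}")
print(f"exact e: {exp(1):.6f}")
```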
Practical Example: Approximating sin(x)
Let's use the Taylor series to approximate sin(π/6), whose exact value is 0.5:
x = π/6 ≈ 0.5236
0 terms: sin(x) ≈ 0
1 term: sin(x) ≈ x ≈ 0.5236
2 terms: sin(x) ≈ x - x³/6 ≈ 0.4997
3 terms: sin(x) ≈ x - x³/6 + x⁵/120 ≈ 0.5000
Exact: sin(π/6) = 0.5
Notice how each term gets us closer to the true value!
Python Implementation
import math
from math import factorial

def taylor_sin(x, n_terms):
    """Approximate sin(x) using its Taylor series with n_terms terms"""
    result = 0
    for n in range(n_terms):
        term = ((-1) ** n) * (x ** (2*n + 1)) / factorial(2*n + 1)
        result += term
    return result

def taylor_cos(x, n_terms):
    """Approximate cos(x) using its Taylor series with n_terms terms"""
    result = 0
    for n in range(n_terms):
        term = ((-1) ** n) * (x ** (2*n)) / factorial(2*n)
        result += term
    return result

def taylor_exp(x, n_terms):
    """Approximate e^x using its Taylor series with n_terms terms"""
    result = 0
    for n in range(n_terms):
        term = (x ** n) / factorial(n)
        result += term
    return result

# Test
x = math.pi / 6  # π/6 radians
print("Approximating sin(π/6):")
for n in [1, 2, 3, 5, 10]:
    approx = taylor_sin(x, n)
    exact = math.sin(x)
    error = abs(approx - exact)
    print(f"  {n:2d} terms: {approx:.6f}, error: {error:.2e}")
print(f"\nExact: {math.sin(x):.6f}")
# Output:
# Approximating sin(π/6):
#    1 terms: 0.523599, error: 2.36e-02
#    2 terms: 0.499674, error: 3.26e-04
#    3 terms: 0.500002, error: 2.13e-06
#    5 terms: 0.500000, error: 2.03e-11
#   10 terms: 0.500000, error: 0.00e+00
#
# Exact: 0.500000
Convergence: When Does It Work?
Taylor series don't always converge everywhere. The radius of convergence is the distance from the expansion point within which the series actually converges to the function.
Examples:
- e^x: Converges for all x (radius = ∞)
- sin(x), cos(x): Converge for all x (radius = ∞)
- ln(1+x): Converges for -1 < x ≤ 1 (radius = 1)
- 1/(1-x): Converges for |x| < 1 (radius = 1)
The radius of convergence is set by the function's singularities (points where it blows up or is undefined): for 1/(1-x) and ln(1+x), the singularity at distance 1 from the expansion point is exactly why the radius is 1.
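To see the radius of convergence in action, compare partial sums of the geometric series inside and outside |x| < 1 (a sketch; the helper name `geometric_partial` and the sample points are made up for illustration):

```python
def geometric_partial(x, n_terms):
    """Partial sum of 1 + x + x² + ..., the series for 1/(1-x)."""
    return sum(x**n for n in range(n_terms))

# Inside the radius (|x| < 1): partial sums approach 1/(1-x) = 2
for n in (5, 10, 20):
    print(f"x=0.5, {n:2d} terms: {geometric_partial(0.5, n):.6f}")

# Outside the radius (|x| > 1): partial sums grow without bound
for n in (5, 10, 20):
    print(f"x=2.0, {n:2d} terms: {geometric_partial(2.0, n):.1f}")
```

Inside the radius, more terms mean a better approximation; outside it, more terms only make things worse.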
Why Taylor Series Matter
1. Computing Difficult Functions
Computer hardware can only add, subtract, multiply, and divide, so functions like sin(x) and ln(x) are evaluated via polynomial approximations. Taylor series are the textbook example; production math libraries typically use refined variants (such as minimax polynomials) tuned to the required precision.
2. Calculus and Analysis
Taylor series bridge discrete approximations and continuous analysis. They're fundamental to numerical analysis.
3. Physics and Engineering
Many physical laws take the form of differential equations. Taylor expansions underlie the standard numerical methods, such as Euler's method and Runge-Kutta, used to solve them.
4. Machine Learning
Gradient descent, the workhorse of neural-network training, is based on the first-order Taylor approximation of the loss function; second-order methods like Newton's method use the quadratic term as well.
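The link is the first-order Taylor approximation f(θ + h) ≈ f(θ) + f'(θ)·h: stepping opposite the derivative decreases f for small steps. A toy sketch (the quadratic loss, its gradient, and the learning rate are all made up for illustration):

```python
def loss(theta):
    return (theta - 3.0) ** 2  # toy loss with its minimum at theta = 3

def grad(theta):
    return 2 * (theta - 3.0)   # derivative of the toy loss

theta = 0.0
lr = 0.1  # learning rate (the step size h)
for _ in range(100):
    # First-order Taylor: loss(theta - lr*grad) ≈ loss(theta) - lr*grad(theta)**2,
    # so each step decreases the loss while the approximation holds
    theta -= lr * grad(theta)
print(f"theta after 100 steps: {theta:.4f}")
```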
5. Signal Processing
Fourier series (a cousin of Taylor series) are fundamental to digital signal processing.
Taylor vs. Fourier Series
Taylor Series: Approximates using polynomials (powers of x), good for single-point analysis
Fourier Series: Approximates using sines and cosines, good for periodic functions
Error Estimation: How Close Is Our Approximation?
The Lagrange remainder formula bounds the error when truncating a Taylor series after the degree-n term:

|R_n(x)| ≤ M·|x - a|^(n+1) / (n+1)!

Where M is an upper bound on |f^(n+1)| between a and x. This tells you how many terms you need for a desired accuracy.
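For sin(x), every derivative is ±sin or ±cos, so M = 1 works and the bound is easy to apply. This sketch (the helper name `degree_needed` is hypothetical) finds the polynomial degree needed for a target accuracy:

```python
from math import factorial, pi

def degree_needed(x, a, tol, M=1.0):
    """Smallest n whose Lagrange bound M*|x-a|**(n+1)/(n+1)! falls below tol."""
    n = 0
    while M * abs(x - a) ** (n + 1) / factorial(n + 1) >= tol:
        n += 1
    return n

# Degree needed to approximate sin at x = π/6 (around a = 0) to within 1e-10
print(degree_needed(pi / 6, 0.0, 1e-10))
```

Tightening the tolerance raises the required degree only slowly, thanks to the factorial in the denominator.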
Key Insights
- Taylor series express functions as infinite polynomials
- They require knowing all derivatives at one point
- Maclaurin series are Taylor series centered at x = 0
- They have a radius of convergence (where they work)
- Approximations improve with more terms
- Fundamental to numerical computing and analysis
Historical Context
Taylor series were developed by Brook Taylor in the early 1700s (first published in 1715). The Maclaurin series, a special case, was popularized by Colin Maclaurin. Though Newton and Gregory had discovered similar ideas earlier, Taylor's general formulation became foundational to calculus.
Conclusion
Taylor series transform complicated functions into manageable polynomials. This simple idea has proven to be one of the most powerful techniques in mathematics, with applications spanning from pure math to engineering to machine learning.
Understanding Taylor series deepens your appreciation for how mathematical approximations work — and why your computer can calculate sin(x) so quickly and accurately.