16  Martingales

Published

February 13, 2026

16.1 Definitions

Definition 16.1 (Filtration) A collection of sigma algebras (\mathcal F_i)_{i \in T}, indexed by some set T \subset \mathbb R, such that \mathcal F_s \subset \mathcal F_t for all s < t \in T.

Definition 16.2 (Adaptation) A process (X_i) is adapted to (\mathcal F_i) if X_i \in \mathcal F_i (i.e. X_i is \mathcal F_i-measurable) for all i \in T.

Definition 16.3 (Natural filtration) For some given stochastic process (X_t)_{t \in T}, the natural filtration is:

\mathcal F_t = \sigma \{ X_s : s \leq t\}

Definition 16.4 (Martingale) Say we have a stochastic process (X_t) adapted to some filtration (\mathcal F_t), with \mathbb E|X_t| < \infty for all t.

We say the process is a martingale if for all s > t \in T, \mathbb E[X_s | \mathcal F_t] = X_t

A submartingale if for all s > t, \mathbb E[X_s | \mathcal F_t] \geq X_t

and a supermartingale if

\mathbb E[X_s | \mathcal F_t] \leq X_t

In discrete time, it is equivalent to check the condition between consecutive times, i.e. to compare \mathbb E[X_{t+1} | \mathcal F_t] with X_t for all t.

Definition 16.5 (Predictable process) A process (H_n) is predictable wrt a filtration (\mathcal F_n) if H_n \in \mathcal F_{n-1} for all n.

Definition 16.6 (Betting on a martingale) Let (X_n) (the stock price) be adapted and (H_n) (the number of shares held between times n-1 and n) be predictable wrt (\mathcal F_n). Define the (profit) process Y_n = \sum_{i \leq n} H_i \cdot (X_i - X_{i-1})

Then (Y_n) is adapted to (\mathcal F_n)

Lemma 16.1 If (X_n) is a martingale, then so is (Y_n).

If (X_n) is a submartingale and H_n \geq 0 for all n, then (Y_n) is a submartingale.

Proof

\begin{align*} Y_{n+1} &= Y_n + H_{n+1} (X_{n+1} - X_{n})\\ \mathbb E[Y_{n+1}|\mathcal F_n] &= Y_n + H_{n+1} \mathbb E[ (X_{n+1} - X_{n}) |\mathcal F_n] \qquad H_{n+1} \in \mathcal F_n\\ &= Y_n + H_{n+1} (\mathbb E[ X_{n+1} |\mathcal F_n]- X_{n}) \end{align*}

In the martingale case the final bracket is zero, so \mathbb E[Y_{n+1}|\mathcal F_n] = Y_n. In the submartingale case the bracket is non-negative, and since H_{n+1} \geq 0 we get \mathbb E[Y_{n+1}|\mathcal F_n] \geq Y_n.
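As a quick numerical illustration of Lemma 16.1 (a minimal sketch, not from the notes: the martingale is a simple symmetric random walk and the predictable strategy is an arbitrary "hold a share whenever yesterday's price is negative" rule), the betting profit process should have mean zero at every time.

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps, n_paths = 200, 50_000

# X: simple symmetric random walk (a martingale), with X_0 = 0
steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))
X = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(steps, axis=1)], axis=1)

# H_{i+1} is decided from X_i only, so (H_n) is predictable:
# hold one share whenever yesterday's price is below zero.
H = (X[:, :-1] < 0).astype(float)

# Y_n = sum_{i <= n} H_i (X_i - X_{i-1}): the betting profit process
Y = np.cumsum(H * np.diff(X, axis=1), axis=1)

# E[Y_n] should be ~0 for every n (up to Monte Carlo error)
print(Y.mean(axis=0)[[0, 49, 199]])
```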

16.1.1 Doob martingale

Given an integrable random variable X and a filtration (\mathcal F_n), the Doob martingale is defined as

X_n = \mathbb E[X | \mathcal F_n ]
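That this defines a martingale is immediate from the tower property:

\mathbb E[X_{n+1} | \mathcal F_n] = \mathbb E[\, \mathbb E[X | \mathcal F_{n+1}] \,|\, \mathcal F_n] = \mathbb E[X | \mathcal F_n] = X_n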

16.1.2 Doob’s decomposition

Any submartingale X_n, n \geq 0 can be written in a unique way as X_n = M_n + A_n where M_n is a martingale and A_n an increasing predictable sequence, with A_0= 0.
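The standard construction makes this explicit: take the predictable part to be the accumulated conditional drift,

\begin{align*} A_n &= \sum_{i=1}^n \mathbb E[X_i - X_{i-1} | \mathcal F_{i-1}], \qquad A_0 = 0\\ M_n &= X_n - A_n \end{align*}

Each increment of A_n is non-negative by the submartingale property and A_n \in \mathcal F_{n-1} by construction, while \mathbb E[M_n - M_{n-1} | \mathcal F_{n-1}] = \mathbb E[X_n - X_{n-1} | \mathcal F_{n-1}] - (A_n - A_{n-1}) = 0, so (M_n) is a martingale.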

16.2 Convergence

16.2.1 Martingale with bounded increments

If X_n is a martingale such that |X_n - X_{n-1}| \leq M < \infty almost surely, then either \lim X_n exists and is finite, or \lim \inf X_n = -\infty and \lim \sup X_n = \infty.

16.2.2 Bounded martingale convergence theorem

If (X_n) is a submartingale and \mathbb E[X_n^+] is bounded above, i.e. \sup_{n \in T} \mathbb E[X_n^+] < \infty, then there exists an integrable random variable X_\infty with \mathbb E|X_\infty| < \infty such that X_n \stackrel{a.s.}{\to} X_\infty

Similarly, for a supermartingale (X_n), if \sup_n \mathbb E[X_n^-] < \infty then it converges almost surely. A special case of this is that non-negative supermartingales (and thus non-negative martingales) converge almost surely to a finite limit.
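As an illustration (a simulation sketch, not from the notes): a product of i.i.d. non-negative factors with mean one is a non-negative martingale, so by the above it converges almost surely. In this example the almost sure limit is 0 even though \mathbb E[X_n] = 1 for every n, so the convergence does not hold in L^1.

```python
import numpy as np

rng = np.random.default_rng(1)
n_steps, n_paths = 500, 10_000

# Multiplicative martingale: X_n = product of i.i.d. factors with mean 1,
# hence a non-negative martingale, which converges a.s. by the theorem.
factors = rng.choice([0.5, 1.5], size=(n_paths, n_steps))
X = np.cumprod(factors, axis=1)

# Almost every path collapses to (essentially) zero ...
print("fraction of paths with X_500 < 1e-6:", (X[:, -1] < 1e-6).mean())

# ... yet E[X_n] = 1 exactly for every n (the mean is carried by
# vanishingly rare, enormous paths), so X_n does not converge in L^1.
```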

Proof

We prove the supermartingale case. Let R = \lim \inf X_n and S = \lim \sup X_n. We first verify that R \stackrel{a.s.}{=} S, for which we need to rule out oscillating tail behaviour.

If there exist a < b such that R < a < b < S, then the number of upcrossings U_\infty(a, b) = \infty. Write B_{(a,b)} = \{R < a < b < S\} for this event.

By Lemma 16.2 and the assumption that the expected negative part is bounded:

\begin{align*} \mathbb E [U_n(a,b)] &\leq \frac{\mathbb E[(X_n-a)^-]}{b-a} \leq \frac{\sup_n \mathbb E[X_n^-] + |a|}{b-a} =: M(a, b) < \infty\\ \mathbb E [U_\infty(a,b)] &\leq M(a, b) \qquad \text{monotone convergence}\\ &\implies \mathbb P(\{ U_\infty(a,b) = \infty\}) = 0\\ &\implies \mathbb P(B_{(a,b)} ) = 0 \end{align*}

This holds for any fixed a < b. Moreover, R < S \iff \exists a, b \in \mathbb Q with R < a < b < S, so

\mathbb P(R < S) = \mathbb P( \cup_{a, b \in \mathbb Q} B_{(a,b)}) \leq \sum_{a, b \in \mathbb Q} \mathbb P( B_{(a,b)}) = 0

Hence R \stackrel{a.s.}{=} S so there exists a limiting random variable.

It remains to show the limit is integrable (in particular almost surely finite). We use Fatou's lemma, which states that for a sequence of non-negative RVs Y_n

\begin{align*} \mathbb E[\lim \inf Y_n ] &\leq \lim \inf \mathbb E[ Y_n]\\ \mathbb E[\lim \inf X_n^-] &\leq \lim \inf \mathbb E[ X_n^-] \leq \sup_n \mathbb E[X_n^-] =: c < \infty \\ \mathbb E[X_\infty^-] &\leq c \qquad \text{since } X_n^- \to X_\infty^- \text{ a.s.} \end{align*}

Now

\begin{align*} X_n &= X_n^+ - X_n^-\\ X_n^+ &= X_n + X_n^-\\ \mathbb E [ X_n^+] &= \mathbb E[X_n + X_n^-] \\ &\leq \mathbb E[X_0] + \sup_n \mathbb E[ X_n^-] \qquad \text{supermartingale: } \mathbb E[X_n] \leq \mathbb E[X_0]\\ \mathbb E[X_\infty^+] &= \mathbb E[\lim X_n^+] = \mathbb E[\lim \inf X_n^+] \leq \lim \inf \mathbb E[X_n^+] < \infty \end{align*}

Thus \mathbb E[|X_\infty|] = \mathbb E[X_\infty^+] + \mathbb E[X_\infty^-] < \infty.

Definition 16.7 (Upcrossing) The number of complete trips the price makes from a lower threshold to an upper one.

U_{n}^X (a,b) = \sup \{m: \exists s_1 < t_1 < s_2 < t_2 < \dots < s_m < t_m \leq n, \; X_{s_i} \leq a, X_{t_i} \geq b \}

To make the crossing times unique, take s_i to be the first time after t_{i-1} at which X \leq a, and t_i to be the first time after s_i at which X \geq b.
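A direct transcription of this definition into code (a sketch; the path and the thresholds a < b are arbitrary):

```python
import numpy as np

def upcrossings(path, a, b):
    """Count completed upcrossings of [a, b]: s_i is the first time after
    t_{i-1} with X <= a, and t_i the first time after s_i with X >= b."""
    count = 0
    waiting_for_b = False  # True once the next s_i has been found
    for x in path:
        if not waiting_for_b and x <= a:
            waiting_for_b = True      # found the next s_i
        elif waiting_for_b and x >= b:
            waiting_for_b = False     # found the matching t_i
            count += 1
    return count

rng = np.random.default_rng(2)
walk = np.cumsum(rng.choice([-1, 1], size=1000))
print(upcrossings(walk, a=-2, b=2))
```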

Lemma 16.2 (Doob’s upcrossing lemma) If X_n is a supermartingale, and a < b then \mathbb E[U_n(a,b) ] \leq \frac{\mathbb E[(X_n-a)^-]}{b-a}

Proof

Let Y_n = \sum_{i \leq n} H_i (X_i - X_{i-1}) be the betting profit process, where H_i \in \{0, 1\} is the predictable strategy that holds one share from the first time the price drops to a or below until it next rises to b or above (buy low, sell high), and holds nothing otherwise. Then (Y_n) is also a supermartingale, so \mathbb E[Y_n] \leq \mathbb E[Y_0] = 0 since Y_0 = 0 by convention.

Now observe that the profit is bounded below by the number of completed upcrossings times the width (b-a), minus the loss on a final incomplete upcrossing (if there is one).

\begin{align*} Y_n &\geq (b-a) U_n(a,b) - (X_n - a)^- \\ (b-a) \mathbb E [U_n(a,b)] - \mathbb E[(X_n - a)^-] &\leq \mathbb E[Y_n] \leq 0 \\ \end{align*}

16.2.3 Doob’s maximal inequality

Let (X_n) be a submartingale. Define M_n = \max_{i \leq n} X_i. Then for any \lambda > 0,

\mathbb P(M_n \geq \lambda) \leq \frac{1}{\lambda} \mathbb E[X_n \mathbb I(M_n \geq \lambda)] \leq \frac{1}{\lambda} \mathbb E[X_n^+]

Proof

Let \tau = n \wedge \inf\{k: X_k \geq \lambda\} be the bounded stopping time: the first time the process reaches \lambda, capped at n.

We have

\begin{align*} \mathbb E[X_\tau] &\leq \mathbb E[X_n] \qquad \text{bounded stopping time, } \tau \leq n \text{ (see Lemma 16.3)}\\ \mathbb E[X_\tau] &= \mathbb E[X_\tau \mathbb I(M_n \geq \lambda)] + \mathbb E[X_\tau \mathbb I(M_n < \lambda)]\\ \mathbb E[X_n] &= \mathbb E[X_n \mathbb I(M_n \geq \lambda)] + \mathbb E[X_n \mathbb I(M_n < \lambda)]\\ \mathbb E[X_\tau \mathbb I(M_n < \lambda)] &= \mathbb E[X_n \mathbb I(M_n < \lambda)]\qquad \text{if } M_n < \lambda \text{ then } \tau = n\\ \implies \mathbb E[X_\tau \mathbb I(M_n \geq \lambda)] &\leq \mathbb E[X_n \mathbb I(M_n \geq \lambda)]\\ \text{On } \{M_n \geq \lambda\}, \; X_\tau \geq \lambda &\implies \mathbb E[X_\tau \mathbb I(M_n \geq \lambda)] \geq \mathbb E[\lambda \mathbb I(M_n \geq \lambda)]= \lambda \mathbb P(M_n \geq \lambda)\\ \lambda \mathbb P(M_n \geq \lambda) &\leq \mathbb E[X_\tau \mathbb I(M_n \geq \lambda)] \leq \mathbb E[X_n \mathbb I(M_n \geq \lambda)]\\ \mathbb P(M_n \geq \lambda) &\leq \frac{1}{\lambda} \mathbb E[X_n \mathbb I(M_n \geq \lambda)] \end{align*}
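A Monte Carlo sanity check of the inequality (a sketch, using |S_n| for a simple symmetric random walk S_n, which is a submartingale by Jensen's inequality):

```python
import numpy as np

rng = np.random.default_rng(3)
n, lam, n_paths = 400, 30, 20_000

# S: simple symmetric random walk; X_n = |S_n| is a submartingale (Jensen)
S = np.cumsum(rng.choice([-1.0, 1.0], size=(n_paths, n)), axis=1)
X = np.abs(S)
M = X.max(axis=1)                    # M_n = max_{i <= n} X_i

lhs = (M >= lam).mean()                       # P(M_n >= lambda)
mid = (X[:, -1] * (M >= lam)).mean() / lam    # E[X_n 1(M_n >= lambda)] / lambda
rhs = X[:, -1].mean() / lam                   # E[X_n^+] / lambda (X_n >= 0 here)

print(f"{lhs:.4f} <= {mid:.4f} <= {rhs:.4f}")
```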

16.2.4 L^p martingale convergence theorem

If (X_n) is a submartingale and is bounded in L^p for some p >1 i.e. there exists finite c such that \sup_n \|X_n\|_p < c then there exists an integrable random variable X_\infty with \mathbb E |X_\infty|^p < \infty such that

X_n \stackrel{a.s.}{\to} X_\infty

and

\|X_n - X_\infty\|_p \to 0

i.e X_n \stackrel{L^p}{\to} X_\infty

16.2.5 L^1 convergence and Doob’s martingale

If X_n \stackrel{L^1}{\to} X_\infty for a martingale (X_n), then we can express it as the Doob martingale of the limiting random variable: X_n = \mathbb E[X_\infty | \mathcal F_n]

16.2.6 Doob’s L^p maximal inequality

If (X_n) is a non-negative submartingale, p > 1 and M_n = \max_{i \leq n} X_i, then

\|M_n\|_p \leq q \|X_n\|_p

where 1/p + 1/q =1

16.2.7 Uniform Integrability

For a submartingale (X_n), these are equivalent:

  • (X_n) is uniformly integrable i.e. \lim_{K \to \infty} \sup_n \mathbb E[|X_n| \mathbf 1_{|X_n| > K} ]= 0

  • There exists an integrable random variable X_\infty such that X_n \to X_\infty both almost surely and in L^1

  • X_n \stackrel{L^1}{\to} X_\infty for some integrable random variable X_\infty.

Additionally, if it is a martingale we can write it as the Doob martingale of the limiting random variable.

Proof

Uniform integrability implies that \sup_n \mathbb E|X_n| is finite. Thus by the bounded martingale convergence theorem, there exists an integrable random variable X_\infty such that X_n \stackrel{a.s.}{\to} X_\infty.

We can get L^1 convergence easily because almost sure convergence implies convergence in probability, and this combined with uniform integrability implies L^1 convergence.

16.3 Stopping times

Definition 16.8 (Stopping time) A random variable \tau : \Omega \to T \cup \{\infty\} is a stopping time wrt a filtration (\mathcal F_t) if for all t \in T, the set \{\tau \leq t\} \in \mathcal F_t.

Theorem 16.1 (Stopped martingale) If (X_n) is a submartingale wrt (\mathcal F_n) and \tau is a stopping time, then the stopped process X_{n \wedge \tau} is also a submartingale wrt (\mathcal F_n).

Similarly for martingales and supermartingales.

Proof

Define the non-negative predictable process H_i = \mathbb I(i \leq \tau); it is predictable since \{i \leq \tau\} = \{\tau \leq i-1\}^c \in \mathcal F_{i-1}.

We can use it to define a betting process: X_{n\wedge \tau} = X_0 + \sum_{i \leq n} H_i (X_i - X_{i-1})

If (X_n) is a submartingale, then by Lemma 16.1, (X_{n \wedge \tau}) is also a submartingale.
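A quick numerical check of Theorem 16.1 (a sketch, assuming a simple symmetric random walk stopped when it first hits \pm 10): the stopped process should still have mean zero at every time.

```python
import numpy as np

rng = np.random.default_rng(4)
n, n_paths, level = 500, 20_000, 10

steps = rng.choice([-1.0, 1.0], size=(n_paths, n))
X = np.cumsum(steps, axis=1)

# tau: first time |X| reaches `level` (if never hit, nothing gets frozen)
hit = np.abs(X) >= level
tau = np.where(hit.any(axis=1), hit.argmax(axis=1), n - 1)

# Build the stopped process X_{n ∧ tau} by freezing each path after tau
X_stopped = X.copy()
for p in range(n_paths):
    X_stopped[p, tau[p]:] = X[p, tau[p]]

# The stopped martingale is still a martingale: E[X_{n ∧ tau}] stays ~0
print(X_stopped.mean(axis=0)[[0, 99, 499]])
```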

16.3.1 Stopping time sigma-algebra

Definition 16.9 The stopping time sigma algebra \mathcal F_\tau is defined as \mathcal F_\tau = \{ B \in \mathcal F_\infty: \forall n, B \cap \{\tau \leq n\} \in \mathcal F_n\} where \mathcal F_\infty = \sigma (\cup_n \mathcal F_n)

Lemma 16.3 (Expectation at bounded stopping time) Let \tau be a bounded stopping time such that \mathbb P(\tau \leq m) = 1 for some finite m. If (X_n) is a submartingale then

\mathbb E[X_0] \leq \mathbb E[X_\tau] \leq \mathbb E[X_m]

Proof

We know Y_n = X_{n \wedge \tau} is a submartingale by Theorem 16.1.

\begin{align*} \mathbb E[Y_m|\mathcal F_0] &\geq Y_0 \qquad \text{submartingale property}\\ \implies \mathbb E[Y_m] &\geq \mathbb E[Y_0] \\ \implies \mathbb E[X_\tau] &\geq \mathbb E[X_0] \qquad Y_m = X_{m \wedge \tau} = X_\tau \text{ a.s., } Y_0 = X_0 \end{align*}

Now define the process Z_n = \sum_{i=1}^n 1_{\{i > \tau\}} \cdot (X_i - X_{i-1}) = X_n - X_{n \wedge \tau}. The bets 1_{\{i > \tau\}} are non-negative and predictable (\{i > \tau\} = \{\tau \leq i-1\} \in \mathcal F_{i-1}), so by Lemma 16.1, (Z_n) is a submartingale with Z_0 = 0. Thus, since \tau \leq m a.s., \mathbb E[Z_m] = \mathbb E[X_m - X_\tau] \geq \mathbb E[Z_0] = 0

16.3.2 Optional sampling theorem

If \tau and \sigma are bounded stopping times with \sigma \leq \tau almost surely, then for a submartingale (X_n),

\mathbb E[X_\tau | \mathcal F_\sigma] \geq X_\sigma

and similar for martingales and supermartingales.

16.4 Optional stopping

These results let us say something about unbounded stopping times.

Lemma 16.4 If (X_n) is an integrable martingale, N is an almost surely finite stopping time, i.e. \mathbb P(N < \infty)=1, and \mathbb E[X_n 1_{n < N}] \to 0, then \mathbb E[X_N] = \mathbb E[X_0]

Proof

For all n

\begin{align*} \mathbb E[X_0] &= \mathbb E[X_{n \wedge N}] \quad \text{bounded stopping time}\\ &= \mathbb E[X_{ N} 1_{ n \geq N}] + \mathbb E[X_{n} 1_{n < N}]\\ &= \lim_{n\to \infty} \mathbb E[X_{ N} 1_{ n \geq N}] + \lim_{n\to \infty} \mathbb E[X_{n} 1_{n < N}]\\ &= \mathbb E[X_N] \qquad \text{MCT} \end{align*}

By assumption, the limit of the second term is zero.

16.4.1 Uniform integrability and stopping

If (X_n) is a uniformly integrable submartingale then for any stopping time N (possibly infinite), the stopped process (X_{n\wedge N}) is also uniformly integrable, and X_N \in L^1.

Further \mathbb E[X_0] \leq \mathbb E[X_N] \leq \mathbb E[X_\infty]

If it is a martingale we also have

X_N = \mathbb E[X_\infty |\mathcal F_N]

X_0 = \mathbb E[X_N |\mathcal F_0]

Proof

(X_n) is a submartingale and thus so is (X_n^+). Since N \wedge n \leq n, Lemma 16.3 gives \mathbb E[X^+_{N \wedge n}]\leq \mathbb E[X^+_{n}], hence \sup_n \mathbb E[X^+_{N \wedge n}]\leq \sup_n \mathbb E[X^+_{n}] < \infty, where the last bound is finite by uniform integrability.

Thus the stopped submartingale is bounded above in L^1 and so converges almost surely to a finite limit: \lim_{n\to \infty}X_{N \wedge n} = X_N a.s., and thus \mathbb E|X_{N}| < \infty.

For uniform integrability:

\begin{align*} \mathbb E[|X_{N \wedge n}|; |X_{N \wedge n}| > K ] &= \mathbb E[|X_{N \wedge n}|; |X_{N \wedge n}| > K, n \geq N ] + \mathbb E[|X_{N \wedge n}|; |X_{N \wedge n}| > K , n < N]\\ &\leq \mathbb E[|X_{N}|; |X_{N}| > K ] + \mathbb E[|X_{n}|; |X_{n}| > K]\\ \end{align*}

As we increase K, the first term goes to zero by DCT (since X_N \in L^1). The supremum over n of the second term goes to zero by uniform integrability of (X_n). Hence (X_{N \wedge n}) is uniformly integrable.

Now since N \wedge n is a bounded stopping time for each n, we have \begin{align*} \mathbb E[X_{0}] &\leq \mathbb E[X_{n \wedge N}] \leq \mathbb E[X_{n}]\\ \mathbb E[X_{0}] &\leq \lim_{n \to \infty} \mathbb E[X_{n \wedge N}] \leq \lim_{n \to \infty} \mathbb E[X_{n}]\\ \mathbb E[X_{0}] &\leq \lim_{n \to \infty} \mathbb E[X_{n \wedge N}] \leq \mathbb E[X_{\infty}] \qquad \because X_n \stackrel{L^1}{\to} X_\infty \text{ by UI}\\ \mathbb E[X_{0}] &\leq \mathbb E[X_{N}] \leq \mathbb E[X_{\infty}] \qquad \because X_{n\wedge N} \stackrel{L^1}{\to} X_N \text{ by UI}\\ \end{align*}

To show X_N = \mathbb E[X_\infty |\mathcal F_N] we want X_N \in \mathcal F_N and \mathbb E[X_N 1_A] = \mathbb E[X_\infty 1_A] for all A \in \mathcal F_N.

For measurability,

\begin{align*} \forall n, \{X_N \leq x \} \cap \{N = n\} = \{X_n \leq x \} \cap \{N = n\} &\in \mathcal F_n\\ \forall n, \{X_N \leq x \} \cap \{N \leq n\} &\in \mathcal F_n\\ \implies \{X_N \leq x \} &\in \mathcal F_N \end{align*}

For the projection property:

\begin{align*} \mathbb E[X_N 1_A] &= \mathbb E[\sum_{n = 1}^\infty X_N 1_{n = N} 1_A] + \mathbb E[X_N 1_{N = \infty} 1_A]\\ &= \sum_{n = 1}^\infty\mathbb E[ X_n 1_{n = N} 1_A] + \mathbb E[X_\infty 1_{N = \infty} 1_A] \quad \text{ DCT, } X_N \in L^1\\ &= \sum_{n = 1}^\infty\mathbb E[ X_\infty 1_{n = N} 1_A] + \mathbb E[X_\infty 1_{N = \infty} 1_A] \quad X_n = \mathbb E[ X_\infty| \mathcal F_n], \; A \cap \{N = n\} \in \mathcal F_n\\ &= \mathbb E[X_\infty 1_A] \quad \text{ DCT, } X_\infty \in L^1\\ \end{align*}

Now to prove \mathbb E[X_N | \mathcal F_0] = X_0

\begin{align*} \mathbb E[X_N| \mathcal F_0] &= \mathbb E[\mathbb E[X_\infty| \mathcal F_N] | \mathcal F_0]\\ &= \mathbb E[X_\infty | \mathcal F_0] = X_0 \end{align*}

16.4.2 Bounded increments

If (X_n) is a martingale with |X_n - X_{n-1}| < M < \infty and N is a stopping time with finite expectation \mathbb EN < \infty, then \mathbb E[X_N |\mathcal F_0] = X_0

Proof

\begin{align*} |X_n| &\leq |X_0| + n M\\ |X_N| &\leq |X_0| + N M =: Y\\ X_0 \in L^1, \mathbb E[N] < \infty &\implies \mathbb E[Y] < \infty\\ |X_{n \wedge N}| &\leq |X_0| + (N\wedge n) M \leq Y\\\\ \mathbb E[X_N|\mathcal F_0] &= \mathbb E[\lim X_{n \wedge N}| \mathcal F_0]\\ &= \lim \mathbb E[X_{n \wedge N}| \mathcal F_0] \qquad \text{DCT} \\ &= X_0 \end{align*}
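A classical application of this result (added here as an illustration; it uses the standard fact that the exit time below has finite expectation): let S_n be a simple symmetric random walk started at 0 and let N be the first time it hits -a or b for integers a, b > 0. The increments are bounded by 1 and \mathbb E[N] < \infty, so \mathbb E[S_N] = \mathbb E[S_0] = 0 and

\begin{align*} 0 = \mathbb E[S_N] &= -a\, \mathbb P(S_N = -a) + b\, \mathbb P(S_N = b)\\ \implies \mathbb P(S_N = b) &= \frac{a}{a+b} \end{align*}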