What Does It Mean for a Sequence to Be Bounded?

A sequence is bounded when all of its terms stay within some fixed range of numbers, never shooting off toward infinity or negative infinity. More precisely, a sequence is bounded if there exists a real number M such that the absolute value of every term is less than or equal to M. This single condition guarantees that the sequence never grows arbitrarily large in either direction.

The Formal Definition

Boundedness actually breaks into three related ideas. A sequence is bounded above if there’s some real number M where every term stays at or below M. It’s bounded below if there’s some real number m where every term stays at or above m. A sequence is bounded when both conditions hold simultaneously.

There’s a cleaner equivalent: a sequence (sₙ) is bounded if you can find a single number M ≥ 0 such that |sₙ| ≤ M for every natural number n. Writing it with the absolute value automatically captures both directions at once, since |sₙ| ≤ M is the same as saying −M ≤ sₙ ≤ M. Think of it as drawing a horizontal band on a graph. If every point of the sequence fits inside that band, the sequence is bounded.
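The band picture can be sketched numerically. One caveat worth building into the code: boundedness is a claim about all infinitely many terms, so checking a finite prefix can refute a proposed bound M but can never prove boundedness. The helper name below is ours, not standard notation.

```python
# Check whether the first N terms of a sequence fit inside the band [-M, M].
# A finite check can falsify a claimed bound, but it can never *prove*
# boundedness, since that is a statement about all terms.

def fits_in_band(seq, M, N=1000):
    """Return True if |s_n| <= M for n = 1, ..., N."""
    return all(abs(seq(n)) <= M for n in range(1, N + 1))

print(fits_in_band(lambda n: 1 / n, M=1))        # True: |1/n| <= 1 for every n
print(fits_in_band(lambda n: (-1) ** n, M=1))    # True: terms alternate in [-1, 1]
print(fits_in_band(lambda n: n, M=500))          # False: n = 501 already escapes
```

The third call illustrates the "no finite M works" case: whatever band you propose for the sequence n, some term eventually leaves it.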

The number M doesn’t have to be tight. If a sequence never exceeds 5, you could use M = 5, M = 100, or M = 1,000,000. What matters is that at least one such M exists. If no finite M works because the terms keep growing without limit, the sequence is unbounded.

Simple Examples

The sequence 1/n (which produces 1, 1/2, 1/3, 1/4, …) is bounded. Every term is positive, so 0 serves as a lower bound, and 1 serves as an upper bound. You could pick M = 1, and |1/n| ≤ 1 holds for every n.

The sequence (−1)ⁿ, which alternates between −1 and 1, is also bounded. It’s bounded above by 1 and bounded below by −1, so M = 1 works. This example is worth remembering because, despite being bounded, it diverges. It keeps flipping between two values and never settles down toward a single number.

The sequence n (which produces 1, 2, 3, 4, …) is unbounded. No matter what M you choose, eventually n will exceed it. The same goes for (−1)ⁿ · n, which swings between increasingly large positive and negative values.
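A quick way to see the bounded/unbounded contrast for these examples is to track the running maximum of |sₙ| as more terms arrive: it stabilizes for the bounded sequences and keeps growing for the unbounded ones. This is an informal numerical sketch, not a proof.

```python
# Running maximum of |s_n| over the first N terms: it levels off for the
# bounded sequences and grows without limit for the unbounded ones.

def running_max_abs(seq, N):
    m = 0.0
    for n in range(1, N + 1):
        m = max(m, abs(seq(n)))
    return m

for N in (10, 100, 1000):
    print(N,
          running_max_abs(lambda n: 1 / n, N),          # stays at 1
          running_max_abs(lambda n: (-1) ** n, N),      # stays at 1
          running_max_abs(lambda n: (-1) ** n * n, N))  # grows like N
```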

Supremum and Infimum

When a sequence is bounded, you can ask: what’s the tightest possible upper bound? That’s the supremum (or least upper bound) of the sequence’s range. Similarly, the tightest lower bound is the infimum (or greatest lower bound). For the sequence (−1)ⁿ, the supremum is 1 and the infimum is −1, and both are actually achieved by terms in the sequence.

Sometimes the supremum isn’t reached by any term. The sequence 1 − 1/n (producing 0, 1/2, 2/3, 3/4, …) has a supremum of 1, but no term ever equals 1. The values get arbitrarily close without arriving. The infimum is 0, which is reached by the first term. If both the supremum and infimum exist as finite numbers, the sequence is bounded, and every term sits between them.
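The 1 − 1/n example can be checked directly on a finite prefix: the minimum 0 is attained at the very first term, while the maximum over any prefix creeps toward 1 without ever reaching it.

```python
# For s_n = 1 - 1/n: the infimum 0 is achieved at n = 1, while the
# supremum 1 is approached but never attained by any term.

terms = [1 - 1 / n for n in range(1, 10001)]
print(min(terms))   # 0.0, the infimum, achieved at n = 1
print(max(terms))   # just below 1: the supremum is never reached
```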

Boundedness and Convergence

One of the most important facts in analysis is that every convergent sequence is bounded. The logic is intuitive: if a sequence approaches some limit L, then eventually all the terms are clustered near L. The finitely many “early” terms that haven’t settled down yet are just a finite collection of numbers, which is automatically bounded. Combine the early terms with the cluster near L, and the whole sequence fits inside some finite band.
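The proof idea can be turned into an explicit bound. Suppose we happen to know the limit L and an index N past which every term satisfies |sₙ − L| < 1 (taking ε = 1 in the definition of convergence). Then max(|s₁|, …, |s_{N−1}|, |L| + 1) bounds the whole sequence. The function below is a sketch of that construction, assuming L and N are handed to us.

```python
# Sketch of the proof that convergent implies bounded, assuming we are
# given the limit L and an index N with |s_n - L| < 1 for all n >= N:
# the bound is max(|s_1|, ..., |s_{N-1}|, |L| + 1).

def bound_from_convergence(seq, L, N):
    early = max(abs(seq(n)) for n in range(1, N)) if N > 1 else 0
    return max(early, abs(L) + 1)

# s_n = 2 + (-1)^n / n converges to 2, and |s_n - 2| < 1 for all n >= 2.
M = bound_from_convergence(lambda n: 2 + (-1) ** n / n, L=2, N=2)
print(M)  # 3: every term of the sequence satisfies |s_n| <= 3
```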

The reverse is not true. Boundedness alone does not guarantee convergence. The sequence (−1)ⁿ is the classic counterexample: every term sits between −1 and 1, yet the sequence bounces back and forth forever without approaching any single value. So boundedness is a necessary condition for convergence but not a sufficient one. You need something more.

When Boundedness Does Guarantee Convergence

Add one extra ingredient, monotonicity, and boundedness becomes sufficient. The Monotone Convergence Theorem states that if a sequence is both monotone (either always increasing or always decreasing) and bounded, then it converges. A monotone increasing bounded sequence converges to the supremum of its terms, and a monotone decreasing bounded sequence converges to the infimum. This theorem is a workhorse in calculus and analysis because it lets you prove a limit exists without ever computing what the limit is. You just show the sequence goes in one direction and can’t go past a certain point.
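A classic illustration (not one of the examples above) is the recursively defined sequence a₁ = √2, aₙ₊₁ = √(2 + aₙ). One can prove it is increasing and bounded above by 2, so the Monotone Convergence Theorem guarantees a limit exists; the limit happens to be 2. The sketch below checks both hypotheses numerically along the way.

```python
import math

# a_1 = sqrt(2), a_{n+1} = sqrt(2 + a_n): monotone increasing and bounded
# above by 2, so the Monotone Convergence Theorem promises a limit
# (which turns out to be 2) before we ever compute it.

a = math.sqrt(2)
for _ in range(50):
    a_next = math.sqrt(2 + a)
    assert a_next >= a      # monotone increasing
    assert a_next <= 2      # bounded above by 2
    a = a_next

print(a)  # numerically indistinguishable from 2 after 50 steps
```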

Even without monotonicity, bounded sequences have a powerful property. The Bolzano-Weierstrass Theorem says that every bounded sequence of real numbers has a convergent subsequence. The full sequence might diverge (like (−1)ⁿ), but you can always extract some infinite subset of terms that does converge. For (−1)ⁿ, the subsequence of even-indexed terms (1, 1, 1, …) converges to 1, and the odd-indexed terms (−1, −1, −1, …) converge to −1.
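The subsequence extraction described above is easy to make concrete for (−1)ⁿ: slicing out the even-indexed and odd-indexed terms yields two constant, hence convergent, subsequences.

```python
# Bolzano-Weierstrass in action for the bounded, divergent sequence (-1)^n:
# even indices give the constant subsequence 1, 1, 1, ... (limit 1), and
# odd indices give -1, -1, -1, ... (limit -1).

s = [(-1) ** n for n in range(1, 21)]  # terms s_1 through s_20
even_sub = s[1::2]   # s_2, s_4, s_6, ...
odd_sub = s[0::2]    # s_1, s_3, s_5, ...
print(even_sub)      # all 1s
print(odd_sub)       # all -1s
```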

Cauchy Sequences and Boundedness

A Cauchy sequence is one where the terms get arbitrarily close to each other as you go further out. Every Cauchy sequence is bounded, for essentially the same reason convergent sequences are: eventually the terms cluster together, and the finitely many early terms can’t ruin boundedness. In the real numbers, a sequence is Cauchy if and only if it converges, so for real-valued sequences “Cauchy” and “convergent” are interchangeable. Both imply bounded.
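One informal way to visualize the Cauchy property is to measure the "diameter" (max minus min) of a block of tail terms: for a Cauchy sequence it shrinks toward 0, while for (−1)ⁿ it stays stuck at 2. As with the earlier boundedness check, a finite sample can only suggest the property, never prove it.

```python
# Finite-sample look at the Cauchy property: the spread of the terms
# s_N, s_{N+1}, ..., s_{2N} shrinks toward 0 for 1/n, but stays at 2
# for (-1)^n, whose terms never cluster together.

def tail_diameter(seq, N):
    tail = [seq(n) for n in range(N, 2 * N + 1)]
    return max(tail) - min(tail)

for N in (10, 100, 1000):
    print(N,
          tail_diameter(lambda n: 1 / n, N),       # shrinks like 1/(2N)
          tail_diameter(lambda n: (-1) ** n, N))   # stays at 2
```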

A bounded sequence that diverges must do so because it has at least two distinct subsequences approaching different limits. The (−1)ⁿ example illustrates this perfectly: one subsequence heads to 1, another to −1, and the sequence as a whole can’t commit to either.

Why Boundedness Matters

Boundedness shows up constantly as a hypothesis in theorems throughout calculus and real analysis. When you’re working with infinite series, testing for convergence often involves checking whether the partial sums form a bounded sequence. In proofs about continuity, integration, and function approximation, bounded sequences provide the controlled behavior needed to extract limits and guarantee that calculations don’t blow up. Understanding the definition is the first step, but recognizing how boundedness interacts with monotonicity, convergence, and subsequences is where it becomes genuinely useful.