What Is a Divergent Sequence? Definition and Types

A divergent sequence is a sequence of numbers that does not settle toward a single finite value as it continues forever. In more precise terms, a sequence diverges when its limit either doesn’t exist or is infinite. This is the opposite of a convergent sequence, which gets closer and closer to one specific number. Understanding divergence is one of the foundational ideas in calculus and real analysis, and it shows up in several distinct forms.

The Core Idea Behind Divergence

A convergent sequence has a clean definition: the sequence a_n converges to a limit L if, for any tiny distance you pick (called epsilon), there’s a point in the sequence after which every term stays within that distance of L. No matter how small you make that window, eventually the sequence stays inside it permanently.
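In symbols, that definition reads:

```latex
\lim_{n \to \infty} a_n = L
\quad \Longleftrightarrow \quad
\forall \varepsilon > 0,\ \exists N \in \mathbb{N} \text{ such that } |a_n - L| < \varepsilon \text{ for all } n > N.
```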

A divergent sequence simply fails this test. Either the terms grow without bound, they decrease without bound toward negative infinity, or they bounce around in a way that never locks onto a single value. The limit either equals infinity, equals negative infinity, or doesn’t exist at all.

Divergence to Infinity

The most straightforward type of divergence happens when terms grow larger and larger without any ceiling. The sequence {1, 2, 3, 4, 5, …} diverges because no matter what target number you pick, the sequence eventually blows past it and never comes back. The sequence {2^n}, which goes 2, 4, 8, 16, 32, …, diverges in the same way but much faster.

Sequences can also diverge to negative infinity. The sequence {-n}, which goes -1, -2, -3, -4, …, drops below any negative number you choose if you wait long enough. Mathematicians sometimes say a sequence “diverges to infinity” or “diverges to negative infinity” to be specific about the direction, even though the sequence still counts as divergent in both cases. The key point is that infinity is not a finite number, so even when a sequence heads steadily in one direction, it has no actual limit to converge to.
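This “blows past any target” behavior can be checked numerically. The sketch below (names are illustrative, not standard) finds, for any target M, the index where {2^n} first exceeds M; because the sequence is increasing, it never comes back down.

```python
# Minimal sketch: the sequence 2^n eventually exceeds any target M,
# and since it is monotonically increasing, it stays above M forever.

def first_index_beyond(M):
    """Smallest n with 2**n > M."""
    n = 1
    while 2**n <= M:
        n += 1
    return n

# No matter how large the target, some index eventually passes it.
for M in [100, 10**6, 10**100]:
    n = first_index_beyond(M)
    assert 2**n > M          # the term at index n has blown past M
```

The mirror argument works for {-n}: pick any floor, and the term at a large enough index drops below it.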

Divergence by Oscillation

Not every divergent sequence shoots off toward infinity. Some bounce back and forth forever without settling down. The classic example is {(-1)^n}, which produces the pattern -1, 1, -1, 1, -1, 1, … repeating endlessly. This sequence never grows large, but it also never approaches a single value. It’s always exactly 1 or exactly -1, so it can’t get arbitrarily close to any one number.

Another example is {sin(nπ/2)}, which cycles through the values 1, 0, -1, 0, 1, 0, -1, 0, … in a repeating pattern. Again, the terms stay bounded between -1 and 1, but they never zero in on a single target. This type of behavior is called oscillatory divergence, and it’s an important reminder that divergence doesn’t require a sequence to be unbounded. It only requires the sequence to fail the convergence test.

Oscillatory divergence can also be unbounded. A sequence like {(-1)^n · n}, which produces -1, 2, -3, 4, -5, 6, …, both oscillates in sign and grows in size. It swings between increasingly large positive and negative values, so it neither converges nor diverges neatly to positive or negative infinity.
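The two oscillation patterns above can be generated directly. A minimal sketch: the bounded case never tightens up (consecutive terms always stay a fixed distance apart), while the unbounded case alternates in sign and grows.

```python
# Sketch of the two oscillating examples from the text, n = 1, 2, 3, ...

def bounded_osc(n):
    return (-1)**n          # -1, 1, -1, 1, ...

def unbounded_osc(n):
    return (-1)**n * n      # -1, 2, -3, 4, ...

terms = [bounded_osc(n) for n in range(1, 9)]
# The terms only ever take the values -1 and 1, so the gap between
# consecutive terms never shrinks below 2: the sequence cannot settle.
gaps = [abs(terms[i + 1] - terms[i]) for i in range(len(terms) - 1)]
assert min(gaps) == 2

big = [unbounded_osc(n) for n in range(1, 7)]
assert big == [-1, 2, -3, 4, -5, 6]   # alternates in sign AND grows in size
```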

Geometric and Arithmetic Sequences

Two of the most common sequence types in math courses have simple rules for when they diverge.

An arithmetic sequence adds the same number (the common difference) each time: 3, 7, 11, 15, 19, … for example. If that common difference is anything other than zero, the terms will eventually grow without bound in one direction, so the sequence diverges. Only when the common difference is zero (meaning every term is the same number) does an arithmetic sequence converge.
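The arithmetic rule is simple enough to state as a one-line check. A sketch (helper names are made up for illustration): the n-th term is a + (n - 1)d, which runs off in one direction whenever d is nonzero.

```python
# Sketch: an arithmetic sequence a, a+d, a+2d, ... diverges for any
# nonzero common difference d, since a + (n-1)*d is unbounded.

def arithmetic_term(a, d, n):
    """n-th term (n = 1, 2, 3, ...) of the arithmetic sequence."""
    return a + (n - 1) * d

def arithmetic_converges(d):
    # Convergent only in the degenerate constant case d = 0.
    return d == 0

assert arithmetic_term(3, 4, 5) == 19      # 3, 7, 11, 15, 19, ...
assert not arithmetic_converges(4)         # grows without bound
assert arithmetic_converges(0)             # constant sequence
```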

A geometric sequence multiplies by the same number (the common ratio) each time: 3, 6, 12, 24, … for a ratio of 2. The convergence rule depends on the size of that ratio. If the absolute value of the common ratio is less than 1, such as 1/2 or -0.3, the terms shrink toward zero and the sequence converges. If the common ratio is exactly 1, every term is the same, so the sequence converges to that constant value. In every other case where the absolute value of the ratio is 1 or greater, the sequence diverges: a ratio greater than 1 sends the terms off to infinity, while a ratio of exactly -1 gives the oscillating pattern {a, -a, a, -a, …}, which diverges by oscillation.
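The whole geometric rule fits in a one-line predicate. A sketch, ignoring the trivial first-term-zero case:

```python
# Sketch of the geometric convergence rule: a * r^(n-1) converges
# iff |r| < 1 (limit 0) or r == 1 (constant sequence with limit a).
# (Ignores the trivial case a = 0, which converges for any ratio.)

def geometric_converges(r):
    return abs(r) < 1 or r == 1

assert geometric_converges(0.5)        # terms shrink toward zero
assert geometric_converges(-0.3)       # shrinks toward zero, alternating sign
assert geometric_converges(1)          # constant, converges to the first term
assert not geometric_converges(2)      # runs off to infinity
assert not geometric_converges(-1)     # oscillates: a, -a, a, -a, ...
```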

Bounded Divergent Sequences and Subsequences

One subtle point that trips up many students: a sequence can be bounded (all its terms stay between two fixed numbers) and still diverge. The sequence {(-1)^n} stays between -1 and 1 forever, yet it diverges. Being bounded is necessary for convergence but not sufficient on its own.

However, bounded divergent sequences have an interesting property. The Bolzano-Weierstrass theorem guarantees that any bounded sequence, even a divergent one, contains at least one convergent subsequence. A subsequence is just a selection of terms from the original sequence, taken in order but skipping some. For {(-1)^n}, you could pick out just the even-indexed terms (1, 1, 1, 1, …) to get a subsequence that converges to 1, or pick the odd-indexed terms (-1, -1, -1, -1, …) to get one that converges to -1. The full sequence doesn’t converge, but pieces of it do.
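Extracting those two subsequences is a one-liner with list slicing. A minimal sketch for {(-1)^n} with n starting at 1:

```python
# Sketch: pull convergent subsequences out of the bounded divergent
# sequence (-1)^n, n = 1, 2, ..., 20.

seq = [(-1)**n for n in range(1, 21)]

even_indexed = seq[1::2]   # terms a_2, a_4, a_6, ... -> all equal to 1
odd_indexed  = seq[0::2]   # terms a_1, a_3, a_5, ... -> all equal to -1

assert set(even_indexed) == {1}    # constant subsequence: converges to 1
assert set(odd_indexed) == {-1}    # constant subsequence: converges to -1
assert set(seq) == {-1, 1}         # the full sequence never settles
```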

This property breaks down for unbounded sequences. The sequence {n} = 1, 2, 3, 4, … has no convergent subsequence because every subsequence also grows to infinity.

Cauchy Sequences and Completeness

There’s another way to think about convergence that connects to divergence. A Cauchy sequence is one whose terms get closer and closer to each other as you go further out: past some point, every pair of later terms lies within any tolerance you choose. In the real number system, every Cauchy sequence converges, and every convergent sequence is Cauchy. This means that if a sequence of real numbers is not Cauchy (its terms don’t consistently tighten up), it must be divergent.
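In the same epsilon style as the convergence definition, the Cauchy condition reads:

```latex
\forall \varepsilon > 0,\ \exists N \in \mathbb{N} \text{ such that } |a_m - a_n| < \varepsilon \text{ for all } m, n > N.
```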

This equivalence between Cauchy sequences and convergent sequences is a property called completeness. The real numbers are complete, meaning there are no “gaps” for a well-behaved sequence to fall through. The rational numbers, by contrast, are not complete. You can build a sequence of fractions that gets closer and closer to the square root of 2, which is a Cauchy sequence in the rationals, but it doesn’t converge to any rational number. In that context, the sequence diverges within the rationals even though it would converge within the reals. This distinction matters mostly in higher math courses, but it illustrates an important idea: whether a sequence diverges can depend on the number system you’re working in.
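That square-root-of-2 sequence can be built concretely. A sketch using the Babylonian (Heron) iteration with exact fractions: every term is rational, consecutive terms tighten up rapidly (the Cauchy condition in action), yet no term ever squares exactly to 2, because the would-be limit is irrational.

```python
# Sketch: a Cauchy sequence of rationals whose limit, sqrt(2),
# lies outside the rationals. Exact arithmetic via fractions.Fraction.

from fractions import Fraction

x = Fraction(1)
terms = []
for _ in range(6):
    x = (x + Fraction(2) / x) / 2   # Heron step: rationals stay rational
    terms.append(x)

# Consecutive terms get extremely close: the Cauchy condition in action.
assert abs(terms[-1] - terms[-2]) < Fraction(1, 10**10)

# But no term ever squares to exactly 2: the limit is not a rational number.
assert all(t * t != 2 for t in terms)
```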

How to Tell if a Sequence Diverges

In practice, you can check for divergence using a few approaches. The most direct is to compute the limit of the sequence’s terms as n goes to infinity. If that limit is infinite or doesn’t exist, the sequence diverges.

  • Direct computation: For a_n = n^3, the limit is infinity, so it diverges.
  • Oscillation check: For a_n = (-1)^n, the terms alternate and never approach one value, so it diverges.
  • Ratio test for geometric sequences: If the absolute value of the common ratio is 1 or greater, the geometric sequence diverges, except when the ratio is exactly 1, which gives a constant (convergent) sequence.
  • Divergence test for series: If the individual terms of a series don’t approach zero, the series must diverge. This is specifically about series (infinite sums), but it relies on the behavior of the underlying sequence of terms.
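The first two checks above can be sketched numerically. These are heuristics only (sampling far-out terms can suggest, but never prove, a limit), and the helper names are made up for illustration:

```python
# Rough numeric sketch of the divergence checks -- heuristics, not proofs.

def looks_unbounded(a, N=10_000, bound=10**6):
    """Does a far-out term already exceed a large bound?"""
    return abs(a(N)) > bound

def looks_oscillatory(a, N=10_000):
    """Are far-out neighboring terms still far apart (not tightening up)?"""
    return abs(a(N + 1) - a(N)) > 0.5

assert looks_unbounded(lambda n: n**3)          # direct computation: -> infinity
assert looks_oscillatory(lambda n: (-1)**n)     # alternates, never settles
assert not looks_oscillatory(lambda n: 1 / n)   # tightens up: converges to 0
```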

One important caution: the terms of a sequence approaching zero does not guarantee convergence of the related series. The harmonic series (1 + 1/2 + 1/3 + 1/4 + …) has terms that shrink to zero, but the series itself diverges. The divergence test only works in one direction: nonzero limits prove divergence, but zero limits don’t prove convergence.
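The harmonic series makes this caution concrete. A sketch: the individual terms 1/n shrink to zero, yet the partial sums keep climbing (roughly like the natural log of n) past any bound.

```python
# Sketch: harmonic partial sums 1 + 1/2 + ... + 1/n grow without bound,
# even though the individual terms 1/n shrink to zero.

def harmonic_partial_sum(n):
    return sum(1.0 / k for k in range(1, n + 1))

# The terms themselves vanish...
assert 1.0 / 10**6 < 1e-5

# ...but the running totals keep climbing, roughly like ln(n).
assert harmonic_partial_sum(10**6) > 14
assert harmonic_partial_sum(10**6) > harmonic_partial_sum(10**3) + 6
```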