# Conditional convergence


In mathematics, a series or integral is said to be conditionally convergent if it converges but does not converge absolutely.

## Definition

More precisely, a series ${\displaystyle \sum _{n=0}^{\infty }a_{n}}$ is said to converge conditionally if ${\displaystyle \lim _{m\to \infty }\sum _{n=0}^{m}a_{n}}$ exists and is a finite number (not ∞ or −∞), but ${\displaystyle \sum _{n=0}^{\infty }\left|a_{n}\right|=\infty .}$

A classic example is given by

${\displaystyle 1-{1 \over 2}+{1 \over 3}-{1 \over 4}+{1 \over 5}-\cdots =\sum \limits _{n=1}^{\infty }{(-1)^{n+1} \over n}}$

which converges to ${\displaystyle \ln(2)}$, but is not absolutely convergent (see Harmonic series).
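The contrast can be checked numerically: the partial sums of the alternating harmonic series settle near ln(2), while the partial sums of the absolute values (the harmonic series) grow without bound. A minimal sketch (the helper names are illustrative, not standard):

```python
import math

def partial_sum(n):
    """Partial sum 1 - 1/2 + 1/3 - ... of the alternating harmonic series."""
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

def absolute_partial_sum(n):
    """Partial sum of the absolute values, i.e. the harmonic series."""
    return sum(1 / k for k in range(1, n + 1))

s = partial_sum(100000)       # close to math.log(2) ~ 0.6931
h = absolute_partial_sum(100000)  # keeps growing, roughly ln(n) + 0.577
```

For an alternating series with decreasing terms, the error of the n-th partial sum is bounded by the next term, so `partial_sum(100000)` is within about 5e-6 of ln(2), while the harmonic partial sum has already passed 12.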

The simplest examples of conditionally convergent series (including the one above) are the alternating series.

Bernhard Riemann proved that a conditionally convergent series may be rearranged to converge to any sum at all, including ∞ or −∞; see Riemann series theorem.
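Riemann's proof is constructive: to reach a chosen target, greedily add unused positive terms until the running sum exceeds the target, then unused negative terms until it drops below, and repeat; since the terms tend to zero, the overshoots shrink and the rearranged partial sums converge to the target. A sketch of this greedy construction for the alternating harmonic series (function name is illustrative):

```python
def rearranged_partial_sum(target, n_terms):
    """Greedily rearrange 1 - 1/2 + 1/3 - ... so that its partial
    sums approach `target` (Riemann's rearrangement construction)."""
    pos = 1    # next unused positive term is 1/pos (pos odd)
    neg = 2    # next unused negative term is -1/neg (neg even)
    total = 0.0
    for _ in range(n_terms):
        if total <= target:
            total += 1 / pos   # climb toward the target
            pos += 2
        else:
            total -= 1 / neg   # fall back toward the target
            neg += 2
    return total
```

Because each crossing of the target overshoots by at most the size of the term just added, and the terms tend to zero, `rearranged_partial_sum(1.5, 100000)` lands within a small tolerance of 1.5, even though the unrearranged series sums to ln(2).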

A typical conditionally convergent integral is that of ${\displaystyle \sin(x^{2})}$ over the non-negative real axis (see Fresnel integral): the integral converges to ${\displaystyle {\sqrt {\pi /8}}}$, but the integral of ${\displaystyle \left|\sin(x^{2})\right|}$ diverges.
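This can also be seen numerically: integrating up to $T={\sqrt {400\pi }}$ (i.e. over the first 400 half-period "lobes" of the integrand), the signed integral stays near the known limit ${\sqrt {\pi /8}}\approx 0.6267$, while the integral of the absolute value keeps growing. A rough sketch using the composite trapezoid rule (helper name and step counts are illustrative):

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoid rule with n subintervals."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Upper limit after 400 sign-change lobes of sin(x^2).
T = math.sqrt(400 * math.pi)
signed = trapezoid(lambda x: math.sin(x * x), 0.0, T, 200000)
unsigned = trapezoid(lambda x: abs(math.sin(x * x)), 0.0, T, 200000)
# signed stays within a few hundredths of sqrt(pi/8) ~ 0.6267;
# unsigned grows like a constant times sqrt(T) and is already above 20.
```

The lobes alternate in sign with decreasing area (substitute $u=x^{2}$ to see the integrand $\sin(u)/(2{\sqrt {u}})$), so the signed integral converges by the alternating series test, while the lobe areas shrink only like $1/{\sqrt {k}}$, whose sum diverges.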
