Unconditional convergence

In mathematics, specifically functional analysis, a series is unconditionally convergent if all reorderings of the series converge to the same value. In contrast, a series is conditionally convergent if it converges but different orderings do not all converge to that same value. Unconditional convergence is equivalent to absolute convergence in finite-dimensional vector spaces, but is a weaker property in infinite dimensions.
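
A standard one-dimensional example is the alternating harmonic series $$\sum_{n=1}^\infty \frac{(-1)^{n+1}}{n} = \ln 2:$$ the series converges, but its terms can be rearranged so that the rearranged series converges to any prescribed real number or diverges, so it is conditionally but not unconditionally convergent.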

Definition
Let $$X$$ be a topological vector space. Let $$I$$ be an index set and $$x_i \in X$$ for all $$i \in I.$$

The series $$\textstyle \sum_{i \in I} x_i$$ is called unconditionally convergent to $$x \in X,$$ if
 * the indexing set $$I_0 := \left\{i \in I : x_i \neq 0\right\}$$ is countable, and
 * for every permutation (bijection) $$\sigma : I_0 \to I_0$$ of $$I_0 = \left\{i_k\right\}_{k=1}^\infty$$ the following relation holds: $$\sum_{k=1}^\infty x_{\sigma\left(i_k\right)} = x.$$
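
As a simple illustration, take $$I = \N,$$ $$X = \R,$$ and $$x_n = 2^{-n}.$$ Then $$I_0 = \N,$$ and since the series is absolutely convergent, every permutation $$\sigma$$ of $$\N$$ yields $$\sum_{k=1}^\infty x_{\sigma(k)} = 1,$$ so the series is unconditionally convergent to $$1.$$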

Alternative definition
Unconditional convergence is often defined in an equivalent way: A series is unconditionally convergent if for every sequence $$\left(\varepsilon_n\right)_{n=1}^\infty$$ with $$\varepsilon_n \in \{-1, +1\},$$ the series $$\sum_{n=1}^\infty \varepsilon_n x_n$$ converges. When $$X$$ is a Banach space, this condition is equivalent to the definition given above.
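
For example, for the real series with terms $$x_n = \frac{(-1)^{n+1}}{n},$$ choosing $$\varepsilon_n = (-1)^{n+1}$$ turns $$\sum_{n=1}^\infty \varepsilon_n x_n$$ into the divergent harmonic series $$\sum_{n=1}^\infty \frac{1}{n};$$ hence the alternating harmonic series is not unconditionally convergent, consistent with the first definition.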

If $$X$$ is a Banach space, every absolutely convergent series is unconditionally convergent, but the converse implication does not hold in general. Indeed, if $$X$$ is an infinite-dimensional Banach space, then by the Dvoretzky–Rogers theorem there always exists an unconditionally convergent series in this space that is not absolutely convergent. However, when $$X = \R^n,$$ by the Riemann series theorem, the series $$\sum_n x_n$$ is unconditionally convergent if and only if it is absolutely convergent.
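
A standard example of the Dvoretzky–Rogers phenomenon is the series $$\sum_{n=1}^\infty \frac{e_n}{n}$$ in the Hilbert space $$\ell^2,$$ where $$e_n$$ is the $$n$$-th standard unit vector. For every choice of signs $$\varepsilon_n \in \{-1, +1\},$$ orthogonality gives $$\left\|\sum_{n=M}^N \varepsilon_n \frac{e_n}{n}\right\|^2 = \sum_{n=M}^N \frac{1}{n^2} \to 0,$$ so the partial sums are Cauchy and the series converges unconditionally; yet $$\sum_{n=1}^\infty \left\|\frac{e_n}{n}\right\| = \sum_{n=1}^\infty \frac{1}{n}$$ diverges, so the convergence is not absolute.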