Staircase paradox

In mathematical analysis, the staircase paradox is a pathological example showing that limits of curves do not necessarily preserve their length. It consists of a sequence of "staircase" polygonal chains in a unit square, formed from horizontal and vertical line segments of decreasing length, so that these staircases converge uniformly to the diagonal of the square. However, each staircase has length two, while the length of the diagonal is the square root of two, so the sequence of staircase lengths does not converge to the length of the diagonal. Martin Gardner calls this "an ancient geometrical paradox". It shows that, for curves under uniform convergence, the length of a curve is not a continuous function of the curve.
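The paradox can be verified numerically. The sketch below (an illustration with hypothetical function names, not part of any standard treatment) builds the n-step staircase from (0, 0) to (1, 1), computes its length, and measures its maximum distance from the diagonal: the length stays at 2 for every n, while the distance to the diagonal shrinks to zero.

```python
import math

def staircase_vertices(n):
    """Vertices of the n-step staircase from (0, 0) to (1, 1):
    each step goes right by 1/n, then up by 1/n."""
    pts = [(0.0, 0.0)]
    for k in range(n):
        x = (k + 1) / n
        pts.append((x, k / n))  # outer corner, off the diagonal
        pts.append((x, x))      # inner corner, on the diagonal
    return pts

def chain_length(pts):
    """Total length of the polygonal chain through the given points."""
    return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))

def max_dist_to_diagonal(pts):
    """Largest perpendicular distance from any vertex to the line y = x."""
    return max(abs(x - y) / math.sqrt(2) for x, y in pts)

for n in (1, 10, 100, 1000):
    pts = staircase_vertices(n)
    print(n, chain_length(pts), max_dist_to_diagonal(pts))
```

For every n the printed length is 2, while the maximum distance to the diagonal is 1/(n·√2), so the staircases converge uniformly to the diagonal even though their lengths never approach √2 ≈ 1.414.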

For any smooth curve, polygonal chains formed by connecting consecutive vertices along the curve, with segment lengths decreasing to zero, have lengths that always converge to the arc length. The failure of the staircase curves to converge to the correct length can be explained by the fact that some of their vertices do not lie on the diagonal. In higher dimensions, the Schwarz lantern provides an analogous example showing that the areas of polyhedral surfaces converging pointwise to a curved surface do not necessarily converge to its area, even when the vertices all lie on the surface.
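The contrast with inscribed chains can be illustrated numerically. The sketch below (function name is my own) places n + 1 equally spaced points on a unit quarter circle, with every vertex on the curve, and sums the chord lengths; the totals approach the arc length π/2, unlike the staircase lengths above the diagonal.

```python
import math

def inscribed_quarter_circle_length(n):
    """Length of the polygonal chain through n + 1 points on the unit
    quarter circle, taken at equally spaced angles in [0, pi/2].
    All vertices lie on the curve."""
    pts = [(math.cos(t), math.sin(t))
           for t in (math.pi / 2 * k / n for k in range(n + 1))]
    return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))

for n in (2, 10, 100, 1000):
    print(n, inscribed_quarter_circle_length(n))
```

Each chain is inscribed in the curve, so refining the subdivision can only bring the chord sum closer to the arc length; with n = 1000 the result already agrees with π/2 to several decimal places.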

As well as highlighting the need for careful definitions of arc length in mathematics education, the paradox has applications in digital geometry, where it motivates methods of estimating the perimeter of pixelated shapes that do not merely sum the lengths of boundaries between pixels.