Talk:Uniform convergence

Untitled
Thanks! Was a great help.... Stephan

Theorem assumption too strong?
In the section of theorems relating to the interchanging of limits, specifically the one pertaining to derivatives, I don't believe that uniform convergence of the original sequence of functions is necessary. It should be that if the sequence of functions {f_n(x)} converges pointwise and the sequence of derivatives {f_n'(x)} of the original sequence converges uniformly, then the original limit function f(x) = lim f_n(x) is differentiable and its derivative is equal to the limit of the sequence of derivatives, f'(x) = lim f_n'(x). --anon


 * You are right. But I would prefer to leave it the way it is; it is more straightforward that way.


 * By the way, if the interval on which the f_n are defined is bounded, it is enough for the sequence {f_n(x)} to converge at just one point. This, together with the assumption that {f_n'(x)} converges uniformly, implies that f_n converges uniformly to a function f and that f'(x) = lim f_n'(x). Oleg Alexandrov 02:37, 14 September 2005 (UTC)
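The strengthened statement discussed in the comments above can be written out as follows (my paraphrase of the standard result; the interval notation is mine, not from the article):

```latex
% Paraphrase of the statement discussed above (notation mine).
% Let $f_n : [a,b] \to \mathbb{R}$ be differentiable. Suppose that
%   (i)  $(f_n(x_0))_n$ converges for some $x_0 \in [a,b]$, and
%   (ii) $(f_n')_n$ converges uniformly on $[a,b]$.
% Then $(f_n)_n$ converges uniformly on $[a,b]$ to some differentiable $f$, and
\[
  f'(x) \;=\; \lim_{n\to\infty} f_n'(x) \qquad \text{for all } x \in [a,b].
\]
```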

Dirichlet or Seidel?
The article says:
 * Dirichlet then analyzed Cauchy's proof and found the mistake: the notion of pointwise convergence had to be replaced by uniform convergence.

On the other hand, infinite series says:
 * The theory of uniform convergence was treated by Cauchy (1821), his limitations being pointed out by Abel, but the first to attack it successfully were Stokes and Seidel (1847-48).

Imre Lakatos (Proofs and Refutations, p. 135) also credits Stokes and Seidel, citing the following papers. Should this article be corrected? Gdr 22:26, 7 August 2006 (UTC)

Uniform convergence of the exponential function
I added in the example of uniform convergence of the exponential function, but now I'm not so sure that my argument was valid. Every time we change the radius of the disc D_R, we change the speed of convergence, because we're bounding all of the points by a different radius. I don't feel comfortable stating that the series is uniformly convergent for all z in the complex plane, at least based on the definition of uniform convergence in this article. What does anyone else think of this? —Preceding unsigned comment added by Bdforbes (talk • contribs) 02:14, 8 November 2008 (UTC)


 * For real numbers, the power series of the exponential function converges uniformly on any bounded interval [A, B], with A < B, which is the usual case. Now we can make the interval [A, B] as large as we please, as long as it is of finite length. So, don't let A be minus infinity, or B be plus infinity, and everything is fine.
 * So let's just consider B. This situation is somewhat subtle. You can make B as large as you choose, but you can't let B be infinity. What's the difference? For practical purposes, there isn't any difference. When mathematicians say that the interval of convergence is infinite, that is just a shorthand way of saying, "You can make the interval of convergence as large as you will ever need."
 * For complex numbers, the series converges uniformly on any bounded disk of radius R, with 0 < R. Now we can make the radius R as large as we please, as long as it is finite. So, don't let R be infinity, and everything is fine.
 * This situation is somewhat subtle. You can make R as large as you choose, but you can't let R be infinity. What's the difference? For practical purposes, there isn't any difference. When mathematicians say that the radius of convergence is infinite, that is just a shorthand way of saying, "You can make the radius of convergence as large as you will ever need."
 * The same thing applies to the functions sine(z) and cosine(z), but a problem arises with tan(z), because that function has singularities, and so it is not continuous everywhere.
 * Some other functions have extremely large radii of convergence, such as many of the Bessel functions. You really need to check whether the functions have singularities in them. For example, Log(z) has a very nasty singularity - a branch point - and it has a small radius of convergence. 98.67.106.59 (talk) 20:04, 4 August 2012 (UTC)
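The point raised above, that the rate of convergence changes with the radius, can be checked numerically. The sketch below is my own illustration (function names and grid size are mine, not from the discussion): it approximates the sup-norm error of a fixed partial sum of the exponential series on [0, R] for several R, showing that the worst-case error grows with R even though it tends to 0 on each fixed bounded interval.

```python
import math

def exp_partial_sum(x, n):
    """Partial sum sum_{k=0}^{n} x**k / k! of the exponential series."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

def sup_error(R, n, samples=1000):
    """Approximate sup over [0, R] of |exp(x) - n-th partial sum| on a grid."""
    grid = (R * i / samples for i in range(samples + 1))
    return max(abs(math.exp(x) - exp_partial_sum(x, n)) for x in grid)

# For fixed n, the worst-case error grows with R: the series converges
# uniformly on each bounded interval [0, R], but not uniformly on all reals.
for R in (1.0, 5.0, 10.0):
    print(R, sup_error(R, 20))
```

This matches the comments above: for any fixed bounded interval the convergence is uniform, but there is no single rate valid for the whole real line.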

Image request
To someone with some graphics expertise: this article would benefit from an image of a uniform &epsilon;-neighborhood of a given function. (Just a "tube" of diameter 2&epsilon; around the graph of a function.) I would do it myself, but I'm pretty hopeless with graphics. Sławomir Biały (talk) 12:32, 18 October 2010 (UTC)

History Section
I wanted to add a bit to the history section to give a few more details. Also I want to rework it to remove weasel words such as "Some historians", but I thought I would state my intention on the talk page first. Thenub314 (talk) 00:44, 17 December 2011 (UTC)

Incorrect Statement in the Introductory Paragraph
The second sentence in this Wikipedia article says "A sequence {fn} of functions converges uniformly to a limiting function f if the speed of convergence of fn(x) to f(x) does not depend on x." I was under the impression that the speed of convergence may depend on x but is bounded, i.e. {fn} doesn't converge arbitrarily more quickly at certain points than at others. Can someone confirm this? And if so, that statement should probably be changed. — Preceding unsigned comment added by Zaubertrank (talk • contribs) 04:16, 3 March 2012 (UTC)


 * "Speed of convergence" is a questionable concept. Mathematical things don't really have "speeds", even though they can be used to describe speeds in physics. 98.67.106.59 (talk) 20:11, 4 August 2012 (UTC)


 * No, speed of convergence is a valid concept (though it is better to say "rate", not "speed"). But indeed, the phrase "speed...does not depend on x" is somewhat misleading. On the other hand, "rate...is bounded" is hardly clear enough. Boris Tsirelson (talk) 20:19, 24 April 2014 (UTC)
 * Maybe this equivalent formulation could help: $$|f_n(x_n)-f(x_n)|\to0$$ for every sequence $$(x_n)_n.$$ Boris Tsirelson (talk) 20:30, 24 April 2014 (UTC)
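The moving-point criterion mentioned just above can be seen concretely in the textbook example f_n(x) = x^n on [0, 1) (my own illustration, not from the discussion), where the pointwise rate genuinely depends on x:

```python
import math

def f(n, x):
    """f_n(x) = x**n, converging pointwise to 0 for each fixed x in [0, 1)."""
    return x ** n

# Pointwise: at each fixed x < 1, f_n(x) -> 0 as n grows.
print(f(100, 0.5), f(100, 0.9))

# Moving point x_n = 1 - 1/n: f_n(x_n) = (1 - 1/n)**n -> 1/e, which does
# not tend to 0, so by the criterion above the convergence is not uniform.
for n in (10, 100, 1000):
    print(n, f(n, 1 - 1 / n))
```

The failure along the sequence x_n = 1 - 1/n is exactly what the phrase "rate depends on x" is trying to capture.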

Uniform convergence of analytic functions in the complex plane
My textbook on complex analysis states that if a sequence of analytic functions converges uniformly on a region of the complex plane, then the limit of the sequence is analytic in that region. This seems an important result (it shows that complex-differentiable functions are better behaved than real-differentiable functions), yet this article doesn't mention it at all. Perhaps it should be added under a new heading called "To analyticity"? — Preceding unsigned comment added by JacekW (talk • contribs) 09:05, 27 January 2013 (UTC)

Almost uniform convergence
Ridiculously, this section says "converges uniformly almost everywhere", and then the opposite: "does not mean that the sequence converges uniformly almost everywhere as might be inferred from the name". Boris Tsirelson (talk) 14:06, 24 April 2014 (UTC)


 * To be honest, I am not familiar with this use of the phrase "almost uniform convergence". I glanced in a couple of texts hoping to find a reference and straighten this section out; the only one I could get my hands on that seems to discuss what this section wants to mention was "Real Mathematical Analysis" by Pugh. In an exercise on Egorov's theorem, he defines "nearly uniform convergence" to be what this article wants to define as "almost uniform convergence", and he uses the phrase "almost uniform convergence" to mean uniform convergence almost everywhere. But I am personally starting to think this is not important enough an idea to merit mentioning. I suggest we nuke the section. Thenub314 (talk) 15:45, 24 April 2014 (UTC)


 * "a property holds almost everywhere if the set of elements for which the property does not hold is a set of measure zero" (a quote from the article "Almost everywhere" linked from here). It is implicitly assumed that "the property" is defined (either holds or does not hold) at each point (of a measure space). Therefore the phrase "converges uniformly almost everywhere" is neither true nor false; rather, it is meaningless, just like the phrase "converges uniformly at this point". Uniformity is about sets, not about points.
 * But Egorov's theorem deserves to be mentioned. "Nearly uniform convergence" sounds good. The link to "Almost everywhere" is erroneous. Boris Tsirelson (talk) 20:06, 24 April 2014 (UTC)


 * Agreed, I will try my hand at re-writing it. Thenub314 (talk) 20:11, 24 April 2014 (UTC)


 * I had the same problem as Boris (the section seemed to be saying contradictory things), so I grabbed my copy of Bartle ("The Elements of Integration and Lebesgue Measure") and edited the section. I confirmed that the definition given there indeed has the implications mentioned (Egorov's theorem, AE convergence, convergence in measure). Shaun H. 71.2.44.232 (talk) 03:02, 3 December 2014 (UTC)
 * Nice; thank you.
 * Now I only worry about "Egorov's theorem guarantees that on a finite measure space..."; a space with a finite measure is meant, of course, but could the reader think that a finite space (with measure) is meant? That is, only finitely many points? Boris Tsirelson (talk) 07:14, 3 December 2014 (UTC)
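For reference, the definition discussed above (as in Bartle's book) can be written out as follows; this is my transcription of the standard formulation, and the notation is mine:

```latex
% Almost uniform convergence, paraphrased from the discussion above; notation mine.
% On a measure space $(X,\mathcal{A},\mu)$, say $f_n \to f$ almost uniformly if
\[
  \forall \delta > 0 \;\; \exists E_\delta \in \mathcal{A} :\;
  \mu(E_\delta) < \delta \ \text{ and } \
  f_n \to f \ \text{uniformly on } X \setminus E_\delta .
\]
% Egorov's theorem: if $\mu(X) < \infty$ and $f_n \to f$ almost everywhere,
% then $f_n \to f$ almost uniformly (and hence also in measure).
```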

Recent edits by SergeyLiflandsky
1. Applications -> To continuity: is f continuous on S or I?

2. Proof: "Since fn is continuous at x0..." - which n is meant? If it is an arbitrary n, then $$\delta(\varepsilon)$$ is rather $$\delta(n,\varepsilon).$$

Boris Tsirelson (talk) 06:03, 26 July 2015 (UTC)

Centralized discussion on proofs
See WT:MATH — Arthur Rubin  (talk) 17:58, 29 September 2015 (UTC)
 * However, whether the proof is correct and placed in the correct section is not really discussed there. — Arthur Rubin  (talk) 18:03, 29 September 2015 (UTC)

"Cauchy's wrong theorem"
There is some controversy on whether Cauchy actually made a mistake here or not. See the following for some lively discussion: one, two, three, four. Double sharp (talk) 08:56, 22 May 2020 (UTC)