Talk:Limit (mathematics)

Questionable example
The article states that f(x) = x²-1/x-1 is undefined at x=0. I would disagree, since it can easily be simplified to f(x) = x+1. It seems the same as arguing that x²/x would be undefined at 0 (or actually any g(x)*x/x). I can see how the example is convenient in other ways, because the formula is simple, but I would propose to either replace it by sin(x)/x, or at least note that the statement "f(x) is not defined for x=0" is debatable and that the example was chosen for its simplicity. - Jay 84.171.79.63 (talk) 19:11, 28 June 2014 (UTC)
 * I don't see your point. The function is not defined at x=0, and sin(x)/x = sinc(x), which usually has sinc(0)=1. — Arthur Rubin  (talk) 03:40, 2 July 2014 (UTC)
 * To clarify... the function in the article is f(x) = (x²-1)/(x-1) (rather than f(x) = x²-1/x-1 ) and the article states that it is not defined at x=1 (rather than at x=0). Meters (talk) 22:45, 7 July 2014 (UTC)
 * I still don't see the IP's point. Just because $$\frac {x^2-1} {x-1}$$ can be simplified to x+1, doesn't mean that it is simplified.  And our article "sinc" does specify that sinc(x) = sin(x)/x when x ≠ 0.  One could make a better case that $$e^{x^{-2}}$$ isn't defined at x = 0, but that isn't quite correct either, when we work on the extended real line.  — Arthur Rubin  (talk) 16:08, 9 July 2014 (UTC)
 * I think the IP's point might be that in practice, other than to come up (in a textbook section about limits) with a function with a specific value excluded, no-one would ever define a function like $$\frac {x^2-1} {x-1}$$. A less trivial and perhaps somehow "better" example would be, for instance, $$f(x) = \frac {\ln{x}} {x-1}$$ with a hole at x=1, because it cannot be trivially simplified. - DVdm (talk) 16:53, 9 July 2014 (UTC)
 * I agree that the function in the example is bad. The function simply doesn't have a pole at x=1. It is perfectly well defined in the vicinity of x=1. 95.192.5.53 (talk) 11:36, 20 December 2015 (UTC)
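To make the disputed point concrete, here is a minimal numerical sketch (nothing here comes from the article beyond the function itself): as written, f(x) = (x²-1)/(x-1) is undefined at x = 1, yet it agrees with x + 1 everywhere else, so its values approach 2 as x approaches 1 — the limit exists even though the function value does not.

```python
def f(x):
    """f(x) = (x^2 - 1)/(x - 1); equals x + 1 for x != 1."""
    return (x**2 - 1) / (x - 1)  # raises ZeroDivisionError at x = 1

# Values near the hole approach 2, even though f(1) itself is undefined.
for x in (0.9, 0.99, 0.999, 1.001, 1.01, 1.1):
    print(f"f({x}) = {f(x):.6f}")
```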

Limit of a sequence
I think this sentence is not correct "On the other hand, a limit $L$ of a function $f(x)$ as $x$ goes to infinity, if it exists, is the same as the limit of any arbitrary sequence $a_{n}$ that approaches $L$, and where $a_{n}$ is never equal to $L$." I mean, it is technically correct, but this is an unneeded tautology. Of course the limit L of a function f(x) is the same as the limit of any $$a_n$$ that approaches L, by definition :D. --ԱշոտՏՆՂ (talk) 07:00, 24 August 2019 (UTC)


 * Most of the elements of a true statement were in that claim, but as written, it was not entirely correct, and what was correct was opaque. I have reworded the paragraph and cited a reliable source.  Hopefully this is easier to understand now.—Anita5192 (talk) 19:46, 25 August 2019 (UTC)
 * Thanks, Anita5192 ^_^ --ԱշոտՏՆՂ (talk) 20:27, 25 August 2019 (UTC)
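For reference (my paraphrase of the standard sequential characterization, not a quote from the article), the non-tautological version of the claim has the sequence tend to infinity rather than to L: $$\lim_{x\to\infty} f(x) = L \iff \text{for every sequence } (x_n) \text{ with } x_n \to \infty,\ \lim_{n\to\infty} f(x_n) = L.$$ The wording quoted above made the sequence approach L itself, which is what made it read as a tautology.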

"Convergence and fixed point"
This section is quite messy. Obviously a copy-paste extracted from reference "[8]", but with insufficient understanding, as the points below show. Such copy-pasting by editors without the necessary background is better avoided. Also, this rather belongs in a more specialized article, such as rate of convergence (maybe the best match? move it there and replace it with a link?), or recurrent sequences, dynamical systems, iterative methods, or ... — MFH:Talk 00:06, 28 March 2022 (UTC)
 * The first paragraph claims to "formally define convergence", but uses the word "convergence" in the definition. What it actually defines is [a special case of] "convergence of order α", not general convergence — and convergence of order α does not necessarily imply the existence of that limit, but rather of such a lim sup.
 * Then it introduces a function f, but one has to guess that the author now considers the sequence p(n+1) = f(p(n)), which is written nowhere.
 * Then (s)he speaks of "linear convergence" without ever having defined it (actually, α = 1).
 * Then it says "the series converges", but means the sequence, not a [[series (mathematics)|series]].
 * Then, "if it is found that there is something better than linear..." (meaning convergence of order > 1), but it is totally obscure how one might find out whether there is something better.
 * Then it speaks of "quadratic convergence" without ever defining it (actually, α = 2).
 * Also, it forgets that sequences may have convergence of order strictly between 1 and 2; the most frequent example is the secant method, which has convergence of order α = (√5+1)/2 ≈ 1.618.

Essential? not.
Limits are *not* essential for differential or integral calculus! As far as I know, everyone is taught (in their calculus education) that "taking a limit" is one of two ways to approach the subject, the other being infinitesimals. The lead is wrong. I also note that while the article claims that limits are important in calculus, it proceeds to ignore those applications. Why? This needs to be fixed. 174.131.48.89 (talk) 04:41, 23 August 2022 (UTC)
 * Infinitesimals fall under nonstandard analysis and therefore are not mainstream. Also, there is no infinitesimal counterpart to the limit of a sequence.--Jasper Deng (talk) 22:58, 12 July 2024 (UTC)