Wikipedia:Reference desk/Archives/Mathematics/2023 November 15

= November 15 =

Deceptively simple function yields chaos?
While working on a project recently I was looking for a way to compare a value with its inverse in such a way that both had the same "signature". What I came up with was a fairly straightforward function, f(x) = abs(x - 1/x). It seemed to fit the bill, so I moved on. A few days later, out of curiosity, it occurred to me to take a look at any possible fixed points of the function. To my dismay, not only did it not seem to have any, but the output generated was a peculiar stream of "order and chaos". Discarding the "abs" bit of the calculation, the graph of the resulting function is obviously very close to that of f(x) = x, which (I suppose) could be said to have a "fixed point everywhere". But in this case the fixed points are rather elusive, to say the least. Any thoughts on what is going on with this unusual function? Earl of Arundel (talk) 14:08, 15 November 2023 (UTC)
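The motivating property can be checked directly: since |x - 1/x| = |1/x - x|, the function gives a value and its reciprocal the same "signature". A minimal sketch (the name f is just shorthand for the function in the question):

```python
def f(x):
    # The question's function: f(x) = abs(x - 1/x)
    return abs(x - 1 / x)

# A value and its reciprocal get the same "signature":
# f(4.0) == f(0.25) == 3.75
same = f(4.0) == f(0.25)
```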


 * There is one fixed point, at $$x=\tfrac 12\sqrt 2.$$ It cannot be found by fixed-point iteration; the point is non-attracting. For example, starting with $$x_0=0.7,$$ which is fairly close to $$\tfrac 12\sqrt 2=0.7071...,$$ we get $$x_1=f(x_0)=51/70=0.7285...$$ and $$x_2=f(x_1)=2299/3570=0.6439...$$: we are quickly drifting away instead of getting closer. Locally, the function is continuously differentiable and the absolute value of its derivative, which should be less than $$1$$ for convergence, is close to $$3.$$ --Lambiam 18:05, 15 November 2023 (UTC)
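The iterates above can be reproduced exactly with rational arithmetic; a quick sketch using Python's fractions module (not part of the original discussion):

```python
from fractions import Fraction

def f(x):
    # The function from the question: f(x) = |x - 1/x|
    return abs(x - 1 / x)

# Exact rational iteration starting from x0 = 0.7 = 7/10
x0 = Fraction(7, 10)
x1 = f(x0)   # 51/70   = 0.7285...
x2 = f(x1)   # 2299/3570 = 0.6439...
```

Each step moves farther from the fixed point at (1/2)√2 ≈ 0.7071, illustrating the drift described above.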


 * Wow, so a fixed point exists but is not approachable. I didn't even know that was possible! Interesting how the function oscillates between erratic and smooth behavior ad infinitum during these fixed-point iterations too.
[Image: X-minus-inverse.png]


 * Note how the graph looks rather like a fractal of some sort. Also, the more iterations, the denser those near-vertical spikes become (are those singularities?). Earl of Arundel (talk) 18:45, 15 November 2023 (UTC)
 * There's actually a really simple criterion for whether iterating the function near such a fixed point tends to converge or run away. It's probably in our article, but you might have fun trying to figure it out yourself.  I used it in a chalk-talk competition in high school (after figuring it out myself). Did I win?  Funny, can't quite remember.  Did pretty well anyway. --Trovatore (talk) 19:00, 15 November 2023 (UTC)
 * For a continuously differentiable function that has a fixed point, the criterion is that the absolute value of its derivative is less than $$1$$ in some neighbourhood of the fixed point. For other functions, the derivative may be replaced by the difference quotient over a finite interval contained in the neighbourhood. --Lambiam 23:32, 15 November 2023 (UTC)
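This criterion can be checked numerically for f(x) = |x - 1/x|: near the fixed point p = (1/2)√2 we have f(x) = 1/x - x, so f'(p) = -1/p² - 1 = -3, and a central-difference estimate confirms the magnitude is well above 1. A rough sketch (the step size h is illustrative):

```python
import math

def f(x):
    # f(x) = |x - 1/x|, as discussed above
    return abs(x - 1 / x)

p = math.sqrt(2) / 2   # the fixed point x = (1/2)*sqrt(2)
h = 1e-6               # step size for the central difference
slope = (f(p + h) - f(p - h)) / (2 * h)
# Near p, f(x) = 1/x - x, so f'(p) = -1/p**2 - 1 = -3; |f'(p)| = 3 > 1
```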
 * Ah, I see you attended my chalk talk. --Trovatore (talk) 23:33, 15 November 2023 (UTC)
 * I would say that if the absolute difference from one iteration to the next is NOT decreasing, then either there is no fixed point, or perhaps the initial value was improperly chosen? Well, it is probably more complicated than that! (I am not much of a mathematician myself.) Ah, I see now: the Banach fixed-point theorem. Earl of Arundel (talk) 19:26, 15 November 2023 (UTC)
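For contrast, a map that does satisfy the contraction hypothesis of the Banach fixed-point theorem converges readily under iteration. A hedged sketch using the Babylonian square-root step g(x) = (x + 2/x)/2, chosen here only as a standard example of a contraction near its fixed point √2:

```python
import math

def g(x):
    # Babylonian square-root step: a contraction near its fixed point sqrt(2),
    # since g'(x) = 1/2 - 1/x**2 vanishes there
    return (x + 2 / x) / 2

x = 1.0
for _ in range(10):
    x = g(x)
# x is now sqrt(2) to machine precision, unlike the drifting iterates
# of f(x) = |x - 1/x| discussed above
```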