Wikipedia:Reference desk/Archives/Mathematics/2021 October 2

= October 2 =

== segmenting a general curve to approximate it with cubics ==
To approximate an arbitrary parametric curve in 3-space with line segments, as one does, I use the vector rejection of the second derivative on the first to estimate how soon the difference between the line and the curve will exceed the desired resolution; this determines the spacing of the sample points.
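A minimal sketch of that line-segment heuristic (the function name and the deviation estimate, the rejection magnitude times $$h^2/2$$ from a second-order Taylor expansion, are my own illustration, not from the question):

```python
import numpy as np

def rejection_step(r1, r2, tol):
    """Parameter step h so a chord stays within tol of the curve.

    r1, r2: first and second derivatives of the curve at the current
    parameter value.  The vector rejection of r2 on r1 (the part of r2
    orthogonal to the tangent) bends the curve away from the chord; by
    a second-order Taylor expansion the deviation grows roughly as
    |rejection| * h**2 / 2, so solve that for h at deviation = tol.
    """
    r1, r2 = np.asarray(r1, float), np.asarray(r2, float)
    rej = r2 - (np.dot(r2, r1) / np.dot(r1, r1)) * r1  # vector rejection
    n = np.linalg.norm(rej)
    if n == 0.0:
        return float("inf")  # locally straight: curvature gives no bound
    return (2.0 * tol / n) ** 0.5

# unit-speed circle at t = 0: r' = (1, 0, 0), r'' = (0, 1, 0)
h = rejection_step([1, 0, 0], [0, 1, 0], 0.005)  # ≈ 0.1
```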

In various unrelated projects, I have the analogous problem of fitting cubic splines. What is the customary way to place the minimum number of knots for a desired accuracy? (A quick search finds answers for circles and ellipses, but I want a more general answer.) —Tamfang (talk) 19:28, 2 October 2021 (UTC)


 * The error (supremum of the absolute difference between a continuous curve and the approximating spline) is $$O(h^4),$$ where $$h$$ is the distance between successive points. If $$h$$ is small enough, the error becomes proportional to $$h^4,$$ so halving $$h$$ means reducing the error by a factor of 16. If you can determine the error $$e_h$$ for a given small value of $$h$$, and you want it to be below $$\epsilon,$$ you should choose a distance less than $$(\epsilon/e_h)^{\tfrac{1}{4}}h.$$ --Lambiam 21:35, 2 October 2021 (UTC)
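The $$O(h^4)$$ scaling rule above can be sketched in a couple of lines (the function name is mine):

```python
def refine_spacing(h, e_h, eps):
    """Given the measured error e_h at knot spacing h, predict a spacing
    with error below eps, using the error ~ C * h**4 scaling."""
    return (eps / e_h) ** 0.25 * h

# error scales as h**4: asking for 1/16 of the measured error
# should halve the spacing
h_new = refine_spacing(1.0, 0.16, 0.01)  # ≈ 0.5
```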


 * I could express the ideal curve as a <s>Fourier</s> Taylor series around the current point, and read off a bound on $$|\epsilon|$$; but it would be pessimistic, because what matters is how much of the error is orthogonal to the curve. (I say "a <s>Fourier</s> Taylor series" singular because I'm working in the complex plane.) —Tamfang (talk) 23:49, 2 October 2021 (UTC)


 * If you can express the target curve as a <s>Fourier</s> Taylor series, then, apparently, it is given in analytic form. Is it given in the form of a function $$f:\R\to\Complex$$? If so, does the following make sense? Let $$f(p)$$ be the parametric representation of the target and $$s(p)$$ of an approximating spline. Put $$e_p=f(p){-}s(p)$$. Express $$e_p\overline{e_p}$$, the square of its modulus, as a <s>Fourier</s> Taylor series, of which the first few terms should vanish. This should give a reasonable estimate of the square of the orthogonal error. --Lambiam 11:53, 3 October 2021 (UTC)
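A numerical way to see the $$e_p\overline{e_p}$$ suggestion: build an approximating cubic $$s(p)$$ for a complex-valued target $$f(p)$$ and evaluate the squared modulus of the error. The two-point Hermite cubic construction and the sample curve $$f(p)=e^{ip}$$ are my own choices for illustration; with a cubic the error is $$O(h^4)$$, so $$e_p\overline{e_p}$$ is $$O(h^8)$$:

```python
import numpy as np

def hermite_cubic(f, df, a, b):
    """Cubic matching f and f' at both endpoints (complex-valued)."""
    h = b - a
    fa, fb, da, db = f(a), f(b), df(a), df(b)
    # coefficients of s(a + u*h) as a polynomial in u on [0, 1]
    c0 = fa
    c1 = da * h
    c2 = 3 * (fb - fa) - (2 * da + db) * h
    c3 = -2 * (fb - fa) + (da + db) * h
    return lambda p: (lambda u: c0 + u * (c1 + u * (c2 + u * c3)))((p - a) / h)

f = lambda p: np.exp(1j * p)
df = lambda p: 1j * np.exp(1j * p)
s = hermite_cubic(f, df, 0.0, 0.2)

p = np.linspace(0.0, 0.2, 201)
e = f(p) - s(p)
err2 = np.max((e * e.conjugate()).real)  # max of the squared-modulus error
```

With $$h=0.2$$ the squared error comes out around $$10^{-11}$$, consistent with the eighth-power scaling.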


 * (I meant a Taylor series. I do that sometimes.) Yes, that makes sense. —Tamfang (talk) 14:30, 3 October 2021 (UTC)


 * I guess I'll use, for the step size, the fourth root of $$\frac{4!\,T}{|f^{iv}(t)|}$$ where T is the tolerance; which is more or less what you said. This does not promise a minimum number of knots, but I guess the overkill ratio won't exceed 2.  —Tamfang (talk) 01:29, 4 October 2021 (UTC)
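In code (the function name is mine; $$f(t)=e^{it}$$ is chosen so that $$|f^{iv}|=1$$ everywhere):

```python
import math

def step_from_f4(f4_abs, tol):
    """Step h with |f''''(t)| * h**4 / 4! = tol, i.e. the leading
    Taylor remainder term of a cubic hits the tolerance."""
    return (math.factorial(4) * tol / f4_abs) ** 0.25

# f(t) = exp(i*t) has |f''''(t)| = 1, so with T = 1/24 the step is 1
h = step_from_f4(1.0, 1 / 24)  # ≈ 1.0
```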


 * I assume $$t$$ denotes the value of parameter $$p$$ at the knot. The ratio will be very close to 2. But there is a very small chance that the fourth derivative purely accidentally almost vanishes there, so consider the "overkill" a safety feature; you can use an additional safety guard against such an eventuality by using $$\frac{120T}{\operatorname{max}(5|f^{iv}(t)|,|f^v(t)|)}.$$ --Lambiam 01:53, 4 October 2021 (UTC)


 * Good idea, since a double root in $$f^{iv}$$ is even less likely. Or rather, the smaller of $$\sqrt[4]{\frac{4!\,T}{|f^{iv}(t)|}}$$ and $$\sqrt[5]{\frac{5!\,T}{|f^{v}(t)|}}$$. —Tamfang (talk) 03:20, 4 October 2021 (UTC)
 * No, the error is still $$O(h^4),$$ also when the fourth derivative vanishes at the starting point, so there is no basis for taking a fifth root. Also, make sure you don't divide by zero. To guard against both derivatives almost vanishing – extremely unlikely unless the curve is a straight line, but still not impossible – include 1.5 times the previous step size in the slate of candidates (by using $$h_1^4=T/\operatorname{max}(|\tfrac{1}{4!}f^{iv}(t)|,|\tfrac{1}{5!}f^v(t)|,0.2h_0^{{-}4}T),$$ where $$h_0$$ is the previous step size and $$h_1$$ is the new step size). The only division-by-zero risk then remaining is in computing the first step: don't set $$h_0=0.$$ --Lambiam 08:09, 4 October 2021 (UTC)
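A sketch of this guarded rule (names are mine; the $$0.2h_0^{-4}T$$ term caps growth at $$5^{1/4}\approx 1.495$$ times the previous step and keeps the denominator positive whenever $$h_0>0$$):

```python
import math

def next_step(f4_abs, f5_abs, tol, h_prev):
    """h1**4 = T / max(|f''''|/4!, |f'''''|/5!, 0.2 * T / h0**4).
    The last candidate caps growth at 5**0.25 ≈ 1.495 times h_prev
    and avoids division by zero as long as h_prev > 0."""
    denom = max(f4_abs / math.factorial(4),
                f5_abs / math.factorial(5),
                0.2 * tol / h_prev ** 4)
    return (tol / denom) ** 0.25

# both derivatives vanish: the cap takes over, h1 ≈ 1.495 * h_prev
h_capped = next_step(0.0, 0.0, 1e-4, 0.5)
# generic case: reduces to the fourth-root-of-4!T/|f''''| formula
h_plain = next_step(1.0, 0.0, 1 / 24, 10.0)  # ≈ 1.0
```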


 * I have no idea how to do what you are asking. But I might start by reading about Gaussian quadrature and seeing if anything there looks helpful.  2601:648:8202:350:0:0:0:1598 (talk) 06:01, 3 October 2021 (UTC)