User:Jlerner/Sandbox/IVP

Initial Value Problems
A boundary value problem specifies a solution of interest by imposing conditions at more than one point. For instance, the Blasius problem is the differential equation $$y''' = -y\,y''/2$$ with boundary conditions $$y(0) = 0,\ y'(0) = 0,\ y'(\infty) = 1$$.
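One standard way to attack such a boundary value problem with initial value methods is shooting: guess the missing initial condition $$y''(0) = s$$, integrate the resulting initial value problem, and adjust $$s$$ until the far boundary condition is met. The sketch below, a minimal illustration rather than a production solver, uses a hand-written fourth-order Runge–Kutta integrator and bisection; the truncation length `T = 10` standing in for infinity and the bisection bracket are assumptions chosen for this example.

```python
def rhs(y):
    # Blasius equation y''' = -y*y''/2 written as a first-order system
    f, fp, fpp = y
    return [fp, fpp, -f * fpp / 2.0]

def final_slope(s, T=10.0, h=0.01):
    """Integrate with classical RK4 from 0 to T (T approximates infinity)
    starting from y(0)=0, y'(0)=0, y''(0)=s; return y'(T)."""
    y = [0.0, 0.0, s]
    for _ in range(int(T / h)):
        k1 = rhs(y)
        k2 = rhs([y[i] + 0.5 * h * k1[i] for i in range(3)])
        k3 = rhs([y[i] + 0.5 * h * k2[i] for i in range(3)])
        k4 = rhs([y[i] + h * k3[i] for i in range(3)])
        y = [y[i] + h / 6.0 * (k1[i] + 2*k2[i] + 2*k3[i] + k4[i])
             for i in range(3)]
    return y[1]

# Bisection on s = y''(0) so that y'(T) = 1; the bracket [0.1, 1.0]
# is an assumed starting guess that contains the root.
lo, hi = 0.1, 1.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if final_slope(mid) < 1.0:
        lo = mid
    else:
        hi = mid
s_star = 0.5 * (lo + hi)  # converges to about 0.332
```

The slope `final_slope(s)` increases monotonically with `s`, which is what makes simple bisection adequate here; a Newton iteration on `s` would converge faster.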


The second term on the right is the local error, a measure of how well the method imitates the behavior of the differential equations. Reducing the step size reduces the local error, and the higher the order of the method, the faster it is reduced. The first term measures how much two solutions of the differential equations can differ at $$t_n + h_n$$ given that they differ at $$t_n$$ by $$y(t_n)-u(t_n)=y(t_n)-y_n$$, i.e., it measures the stability of the problem. The argument is repeated at the next step for a different local solution. In this perspective, the numerical method tries at each step to track closely a local solution of the differential equations, but the code moves from one local solution to another, and the cumulative effect of these jumps depends on the stability of the equations near $$y(t)$$. With standard assumptions about the initial value problem, the cumulative effect grows no more than linearly for a one-step method. Suppose that a constant step size $$h$$ is used with a method having a local error that is $$O(h^{p+1})$$. To go from $$t_0$$ to $$t_f$$ requires $$O(1/h)$$ steps, hence the worst error on the interval is $$O(h^p)$$. For this reason a method with local error $$O(h^{p+1})$$ is said to be of order $$p$$.
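This relation between local and global error can be checked numerically. The sketch below, a minimal illustration under assumed test problem $$y' = -y$$, $$y(0) = 1$$, applies Euler's method (local error $$O(h^2)$$, so order $$p = 1$$) with two step sizes and estimates the observed order from the ratio of global errors at $$t = 1$$.

```python
import math

def euler(f, y0, t0, tf, h):
    """Euler's method with constant step size h; local error O(h^2)."""
    t, y = t0, y0
    while t < tf - 1e-12:
        y += h * f(t, y)
        t += h
    return y

f = lambda t, y: -y          # test problem y' = -y, exact solution e^{-t}
exact = math.exp(-1.0)

# Global error at t = 1 for step sizes h and h/2
errs = [abs(euler(f, 1.0, 0.0, 1.0, h) - exact) for h in (0.01, 0.005)]

# Halving h should roughly halve the error for an order-1 method,
# so the observed order log2(err(h)/err(h/2)) should be close to 1.
observed_order = math.log(errs[0] / errs[1], 2)
```

Repeating the experiment with a higher-order method, such as the RK4 formula, would show the error ratio approaching $$2^4$$ instead of $$2$$.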

In this view of the error, the stability of the initial value problem is paramount. A view that is somewhat better suited to methods with memory