Talk:Predictor–corrector method

Proposed example
Example of a trapezoidal predictor–corrector method.

In this example, $$h = \Delta t$$ and $$ t_{i+1} = t_{i} + \Delta t = t_{i} + h. $$


 * $$ y' = f(t,y), \quad y(t_0) = y_0. $$

First, calculate an initial guess $$\tilde{y}_{g}$$ via the Euler method:


 * $$\tilde{y}_{g} = y_i + h f(t_i,y_i)$$

Next, improve the initial guess by iterating the trapezoidal rule:


 * $$\tilde{y}_{g+1} = y_i + \frac{h}{2}(f(t_i, y_i) + f(t_{i+1},\tilde{y}_{g})).$$


 * $$\tilde{y}_{g+2} = y_i + \frac{h}{2}(f(t_i, y_i) + f(t_{i+1},\tilde{y}_{g+1})).$$

...
 * $$\tilde{y}_{g+n} = y_i + \frac{h}{2}(f(t_i, y_i) + f(t_{i+1},\tilde{y}_{g+n-1})).$$

continuing either for some fixed number of iterations n or until successive guesses converge to within some error tolerance e:


 * $$ | \tilde{y}_{g+n} - \tilde{y}_{g+n-1} | \leq e. $$

Then use the final guess as the value for the next step:


 * $$y_{i+1} = \tilde{y}_{g+n}.$$
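The steps above can be sketched in Python. The function names, tolerances, and the test problem $$y' = -y$$ are illustrative assumptions, not part of the proposal:

```python
import math

def trapezoidal_pc(f, t0, y0, h, t_end, tol=1e-10, max_iter=50):
    """Trapezoidal predictor-corrector as described above:
    Euler predictor, then iterate the trapezoidal corrector
    until successive guesses agree to within tol."""
    t, y = t0, y0
    ts, ys = [t], [y]
    while t < t_end - 1e-12:
        # Predictor: Euler step gives the initial guess y_g
        y_g = y + h * f(t, y)
        # Corrector: iterate the trapezoidal rule
        for _ in range(max_iter):
            y_new = y + 0.5 * h * (f(t, y) + f(t + h, y_g))
            converged = abs(y_new - y_g) <= tol
            y_g = y_new
            if converged:
                break
        # Use the final guess as the next step
        t += h
        y = y_g
        ts.append(t)
        ys.append(y)
    return ts, ys

# Hypothetical test problem: y' = -y, y(0) = 1, exact solution e^{-t}
ts, ys = trapezoidal_pc(lambda t, y: -y, 0.0, 1.0, 0.1, 1.0)
print(ys[-1], math.exp(-1.0))
```

With h = 0.1 the final value agrees with the exact $$e^{-1}$$ to roughly the expected $$O(h^2)$$ accuracy of the trapezoidal rule.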

If I remember correctly, the iterative process converges quadratically. Note that the overall error is unrelated to the convergence of the corrector iteration; it depends instead on the step size and the core method, which in this example is a trapezoidal (linear) approximation of the actual function. The step size h ( $$\Delta{t} $$ ) needs to be relatively small in order to get a good approximation. Also see stiff equation.

Jeffareid (talk) 02:42, 25 July 2009 (UTC)


 * The relation to Picard iteration (https://de.wikipedia.org/wiki/Picard-Iteration) might be a worthwhile reference. I am dubious about the quadratic convergence claim, as it looks more like a type of gradient descent to me. 2001:638:904:FFC8:3433:CB4E:3261:66DB (talk) 23:32, 11 March 2023 (UTC)