Talk:Biconjugate gradient stabilized method

Step 2
Step 2 is confusing; there seem to be two different r0? —Preceding unsigned comment added by 130.37.28.128 (talk • contribs) 09:38, April 13, 2010 (UTC)


 * Could be a problem with your browser. The second r-zero is in fact "r hat"-zero. Kxx (talk | contribs) 14:07, 13 April 2010 (UTC)


 * I was confused too, as the spanish page says:
 * Choose an arbitrary vector $$\boldsymbol{\hat{r}}_0$$ such that $$(\boldsymbol{\hat{r}}_0,\boldsymbol{r}_0) \neq 0$$, e.g., $$\boldsymbol{\hat{r}}_0=\boldsymbol{r}_0$$
 * while the english one says:
 * Choose an arbitrary vector $$\hat{r}_0$$ such that $$(\hat{r}_0, r_0) \neq 0$$, e.g., $$\hat{r}_0 = r_0$$
 * Which one is right? Should the english page be changed? Nicolas Bigaouette (talk) 21:40, 26 April 2010 (UTC)


 * This is how your reply looks on my machine: http://img153.imageshack.us/img153/1053/61070581.png. I don't see any problems. Kxx (talk | contribs) 08:14, 27 April 2010 (UTC)

Fails if Residual is Eigenvector

 * I'm wondering if anyone else has noticed the problem that if $$r_0$$ is any eigenvector of $$A$$ and $$\hat{r}_0=r_0$$, step 5.8 will evaluate as $$\omega_i = 0/0$$, thus wreaking havoc. Mind you, if this case does crop up, there's an exact solution of $$x = x_0 + r_0/\lambda$$, where $$\lambda = (r_0^T A r_0)/(r_0^T r_0)$$.  Another instant failure case is if $$r_0^T A r_0 = 0$$ and $$\hat{r}_0=r_0$$, in which case step 5.5 divides by zero, though it might be that this only happens when either the above is true or the initial guess is orthogonal to the solution.  Are there any other instant failure cases to worry about? --Ndickson (talk) 05:36, 22 February 2012 (UTC)
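The breakdown and the exact-solution claim above are easy to verify numerically; here is a minimal numpy sketch (the matrix and vectors are made up for illustration):

```python
import numpy as np

# Construct a symmetric A and take one of its eigenvectors as the residual.
rng = np.random.default_rng(0)
G = rng.standard_normal((6, 6))
A = G + G.T                          # symmetric, so eigenvectors are real
w, V = np.linalg.eigh(A)
r0 = V[:, -1]                        # r0 is an eigenvector of A

# Choose x0 and b so that the initial residual b - A x0 equals r0.
x0 = rng.standard_normal(6)
b = A @ x0 + r0

# The Rayleigh quotient recovers the eigenvalue lambda.
lam = (r0 @ A @ r0) / (r0 @ r0)

# The claimed exact solution: x = x0 + r0 / lambda solves A x = b.
x = x0 + r0 / lam
print(np.allclose(A @ x, b))         # True

# First BiCGSTAB iteration with r_hat0 = r0: alpha = 1/lambda, so s = 0
# and step 5.8's omega = (t, s)/(t, t) becomes 0/0.
alpha = (r0 @ r0) / (r0 @ (A @ r0))
s = r0 - alpha * (A @ r0)
print(np.allclose(s, 0))             # True
```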


 * Even if it does not fail instantly, it can still fail after some iterations because there is no guarantee that the method will converge. So people generally do not care if the failure is instant or not. When it fails, you can choose another initial guess or move to another method such as GMRES. Kxx (talk | contribs) 07:18, 28 July 2012 (UTC)
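That fall-back strategy is straightforward to script; a sketch using SciPy's iterative solvers (the test system here is made up, and `info` is SciPy's convergence flag, zero on success):

```python
import numpy as np
from scipy.sparse.linalg import bicgstab, gmres

# Made-up, strongly diagonally dominant test system for illustration.
rng = np.random.default_rng(1)
n = 50
A = rng.standard_normal((n, n)) + n * np.eye(n)
b = rng.standard_normal(n)

x, info = bicgstab(A, b)        # info == 0 means converged
if info != 0:
    # BiCGSTAB broke down or stalled: fall back to GMRES.
    x, info = gmres(A, b)

print(info, np.linalg.norm(b - A @ x) <= 1e-3 * np.linalg.norm(b))
```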

Is the preconditioned algorithm correct?
I'm rather confused by how the preconditioners are defined. From the modified system $$\tilde{A}\tilde{x}=K_1^{-1}AK_2^{-1}K_2 x = K_1^{-1}b$$, it looks like $$K_1$$ and $$K_2$$ are intended to be left and right preconditioners, respectively. However, as I understand it, this means we should have either $$K_1 \approx A$$ with $$K_2$$ the identity matrix, or $$K_2 \approx A$$ with $$K_1$$ the identity matrix, depending on whether a left or right preconditioner is to be used. More generally, it should only be required that $$K_1^{-1}AK_2^{-1}$$ is well conditioned, but I can't figure out where the relation $$K = K_1 K_2 \approx A$$ comes from.

With that in mind, by naively solving the modified system $$\tilde{A}\tilde{x}=K_l^{-1}AK_r^{-1}K_r x = K_l^{-1}b$$ using the unpreconditioned algorithm, I find that the following steps should be modified:

1. $$r_0 = K_l^{-1}(b - Ax_0)$$

4. $$y = K_r^{-1}p_i$$

5. $$\nu_i = K_l^{-1}Ay$$

8. $$z = K_r^{-1}s$$

9. $$t = K_l^{-1}Az$$

10. $$\omega_i = (t,s)/(t,t)$$

where $$K_l$$ and $$K_r$$ are the left and right preconditioners. Either one can be used alone by setting the other to $$I$$, or both can be used together if they make $$K_l^{-1}AK_r^{-1}$$ better conditioned for iterative solving.
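To make the bookkeeping concrete, here is a numpy sketch of BiCGSTAB with those modified steps (the function and argument names `solve_Kl`/`solve_Kr` are mine; they apply $$K_l^{-1}$$ and $$K_r^{-1}$$). Since the transformed iterate is $$\tilde{x} = K_r x$$, updating with $$x \leftarrow x + \alpha y + \omega z$$ carries the untransformed $$x$$ directly and no final back-solve is needed:

```python
import numpy as np

def bicgstab_split(A, b, solve_Kl, solve_Kr, x0=None, tol=1e-10, maxiter=500):
    """BiCGSTAB with split preconditioning per the modified steps above.
    solve_Kl / solve_Kr apply K_l^{-1} and K_r^{-1}; pass `lambda v: v`
    for whichever side should be the identity."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    r = solve_Kl(b - A @ x)            # step 1: r0 = Kl^{-1}(b - A x0)
    r_hat = r.copy()                   # shadow residual, r_hat0 = r0
    rho_old, alpha, omega = 1.0, 1.0, 1.0
    v = np.zeros(n)
    p = np.zeros(n)
    for _ in range(maxiter):
        rho = r_hat @ r
        beta = (rho / rho_old) * (alpha / omega)
        p = r + beta * (p - omega * v)
        y = solve_Kr(p)                # step 4: y = Kr^{-1} p_i
        v = solve_Kl(A @ y)            # step 5: v_i = Kl^{-1} A y
        alpha = rho / (r_hat @ v)
        s = r - alpha * v
        z = solve_Kr(s)                # step 8: z = Kr^{-1} s
        t = solve_Kl(A @ z)            # step 9: t = Kl^{-1} A z
        omega = (t @ s) / (t @ t)      # step 10: omega = (t, s)/(t, t)
        x = x + alpha * y + omega * z  # update the untransformed iterate
        r = s - omega * t              # left-preconditioned residual
        if np.linalg.norm(b - A @ x) <= tol * np.linalg.norm(b):
            break
        rho_old = rho
    return x

# Example with made-up data: left Jacobi preconditioning, right = identity.
rng = np.random.default_rng(2)
n = 30
G = rng.standard_normal((n, n))
A = G @ G.T + n * np.eye(n)            # SPD test matrix
b = rng.standard_normal(n)
x = bicgstab_split(A, b, solve_Kl=lambda v: v / np.diag(A),
                   solve_Kr=lambda v: v)
print(np.linalg.norm(b - A @ x) <= 1e-8 * np.linalg.norm(b))
```

Setting `solve_Kr` to a nontrivial solve and `solve_Kl` to the identity gives pure right preconditioning instead; the convergence test uses the true residual $$b - Ax$$ rather than the preconditioned one.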

Is that correct, or am I missing something? — Preceding unsigned comment added by Fromain (talk • contribs) 21:32, 9 June 2015 (UTC)