Talk:Tikhonov regularization

Generalized Tikhonov - correction
It should be $$\alpha^2 Q$$ instead of just $$Q$$.

General Form
Why not mention the general form, which incorporates an arbitrary smoothing operator (e.g., a differentiation operator) instead of just the identity matrix as a special case? Ref.: Hansen, P. C.: Rank-Deficient and Discrete Ill-posed Problems

No connection to Bias-Variance
I am not convinced that there is a link to the Bias-variance dilemma, and the referenced sentence does not support one. Bias-variance is about model selection; regularisation does not address model selection but well-posedness. --mcyp (talk) 02:00, 29 June 2021 (UTC)

Weird reformulation of definition
So the ridge estimator is defined as the ordinary least squares one, with the difference that one adds a multiple of the identity matrix to the moment matrix; there is one additional parameter, namely lambda. Now, in the "reformulation", where lambda is understood as a Lagrange multiplier, there is an additional parameter c which is nowhere chosen. This does not make sense: the minimum over beta is a function of both lambda and c. I think what is meant is just the norm of beta, that is, c = 0. --Limes12 (talk) 15:08, 8 August 2020 (UTC)
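For what it's worth, the penalized reading of the definition (the c = 0 interpretation above) is easy to check numerically: the closed-form estimator with lambda times the identity added to the moment matrix coincides with the minimizer of the residual norm plus lambda times the squared norm of beta. A minimal sketch (the data and variable names here are illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = rng.normal(size=50)
lam = 2.0

# Closed-form ridge estimator: add lambda * I to the moment matrix X^T X
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

# The same estimator as the minimizer of ||y - X b||^2 + lam * ||b||^2,
# written as an ordinary least-squares problem on augmented data
X_aug = np.vstack([X, np.sqrt(lam) * np.eye(3)])
y_aug = np.concatenate([y, np.zeros(3)])
beta_pen, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)

assert np.allclose(beta_ridge, beta_pen)
```

No constraint level c appears anywhere in this penalized form; c only enters if one instead minimizes the residual subject to ||b||^2 <= c, where lambda becomes the multiplier of that constraint.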

Proposed merge of Ridge regression into Tikhonov regularization
Looks like an IP added some extra content that could be integrated here, though merging will probably require expertise. ~ Ase1este charge-paritytime 10:16, 10 March 2021 (UTC)

The content here doesn't seem to add much beyond what is already on the Tikhonov page. Everything in the "Mathematical details" section is covered in the Tikhonov Regularisation section (after History). The only thing that strikes me as information that should be merged into the Tikhonov page is the variance of the estimator, and I think that is something that should be expanded on and linked to the bias/variance tradeoff more generally. Alan O'Callaghan (talk) 11:44, 3 May 2021 (UTC)


 * Support - I think this is a good idea, although I think it will be hard to write the lead with respect to prominence in reliable sources. The article currently has a useful footnote: "In statistics, the method is known as ridge regression, in machine learning it is known as weight decay, and with multiple independent discoveries, it is also variously known as the Tikhonov–Miller method, the Phillips–Twomey method, the constrained linear inversion method, L2 regularization, and the method of linear regularization". In fact, even this footnote buries the lead: this method is much better known as ridge regression and L2 regularization. Despite being a stub, the Ridge Regression article gets more pageviews than this one! This needs attention from someone with knowledge of the relevant science history, since reliable sources will typically talk about "L2 regularization" or "Ridge estimators/regression", and so whose name, if anyone's, should be reflected in the title (Tikhonov vs. Miller, etc.) is an open question to me. Suriname0 (talk) 18:31, 30 October 2021 (UTC)
 * I think there is an argument to merge the articles and rename the resulting article as Ridge regression. At least for me, ridge regression is the name I recognize. ChromeGames923 (talk · contribs) 03:24, 8 November 2021 (UTC)


 * Support Agnerf (talk) 09:00, 12 December 2021 (UTC)