User talk:Bgorven

Statistical understanding of AdaBoost
Hello Bgorven,

On March 12, 2014, you made a substantial change to the AdaBoost article which, among other things, added the following:

 * Specifically, in the case where all weak learners are known a priori, AdaBoost corresponds to a single iteration of the backfitting algorithm in which the smoothing splines are the minimizers of $$\sum_{i=1}^n e^{-Y_i \hat\mu(x_i)} + \infty \int_{x_1}^{x_n} \hat\mu''(x)^2 \,dx$$, that is: $$\hat\mu_i$$ fits an exponential cost function and is linear with respect to the observation.

Is the infinity in the formula a typo? Also, do you have a source for this? I tried to find a source connecting AdaBoost to backfitting, but could not find one.

I am going to add a {{fact}} tag for now.

— D. Trebbien (talk) 19:32, 8 May 2016 (UTC)