Talk:Partial correlation

partial correlation is standardized regression coefficient...
Shouldn't it be mentioned that the partial correlation coefficient is equivalent to the coefficient of a regression Y = a + b_0*X + b*Z + e where Y, X and Z are standardized to mean zero and unit variance? —Preceding unsigned comment added by Aenus (talk • contribs) 13:51, 16 November 2010 (UTC)


 * I don't think this is true. Please elaborate. Skbkekas (talk) 14:40, 16 November 2010 (UTC)


 * I've checked this out by hand -- it's not true, but something similar and more symmetric is true: if you also regress X = c + d_0*Y + f*Z + u, then the square root of the product of the estimates of b_0 and d_0 equals the partial correlation coefficient (even if you don't standardize the variables). This makes sense to me, because I have previously seen the same result for the correlation coefficient in the absence of the other variable Z. Duoduoduo (talk) 16:49, 19 November 2010 (UTC)
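For later readers, the identity above is easy to check numerically. The sketch below (pure Python, made-up data; all variable names are illustrative) regresses Y on X and Z and X on Y and Z, then compares the square root of b_0·d_0 with the partial correlation computed from residuals:

```python
import random
from statistics import fmean

def center(v):
    m = fmean(v)
    return [a - m for a in v]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

def coef_on_x(y, x, z):
    # Coefficient on x in the OLS fit y = a + b0*x + b*z + e.
    # Centering absorbs the intercept; the 2x2 normal equations are
    # then solved with Cramer's rule.
    xc, zc, yc = center(x), center(z), center(y)
    sxx, szz, sxz = dot(xc, xc), dot(zc, zc), dot(xc, zc)
    sxy, szy = dot(xc, yc), dot(zc, yc)
    return (sxy * szz - szy * sxz) / (sxx * szz - sxz * sxz)

def resid_on_z(v, z):
    # Residuals of v after simple regression on a constant and z (centered).
    vc, zc = center(v), center(z)
    b = dot(vc, zc) / dot(zc, zc)
    return [a - b * c for a, c in zip(vc, zc)]

def corr(a, b):
    # Pearson correlation; inputs here are already centered residuals.
    return dot(a, b) / (dot(a, a) ** 0.5 * dot(b, b) ** 0.5)

random.seed(0)
n = 500
z = [random.gauss(0, 1) for _ in range(n)]
x = [0.5 * zi + random.gauss(0, 1) for zi in z]
y = [0.3 * xi - 0.7 * zi + random.gauss(0, 1) for xi, zi in zip(x, z)]

b0 = coef_on_x(y, x, z)  # slope on x in y ~ 1 + x + z
d0 = coef_on_x(x, y, z)  # slope on y in x ~ 1 + y + z
pc = corr(resid_on_z(x, z), resid_on_z(y, z))  # partial correlation of x, y given z

print((b0 * d0) ** 0.5, pc)
```

Up to floating-point error, the square root of b_0·d_0 matches the absolute value of the partial correlation, with the sign given by b_0, consistent with the comment above.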


 * Thank you for helping out - the problem was a misunderstanding on my part. User:Aenus 17:41, 29 November 2010 (UTC)

Relation to formula in Multiple correlation article
The article on multiple correlation gives the following formula for the R2 of a multiple regression:


 * $$R^2 = c' R_{xx}^{-1} c$$

where the elements of the vector c are the correlations between the right-side variables Xi and the left-side variable Y, and Rxx is the matrix whose ij element is the correlation between the ith and jth right-side variables. Now if we write Rxx−1 as equal to PP', we have R2 = the sum of the squared elements of the vector c'P.

Question: Is the ith element of c'P  equal to the partial correlation of Y with Xi conditional on the other X's?  If so, maybe this should go in the article. Duoduoduo (talk) 18:06, 18 November 2010 (UTC)
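For what it's worth, the quoted formula itself is easy to verify numerically. A minimal sketch (pure Python, two hypothetical predictors, made-up data) compares R2 computed as c'Rxx−1c against 1 − SSE/SST from the OLS fit:

```python
import random
from statistics import fmean

def center(v):
    m = fmean(v)
    return [a - m for a in v]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

def corr(a, b):
    ac, bc = center(a), center(b)
    return dot(ac, bc) / (dot(ac, ac) ** 0.5 * dot(bc, bc) ** 0.5)

random.seed(1)
n = 400
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [0.6 * a + random.gauss(0, 1) for a in x1]
y = [1.0 * a - 0.5 * b + random.gauss(0, 1) for a, b in zip(x1, x2)]

# R^2 from the quadratic form c' Rxx^{-1} c, with the 2x2 inverse written out.
c1, c2, r12 = corr(y, x1), corr(y, x2), corr(x1, x2)
r2_quadform = (c1 * c1 + c2 * c2 - 2 * c1 * c2 * r12) / (1 - r12 * r12)

# R^2 from the OLS fit of y on a constant, x1 and x2 (Cramer's rule on the
# centered normal equations), computed as 1 - SSE/SST.
x1c, x2c, yc = center(x1), center(x2), center(y)
s11, s22, s12 = dot(x1c, x1c), dot(x2c, x2c), dot(x1c, x2c)
s1y, s2y = dot(x1c, yc), dot(x2c, yc)
det = s11 * s22 - s12 * s12
b1 = (s1y * s22 - s2y * s12) / det
b2 = (s2y * s11 - s1y * s12) / det
e = [yi - b1 * a - b2 * b for yi, a, b in zip(yc, x1c, x2c)]
r2_ols = 1 - dot(e, e) / dot(yc, yc)

print(r2_quadform, r2_ols)  # the two agree
```

This only checks the R2 identity from the multiple correlation article; it does not settle the question about the elements of c'P.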

Partial distance correlation
Removed a sentence about partial distance correlation from the Partial correlation article. The partial distance correlation section has been entirely removed from distance correlation, and the term partial distance correlation is, as of now, undefined. Mathstat (talk) 19:55, 5 January 2011 (UTC)

Terminology (scalar product / dot product)
I found the odd syntax for the dot product slightly confusing: Let $$\langle\mathbf{v},\mathbf{w} \rangle$$ denote the scalar product between the vectors v and w. Can't we just use a dot? The angle brackets (in my field at least) are often associated with the average of a random variable/function. Just a thought - I might be wrong, and I'm just wondering what the consensus is. Thanks for the useful article though! Lionfish0 (talk) 09:39, 10 March 2011 (UTC)

Computation using regression
The MATLAB implementation includes the constant term (which I've just mentioned in the article). I'm not sure how to cite it, but it's in the code!

z1 = [ones(n,1) z]; resid = x - z1*(z1 \ x);

on lines 192 and 193 in my version of MATLAB's Statistics Toolbox, where z is the matrix of other variables, z1 is that matrix with the constant column added, and x is the variable being predicted by the regression; resid is the residual.

I've spent several hours today wondering why my code returned a different partial correlation! I feel other people might like this info on the wiki page. Lionfish0 (talk) 13:23, 10 March 2011 (UTC)
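To illustrate why the constant term matters, here is a small sketch (pure Python, made-up data with non-zero means; not the MATLAB source itself) that computes the residual-based partial correlation with and without the intercept column:

```python
import random
from statistics import fmean

def resid(x, z, intercept=True):
    # Residuals of x after simple regression on z, with or without a constant term.
    if intercept:
        xm, zm = fmean(x), fmean(z)
        xc = [a - xm for a in x]
        zc = [c - zm for c in z]
    else:
        xc, zc = list(x), list(z)
    b = sum(a * c for a, c in zip(xc, zc)) / sum(c * c for c in zc)
    return [a - b * c for a, c in zip(xc, zc)]

def corr(a, b):
    am, bm = fmean(a), fmean(b)
    ac = [v - am for v in a]
    bc = [v - bm for v in b]
    num = sum(p * q for p, q in zip(ac, bc))
    return num / (sum(p * p for p in ac) ** 0.5 * sum(q * q for q in bc) ** 0.5)

random.seed(2)
n = 300
z = [random.gauss(2, 1) for _ in range(n)]   # non-zero means make the difference visible
x = [0.8 * zi + random.gauss(1, 1) for zi in z]
y = [-0.4 * zi + random.gauss(3, 1) for zi in z]

with_const = corr(resid(x, z), resid(y, z))
without_const = corr(resid(x, z, intercept=False), resid(y, z, intercept=False))
print(with_const, without_const)  # generally differ when the variables are not centered
```

Here x and y are conditionally independent given z, so the intercept version is close to zero, while the no-intercept version is not; which explains the kind of discrepancy described above.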

interpretation
It would be useful to include a section discussing what one does with a partial correlation. Why is someone interested in this quantity? What conclusions can be drawn from it? How is the partial correlation of x and y controlling for z different from the regression of y on x and z, and when should one use each of the two approaches? Thank you. — Preceding unsigned comment added by 165.124.241.177 (talk) 19:39, 9 November 2011 (UTC)

help updating citation
I'm not sure if it's temporary, but the Springer link at See Also... to the Mathematics encyclopedia isn't working. I found the article at: http://www.encyclopediaofmath.org/index.php?title=Partial_correlation_coefficient&oldid=14288 ... Is this a problem with the template used to make the citations? If someone with more expertise than me could make the edit, that would be super cool; it's a bit outside my skill level so far. Thanks in advance 216.59.115.74 (talk) 21:30, 4 January 2012 (UTC)
 * Fixed. Encyclopedia of Mathematics reorganised their site a while back so all the links went stale. The template still works but the parameter has to be changed by hand. Thanks for pointing out the problem. Qwfp (talk) 17:04, 5 January 2012 (UTC)

Please add an example that most people could understand
Self-explanatory — Preceding unsigned comment added by Jbell sci (talk • contribs) 12:44, 5 December 2012 (UTC)
 * I can only second that - add an example. This is a quite technical article, but on a topic with wide interest. Postdeborinite (talk) 18:58, 17 June 2015 (UTC)

"Using linear regression": Z defined as scalar random variable but z_i used as vector
In the section "Using linear regression", Z is defined as a scalar random variable, but then z_i is used as a vector in scalar-product expressions. Am I missing something? — Preceding unsigned comment added by Craniator (talk • contribs) 13:33, 25 April 2015 (UTC)

It looks like (w_X)* is the vector of regression coefficients. If this is the case, it would be helpful to say so explicitly. — Preceding unsigned comment added by Craniator (talk • contribs) 15:11, 25 April 2015 (UTC)

Updating File:PartialCorrelationGeometrically.jpg
r_x and r_y should be replaced with e_x and e_y to conform with the text. כובש המלפפונים (talk) 12:11, 28 September 2017 (UTC)

Isn't the sum of errors always zero in a linear regression?
I see a huge formula that is valid for calculating the correlation coefficient in the general case (non-zero sums). Here the errors sum to zero, so we get a much simpler formula. — Preceding unsigned comment added by 158.110.166.131 (talk) 10:56, 13 December 2017 (UTC)
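A quick numerical check of this point (pure Python, made-up data): with an intercept in the regression, the residuals sum to (numerically) zero, and the general sample-correlation formula collapses to the simpler one:

```python
import random
from statistics import fmean

random.seed(3)
n = 200
z = [random.gauss(0, 1) for _ in range(n)]
x = [0.7 * zi + random.gauss(0, 1) for zi in z]
y = [-0.2 * zi + random.gauss(0, 1) for zi in z]

def resid(v, z):
    # OLS residuals of v on a constant and z; with an intercept they sum to zero.
    vm, zm = fmean(v), fmean(z)
    b = sum((a - vm) * (c - zm) for a, c in zip(v, z)) / sum((c - zm) ** 2 for c in z)
    a0 = vm - b * zm
    return [a - (a0 + b * c) for a, c in zip(v, z)]

ex, ey = resid(x, z), resid(y, z)
print(sum(ex), sum(ey))  # both numerically zero

# The general sample-correlation formula and the simplified (zero-sum) one agree.
sx2 = sum(a * a for a in ex)
sy2 = sum(a * a for a in ey)
sxy = sum(a * b for a, b in zip(ex, ey))
general = (n * sxy - sum(ex) * sum(ey)) / (
    (n * sx2 - sum(ex) ** 2) ** 0.5 * (n * sy2 - sum(ey) ** 2) ** 0.5)
simplified = sxy / (sx2 ** 0.5 * sy2 ** 0.5)
print(general, simplified)
```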

Why not taking into account the reduction of the number of degrees of freedom?
The sample partial correlation is then given by the usual formula for sample correlation, but between the residual scores of the covariates.


 * $$\hat{\rho}_{XY\cdot\mathbf{Z}}=\frac{N\sum_{i=1}^N e_{X,i}e_{Y,i}-\sum_{i=1}^N e_{X,i}\sum_{i=1}^N e_{Y,i}}{\sqrt{N\sum_{i=1}^N e_{X,i}^2-\left(\sum_{i=1}^N e_{X,i}\right)^2}~\sqrt{N\sum_{i=1}^N e_{Y,i}^2-\left(\sum_{i=1}^N e_{Y,i}\right)^2}}=\frac{N\sum_{i=1}^N e_{X,i}e_{Y,i}}{\sqrt{N\sum_{i=1}^N e_{X,i}^2}~\sqrt{N\sum_{i=1}^N e_{Y,i}^2}}.$$

However, no account is taken of the reduction of the number of degrees of freedom of the variables $$e_{X,i}$$ and $$e_{Y,i}$$. Is that not a problem?

Derive Pcorr from correlation of residuals and inverse of covariance matrix
The article says you can compute the partial correlation in these two ways, but there is no proof or reference that the correlation of residuals and the inverse-covariance-matrix formula are equivalent. Could someone help me prove this equivalence? — Preceding unsigned comment added by 170.223.207.26 (talk) 21:06, 22 January 2020 (UTC)
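One way to convince yourself numerically, short of a proof: the sketch below (pure Python, made-up data) computes the partial correlation both from the inverse of the 3x3 sample covariance matrix of (X, Y, Z) and from the correlation of residuals, and the two agree to floating-point precision:

```python
import random
from statistics import fmean

random.seed(4)
n = 1000
z = [random.gauss(0, 1) for _ in range(n)]
x = [0.6 * zi + random.gauss(0, 1) for zi in z]
y = [0.5 * xi - 0.3 * zi + random.gauss(0, 1) for xi, zi in zip(x, z)]

def cov(a, b):
    am, bm = fmean(a), fmean(b)
    return sum((p - am) * (q - bm) for p, q in zip(a, b)) / (len(a) - 1)

# Method 1: invert the 3x3 covariance matrix of (x, y, z) via cofactors;
# the partial correlation is -P01 / sqrt(P00 * P11), P being the precision matrix.
v = [x, y, z]
s = [[cov(v[i], v[j]) for j in range(3)] for i in range(3)]
det = (s[0][0] * (s[1][1] * s[2][2] - s[1][2] ** 2)
       - s[0][1] * (s[0][1] * s[2][2] - s[1][2] * s[0][2])
       + s[0][2] * (s[0][1] * s[1][2] - s[1][1] * s[0][2]))
p00 = (s[1][1] * s[2][2] - s[1][2] ** 2) / det
p11 = (s[0][0] * s[2][2] - s[0][2] ** 2) / det
p01 = -(s[0][1] * s[2][2] - s[0][2] * s[1][2]) / det
pc_precision = -p01 / (p00 * p11) ** 0.5

# Method 2: correlate the residuals of x and y after regressing each on z.
def resid(v, z):
    vm, zm = fmean(v), fmean(z)
    b = cov(v, z) / cov(z, z)
    return [a - vm - b * (c - zm) for a, c in zip(v, z)]

ex, ey = resid(x, z), resid(y, z)
num = sum(a * b for a, b in zip(ex, ey))
pc_resid = num / (sum(a * a for a in ex) ** 0.5 * sum(b * b for b in ey) ** 0.5)

print(pc_precision, pc_resid)  # the two methods agree
```

The agreement is exact (not just asymptotic) because both expressions are the same algebraic function of the sample second moments, which suggests the proof the comment asks for.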

Error in introduction
"The value –1 conveys a perfect negative correlation controlling for some variables (that is, an exact linear relationship in which higher values of one variable are associated with lower values of the other); the value 1 conveys a perfect positive linear relationship, and the value 0 conveys that there is no linear relationship. " This should only hold true in the case of normally distributed random variables. In all other cases, the perfect relationship does not necessarily follow.