Wikipedia:Reference desk/Archives/Mathematics/2006 November 27

= November 27 =

Which statistical measure to use to compare rankings?
Hi all,

I have a bunch of test results from two tests I've run on students. What would be the best statistical test to see how closely the two sets of rankings correlate? (Note that students can have the same scores; for instance, a number of students scored 100% on test 1, so they are all ranked first.) Thanks! --George


 * Have a look at Kolmogorov-Smirnov test. --Lambiam Talk  22:09, 27 November 2006 (UTC)


 * For ranked data I would suggest Spearman's rank correlation coefficient (SRCC). If you leave them unranked, use the Pearson product-moment correlation coefficient (PMCC). Hope that helps Eŋlishnerd  ( Suggestion? | wanna chat? ) 22:16, 27 November 2006 (UTC)
 * If you are specifically interested in how the tests affect the ranking, then convert the raw scores to rankings (including ties) and use something such as Spearman's - but bear in mind that the actual scores contain more information, some of which you lose by going down to just an ordinal measure. Thus the product-moment test is likely to be more powerful. 86.132.234.166 23:30, 28 November 2006 (UTC)
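The two suggested coefficients can be tried directly in scipy; a minimal sketch, with invented scores for illustration (note the tie at 100%, which Spearman's handles by averaging ranks):

```python
# Compare two sets of student scores with Spearman's rank correlation
# (SRCC, ties get averaged ranks) and Pearson's product-moment
# correlation (PMCC) on the raw scores. The data below is made up.
from scipy.stats import spearmanr, pearsonr

test1 = [100, 100, 100, 85, 70, 60, 55]  # three students tied at 100%
test2 = [ 98,  95,  99, 80, 75, 58, 50]

rho, rho_p = spearmanr(test1, test2)  # rank correlation, ties averaged
r,   r_p   = pearsonr(test1, test2)   # correlation of the raw scores

print(f"Spearman rho = {rho:.3f} (p = {rho_p:.3f})")
print(f"Pearson  r   = {r:.3f} (p = {r_p:.3f})")
```

Both values lie in [-1, 1]; a value near 1 means the two tests order the students almost identically.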

Complex Analysis and Residues
Could someone explain why (or give a proof that), in complex analysis, for a pole of order n:

$$ \operatorname{Res}(f,c) = \frac{1}{(n-1)!} \cdot \lim_{z \to c} \left(\frac{d}{dz}\right)^{n-1}\left( f(z)\cdot (z-c)^{n} \right). $$

(Found in the residue article.)

I haven't got a clue where this formula comes from and don't like using formulas without understanding them. --Xedi 17:55, 27 November 2006 (UTC)


 * That's the residue at a pole of order n. That is, $$ g(z) = f(z)\cdot (z-c)^{n}$$ is holomorphic. It follows directly from Cauchy's integral formula (with g(z) for f(z)); see that article and residue theorem. EdC 19:06, 27 November 2006 (UTC)
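The formula can be checked symbolically; a sketch with sympy, using the illustrative function f(z) = 1/(z²(z−1)) (not from the thread), which has a pole of order n = 2 at c = 0:

```python
# Sanity-check of the order-n residue formula: f(z) = 1/(z**2*(z-1))
# has a pole of order n = 2 at c = 0.
import sympy as sp

z = sp.symbols('z')
f = 1 / (z**2 * (z - 1))
c, n = 0, 2

# Res(f, c) = 1/(n-1)! * lim_{z->c} (d/dz)^{n-1} [ f(z)*(z-c)^n ]
g = sp.simplify(f * (z - c)**n)   # g is holomorphic at c
res_formula = sp.limit(sp.diff(g, z, n - 1), z, c) / sp.factorial(n - 1)

# Compare with sympy's built-in residue computation
res_builtin = sp.residue(f, z, c)

print(res_formula, res_builtin)  # both give -1
```

Here g(z) = 1/(z−1), its first derivative is −1/(z−1)², and the limit at 0 is −1, matching the Laurent-series coefficient of 1/z.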


 * Yes, pole of order n, of course - I forgot to say! Well, I had already looked at those articles but couldn't really find it there; still, I finally got it, thanks.
 * But if I may say so, I think there should be at least one complete example of the calculation of a residue where every step is explained, perhaps in the article methods of contour integration.
 * I'll try to do one, I think it'll help me and others, if I get stuck I'll ask further questions here.
 * Thanks --Xedi 19:39, 27 November 2006 (UTC)


 * I've done an example of a calculation; it went wrong at a few points, and I hope you can point out where (and what else I should add to such an example to make it more worthwhile - for example, a use of Cauchy's integral theorem) so I (or someone else) can add it at methods of contour integration as an introductory example.
 * Thanks --Xedi 21:01, 27 November 2006 (UTC)
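A contour-integration example can also be verified numerically: by the residue theorem, integrating around a small circle enclosing one pole gives 2πi times the residue there. A sketch, again using the illustrative function f(z) = 1/(z²(z−1)):

```python
# Numerical illustration of the residue theorem: integrating f around
# a circle of radius 0.5 about z = 0 (which encloses only the pole at 0)
# gives 2*pi*i times the residue at 0, which is -1.
import numpy as np

def contour_integral(f, center, radius, n=20000):
    """Approximate the integral of f over the circle |z - center| = radius."""
    t = np.linspace(0, 2 * np.pi, n, endpoint=False)
    z = center + radius * np.exp(1j * t)
    dz = 1j * radius * np.exp(1j * t)      # dz/dt along the circle
    return np.sum(f(z) * dz) * (2 * np.pi / n)

f = lambda z: 1 / (z**2 * (z - 1))
I = contour_integral(f, center=0, radius=0.5)
print(I / (2j * np.pi))   # ~ -1, the residue at z = 0
```

The uniform-grid rule converges very fast here because the integrand is smooth and periodic along the contour.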


 * I also noticed many articles talk about finding the residue as finding the coefficient of z^-1 in the Laurent series, but how can one obtain that coefficient without calculating the residue first?
 * How can one obtain the Laurent series before having the coefficients? Are there quicker ways to obtain Laurent series than calculating lots of contour integrals?
 * And is there a page giving different Laurent series (like the list at Taylor series)?
 * Thanks --Xedi 18:01, 27 November 2006 (UTC)
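In practice one rarely computes Laurent coefficients by contour integrals; expanding known Taylor series of the analytic part is usually faster, and computer algebra systems automate it. A sketch with sympy, again on the illustrative f(z) = 1/(z²(z−1)):

```python
# Obtain a Laurent expansion directly with series(); the coefficient
# of 1/z is then read off - it is the residue, no integral needed.
import sympy as sp

z = sp.symbols('z')
f = 1 / (z**2 * (z - 1))

expansion = sp.series(f, z, 0, 3)   # Laurent expansion about 0
print(expansion)                    # -1/z**2 - 1/z - 1 - z - z**2 + O(z**3)

coeff = expansion.removeO().coeff(z, -1)
print(coeff)                        # -1, the residue at 0
```

By hand, the same expansion follows from 1/(z−1) = −(1 + z + z² + ...) multiplied by 1/z².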


 * I am looking for new friends. Twma 03:41, 28 November 2006 (UTC)
 * Thanks very much, exactly what I was looking for. --Xedi 20:14, 28 November 2006 (UTC)

Characteristic equation of eigenvalues - do multiple roots have any significance?
If, say, when finding the eigenvalues of a matrix, you get a repeated root ((x-1)(x-1)(x+2)=0, for example), does this hold any significance? x42bn6 Talk 18:59, 27 November 2006 (UTC)


 * The algebraic multiplicity of an eigenvalue λ of a linear transformation is significant. As is described in the article about eigenvectors, the dimension of the eigenspace (the geometric multiplicity) is always less than or equal to the algebraic multiplicity of the eigenvalue λ. --payxystaxna 20:57, 27 November 2006 (UTC)


 * To follow on, consider one matrix with the characteristic equation suggested.
 * $$ A = \begin{bmatrix}-0.92&1.44&0\\1.44&-0.08&0\\0&0&1\end{bmatrix} $$
 * We see immediately that (0,0,1) is an eigenvector with eigenvalue 1. We also find a perpendicular eigenvector, (3,4,0), with eigenvalue 1. However, the full story is more elaborate. Any linear combination of these two vectors is also an eigenvector with eigenvalue 1. For example, (3,4,5) is one such. Thus we have a 2-dimensional subspace, an eigenspace, of eigenvectors with eigenvalue 1. Perpendicular to that, we have eigenvector (−4,3,0) with eigenvalue −2. (As always, any nonzero scalar multiple of this is equivalent, but the direction is essentially unique.)
 * Now consider a second matrix with the same characteristic equation.
 * $$ B = \begin{bmatrix}1&1&0\\0&1&0\\0&0&-2\end{bmatrix} $$
 * Despite the double root, this matrix does not have a 2-dimensional eigenspace. We have eigenvector (1,0,0) with eigenvalue 1, and we have eigenvector (0,0,1) with eigenvalue −2; but (0,1,0) is not an eigenvector, and there are no other possible directions. --KSmrqT 22:23, 27 November 2006 (UTC)
 * Ah, alright, just a curiosity.  x42bn6  Talk 19:34, 28 November 2006 (UTC)
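The two matrices above can be checked numerically; a sketch with numpy, computing the geometric multiplicity as 3 minus the rank of M − λI:

```python
# Both matrices have characteristic polynomial (x-1)^2(x+2), but A has
# a 2-dimensional eigenspace for eigenvalue 1 while B has only a
# 1-dimensional one - the double root alone does not decide this.
import numpy as np

A = np.array([[-0.92, 1.44, 0.0],
              [ 1.44,-0.08, 0.0],
              [ 0.0,  0.0,  1.0]])
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0,-2.0]])

def geometric_multiplicity(M, lam, tol=1e-9):
    """Dimension of the eigenspace of lam: 3 minus rank of (M - lam*I)."""
    return 3 - np.linalg.matrix_rank(M - lam * np.eye(3), tol=tol)

print(sorted(np.linalg.eigvals(A).real))  # approximately [-2, 1, 1]
print(geometric_multiplicity(A, 1.0))     # 2
print(geometric_multiplicity(B, 1.0))     # 1
```

The gap between algebraic multiplicity 2 and geometric multiplicity 1 for B is exactly what makes B non-diagonalizable.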