Wikipedia:Reference desk/Archives/Mathematics/2010 July 30

= July 30 =

Bayesian inference – probability convergence to zero or 1.
The article Bayesian inference includes the statement "As evidence accumulates, the degree of belief in a hypothesis ought to change. With enough evidence, it should become very high or very low." Is there any mathematical justification for this statement?--115.178.29.142 (talk) 02:44, 30 July 2010 (UTC)
 * Yes. Let A be the hypothesis and X be a piece of evidence, which for simplicity we will assume is binary. If A is (unknown to us) true, then the expectation of the posterior probability of A after observing whether X is true or not is
 * $$P(X|A)P(A|X)+P(\neg X|A)P(A|\neg X)=P(A)\left[\frac{P(X|A)^2}{P(X)}+\frac{(1-P(X|A))^2}{1-P(X)}\right].$$
 * The quantity in brackets is always at least 1, with strict inequality if X and A are dependent (this follows from the Cauchy–Schwarz inequality, with equality exactly when P(X|A) = P(X)). So if A is true then any evidence is expected to increase our credence in A. By a similar calculation, if A is false then any evidence is expected to decrease our credence. Thus, after lots of evidence is collected, the credence will be very close to 1 or 0, depending on whether A is true or not. -- Meni Rosenfeld (talk) 06:36, 30 July 2010 (UTC)
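Meni's updating rule is easy to watch in a small simulation. The sketch below is my own illustration, not part of the original reply: the hypothesis A is "the coin lands heads with probability 0.7" versus "the coin is fair", A is actually true, and the credence is updated by Bayes' rule after each flip.

```python
# Simulate Bayesian updating on a binary hypothesis A ("coin lands heads
# with probability 0.7") vs. not-A ("coin is fair"), with A actually true.
# The setup and all names are made up for the demo.
import random

random.seed(0)
p_heads_if_A, p_heads_if_not_A = 0.7, 0.5
credence = 0.5  # prior P(A)

for _ in range(1000):
    heads = random.random() < p_heads_if_A  # A is true, so flips follow 0.7
    # Likelihood of this observation under each hypothesis
    like_A = p_heads_if_A if heads else 1 - p_heads_if_A
    like_not_A = p_heads_if_not_A if heads else 1 - p_heads_if_not_A
    # Bayes' rule: P(A | X) = P(X | A) P(A) / P(X)
    credence = like_A * credence / (like_A * credence + like_not_A * (1 - credence))

print(credence)  # after 1000 flips the credence is essentially 1
```

With the roles reversed (A false), the same loop drives the credence toward 0, matching the claim in the article.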

integrate sin(x)^2*cos(x)^2
What's the easiest way to evaluate $$\int \sin^2 x\cos^2 x\,dx$$? I've tried two methods: making the substitution $$u=\sin x$$, and converting $$\sin^2 x$$ to $$\frac{1-\cos 2x}{2}$$ and making a similar conversion for the cosine.--115.178.29.142 (talk) 03:29, 30 July 2010 (UTC)

I've also tried $$\sin^2x\cos^2x=(\sin x\cos x)^2=(\frac{\sin2x}{2})^2$$


 * Well you've managed to go from a fourth power to a square. Just do that sort of thing again. Dmcq (talk) 03:54, 30 July 2010 (UTC)


 * p.s. the easiest method is to use the Wolfram Mathematica online integrator, but I'm guessing you don't mean easy in that sense :) Dmcq (talk) 03:57, 30 July 2010 (UTC)


 * I think your second substitution is the quickest. There may be an even quicker method, but we first have to find it, which would require some time and effort, and would make it not competitive! ;-) --pm a  07:55, 30 July 2010 (UTC)
 * I was thinking the second and then the first would be good. Dmcq (talk) 09:06, 30 July 2010 (UTC)
 * You can use the reduction formula $$I_n = -\frac{1}{n}\sin^{n-1}x\cos x+\frac{n-1}{n}I_{n-2}$$, where $$I_n = \int \sin^n x\,dx$$. Use the identity $$\cos^2 x = 1-\sin^2 x$$ to rewrite the integrand as a sum of powers of $$\sin x$$. Readro (talk) 09:30, 30 July 2010 (UTC)
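For what it's worth, pushing the double-angle route through gives $$\sin^2x\cos^2x=\frac{\sin^2 2x}{4}=\frac{1-\cos 4x}{8}$$, so an antiderivative is $$\frac{x}{8}-\frac{\sin 4x}{32}$$ (my own working, not quoted from the thread). A quick numerical sanity check of that closed form:

```python
# Check numerically that F(x) = x/8 - sin(4x)/32 is an antiderivative of
# sin(x)^2 * cos(x)^2, by comparing a central-difference derivative of F
# against the integrand at many sample points.
import math

def f(x):
    return math.sin(x) ** 2 * math.cos(x) ** 2

def F(x):
    return x / 8 - math.sin(4 * x) / 32

h = 1e-6
max_err = max(
    abs((F(x + h) - F(x - h)) / (2 * h) - f(x))
    for x in [0.1 * k for k in range(1, 50)]
)
print(max_err < 1e-8)
```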

Tits group
Why is the name “Tits group” censored at ? --84.61.131.18 (talk) 18:39, 30 July 2010 (UTC)


 * You ask why, but they give an explanation, don't they? (However, my belief is that the remedy is even worse than the problem, and turns out to be another low-quality joke. Anyway, the wiki article Tits group has a history of vandalism as well, indeed.)--pm a 19:01, 30 July 2010 (UTC)
 * If it helps, the name is pronounced "Teets".--RDBury (talk) 19:15, 30 July 2010 (UTC)


 * It's probably related to the Scunthorpe problem. -- 174.24.200.206 (talk) 16:56, 31 July 2010 (UTC)

Is it allowed to broadcast the word “tits” on ATSC or FM in the United States? --84.62.215.188 (talk) 19:04, 4 August 2010 (UTC)

Determining an eigenvalue
I am currently working on the following question and can get no further. Can someone please help me out?

"Let Q be a (2n+1) x (2n+1) orthogonal matrix with det Q = 1. Show that Q has a unit eigenvalue."

As Q is orthogonal, $$QQ^{T}=I$$, so $$Q^{-1}=Q^{T}$$. Since a matrix and its transpose have the same determinant, $$|Q - tI| = |Q^{T} - tI|$$; but also $$|Q^{-1} - tI| = |Q^{T} - tI|$$, hence $$|Q - tI| = |Q^{-1} - tI|$$. We also know that if p is a non-zero eigenvalue of A then $$p^{-1}$$ is an eigenvalue of $$A^{-1}$$. So we have t an eigenvalue of Q and $$t^{-1}$$ an eigenvalue of $$Q^{-1}$$, but as these two matrices have the same characteristic equation, we must have t equal to its own inverse, i.e. plus or minus one.

I am not convinced by my above argument. I haven't used the fact that Q is (2n+1) x (2n+1), though I have used that its determinant is one and that it's orthogonal. But surely by my argument, an orthogonal matrix cannot have an eigenvalue that isn't plus or minus one? Thanks asyndeton   talk  20:47, 30 July 2010 (UTC)


 * I like the argument until the last step. You deduce that $$Q$$ and $$Q^{-1}$$ have the same characteristic equation; therefore, they have the same set of eigenvalues. Then you say that an eigenvalue $$p$$ of $$Q$$ implies an eigenvalue $$p^{-1}$$ of $$Q^{-1}$$, so the set must contain both of these. Now if we had even-dimensional matrices, I could satisfy this by just pairing up the eigenvalues (for example, a 2 × 2 matrix with eigenvalues 2 and 0.5). However, this doesn't work for odd dimensions: you always have one eigenvalue left over, which must then equal its own inverse. After that, I think your logic works fine. Martlet1215 (talk) 22:20, 30 July 2010 (UTC)


 * Ah, very nice. And very subtle. Thanks, that's made my night! asyndeton   talk  22:28, 30 July 2010 (UTC)


 * You may also state a property of the characteristic polynomial of an orthogonal matrix Q in dimension m. Since $$Q-zI=-Qz(Q-z^{-1}I)^{T}$$, taking determinants (and using det Q = 1) gives
 * $$p_Q(z):=\det(zI-Q)=(-z)^m\,p_Q(z^{-1}).$$
 * This certainly implies that 1 is a root of $$p_Q$$ if m is odd. The property $$P(z)=(-z)^m P(1/z)$$ translates into the fact that P is "palindromic" if m is even, and "antipalindromic" if m is odd (I can't remember the proper term now). --pm a 06:24, 31 July 2010 (UTC)
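The polynomial identity above can be sanity-checked on a concrete rotation. The sketch below (my own illustration; the matrix and angle are made up) builds a 3 × 3 rotation about the z-axis, so m = 3 is odd, and verifies both that 1 is a root of the characteristic polynomial and that the identity holds at a generic point.

```python
# Check p_Q(z) = det(zI - Q) = (-z)^m p_Q(1/z) for a 3x3 rotation (m = 3).
import math

t = 0.9  # an arbitrary rotation angle about the z-axis
c, s = math.cos(t), math.sin(t)
Q = [[c, -s, 0.0],
     [s,  c, 0.0],
     [0.0, 0.0, 1.0]]

def det3(M):
    # Cofactor expansion along the first row of a 3x3 matrix
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

def p_Q(z):  # characteristic polynomial det(zI - Q)
    return det3([[z * (i == j) - Q[i][j] for j in range(3)] for i in range(3)])

# 1 is a root, as forced by the identity at z = 1 when m is odd
print(abs(p_Q(1.0)) < 1e-12)
# the identity itself, at a generic point z = 2
print(abs(p_Q(2.0) - (-2.0) ** 3 * p_Q(0.5)) < 1e-12)
```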

From a geometric point of view, the orthogonal matrices, of any size, are those that preserve the scalar product (a.k.a. the dot product). This means that orthogonal matrices preserve distances and angles.

(To see this, think of vectors u and v as n × 1 matrices and note that $$u\cdot v = u^{T}v$$. For a matrix M we have $$(Mu)\cdot(Mv) = (Mu)^{T}(Mv) = u^{T}(M^{T}M)v$$. And $$u^{T}v = u^{T}(M^{T}M)v$$ for all u and v if and only if $$M^{T}M$$ is the n × n identity matrix, i.e. if and only if M is an orthogonal matrix.)

The added condition that det(M) = 1 means that M preserves orientation. (Orthogonal matrices have determinant ±1: they either preserve or reverse orientation.) Finally we see that the orientation preserving orthogonal matrices are rotations about the origin. The question now becomes: "Can you show that a rotation about the origin in an odd dimensional space keeps at least one line passing through the origin pointwise fixed?" — Fly by Night  ( talk )  10:36, 31 July 2010 (UTC)
 * It's probably worth clarifying that: keeps at least one line passing through the origin pointwise fixed. That clarifies that we're looking for an eigenvector with eigenvalue 1, not any eigenvector. --Tango (talk) 14:41, 31 July 2010 (UTC)
 * Yeah, well said! I've just added that to my post to avoid confusion. — Fly by Night  ( talk )  15:01, 31 July 2010 (UTC)
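Fly by Night's geometric picture is easy to exhibit numerically in the 3-dimensional case. The sketch below (my own demo; the axis and angle are made up) builds a rotation about an arbitrary axis via the Rodrigues formula and checks that the axis is fixed, i.e. is an eigenvector with eigenvalue 1.

```python
# A rotation of R^3 about an arbitrary axis fixes that axis pointwise:
# the unit axis vector is an eigenvector of Q with eigenvalue 1.
import math

axis = (1.0, 2.0, 2.0)
n = math.sqrt(sum(a * a for a in axis))
ux, uy, uz = (a / n for a in axis)  # unit axis
t = 1.3  # an arbitrary rotation angle
c, s = math.cos(t), math.sin(t)

# Rodrigues rotation matrix for angle t about (ux, uy, uz)
Q = [[c + ux*ux*(1-c),    ux*uy*(1-c) - uz*s, ux*uz*(1-c) + uy*s],
     [uy*ux*(1-c) + uz*s, c + uy*uy*(1-c),    uy*uz*(1-c) - ux*s],
     [uz*ux*(1-c) - uy*s, uz*uy*(1-c) + ux*s, c + uz*uz*(1-c)]]

v = (ux, uy, uz)
Qv = [sum(Q[i][j] * v[j] for j in range(3)) for i in range(3)]
print(all(abs(Qv[i] - v[i]) < 1e-12 for i in range(3)))  # Qv = v
```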