Wikipedia:Reference desk/Archives/Mathematics/2011 May 11

= May 11 =

Using part of a character table to calculate the size of conjugacy classes
Hello all,

I have the following 5 rows of the character table of a 'mystery' group with 7 conjugacy classes:

1  1  1  1  1  1  1
1  1  1  1 -1 -1 -1
4  0  1 -1  2 -1  0
4  0  1 -1 -2  1  0
5  1 -1  0  1  1 -1

and I have been asked to calculate the size of each conjugacy class and the remaining 2 rows of the table. I am not allowed to identify the table with a specific group unless I can justify it.

Now, we can easily get another character as the product of the second and fifth rows of the table, leaving 1 final character. I do not know the order of the group. I am aware that the sum of the squared degrees equals the order of the group, that the columns are orthogonal in the usual sense, and that the rows are orthonormal with respect to the character inner product. I also know that the dot product of a column with itself gives the order of the centralizer of an element in that conjugacy class (i.e. |G| divided by the size of the class). But since we don't have the final row, I can't seem to use column orthogonality, and since we don't have any complete columns or class sizes, I can't use row orthonormality, unless I want to solve lots of linear equations in 7 variables. Can anyone suggest a better, not-too-complicated way to finish the table and find the sizes of the conjugacy classes? (Not necessarily in that order - it seems like whichever way I do it, one will follow easily from the other.)

Thanks! Otherlobby17 (talk) 01:38, 11 May 2011 (UTC)
 * Let G be the order of the group and G1, G2, ..., G7 the orders of the conjugacy classes. Taking the sum of squares of the characters in rows 1 and 5 gives G1 + G2 + G3 + G4 + G5 + G6 + G7 = G and 25G1 + G2 + G3 + G5 + G6 + G7 = G. Subtracting gives G4 = 24. For the last character, you get that the values in columns 3, 5, 6, 7 are 0 due to orthogonality of columns. From these you get G3 = G/6, G5 = G/12, G6 = G/6, G7 = G/4. Use this to write out the orthogonality of rows 1 and 3 and solve for G to get G = 120. Once you have this, the rest of the table should be straightforward. --RDBury (talk) 03:15, 11 May 2011 (UTC)
 * You meant to write G4 = 24G1. Bo Jacoby (talk) 06:15, 11 May 2011 (UTC).
 * Actually no, the conjugacy class for the first column is the identity element, so it has 1 element. I did mean to write that out though.--RDBury (talk) 10:29, 11 May 2011 (UTC)
 * So you say that G1 = 1? Bo Jacoby (talk) 12:55, 11 May 2011 (UTC).
 * I see, thank you! Yes, G1 is 1; that was the obvious fact I was overlooking, as it happens. Thanks :) Otherlobby17 (talk) 15:56, 11 May 2011 (UTC)
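As a sanity check on the recipe above, here is a short Python sketch (not part of the original thread): row 6 is the entrywise product of rows 2 and 5, row 7 is the row forced by column orthogonality against the degrees, and the class sizes fall out as |G| divided by the column dot products (the centralizer orders).

```python
from fractions import Fraction

# Five given rows plus the two completed ones:
# row 6 = (row 2) * (row 5) entrywise; row 7 forced by column orthogonality.
table = [
    [1,  1,  1,  1,  1,  1,  1],
    [1,  1,  1,  1, -1, -1, -1],
    [4,  0,  1, -1,  2, -1,  0],
    [4,  0,  1, -1, -2,  1,  0],
    [5,  1, -1,  0,  1,  1, -1],
    [5,  1, -1,  0, -1, -1,  1],
    [6, -2,  0,  1,  0,  0,  0],
]

G = sum(chi[0] ** 2 for chi in table)  # sum of squared degrees = |G|
cents = [sum(chi[j] ** 2 for chi in table) for j in range(7)]  # centralizer orders
sizes = [G // c for c in cents]        # class sizes = |G| / |centralizer|

print(G)       # 120
print(sizes)   # [1, 15, 20, 24, 10, 20, 30]

# Row orthonormality in the character inner product, as a final check:
for i, chi in enumerate(table):
    for k, psi in enumerate(table):
        ip = Fraction(sum(s * a * b for s, a, b in zip(sizes, chi, psi)), G)
        assert ip == (1 if i == k else 0)
```

With |G| = 120 and class sizes 1, 15, 20, 24, 10, 20, 30 the whole table is consistent; identifying it with a specific group of order 120 is left open, per the problem statement.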

should gay couples be given the same legal rights as heterosexual couples in adopting children?
should gay couples be given the same legal rights as heterosexual couples in adopting children? I am strongly against.. —Preceding unsigned comment added by Lizz4sunday (talk • contribs) 06:27, 11 May 2011 (UTC)


 * Why on Earth is this on the Math Desk? On the theory that 1 + 1 = 3? It's probably not appropriate anywhere on the Ref Desk, though, since it's a request for opinions rather than facts. StuRat (talk) 06:37, 11 May 2011 (UTC)


 * For what it's worth, Wikipedia has an article on LGBT adoption. If you want a political discussion, you should visit a forum. 130.88.134.222 (talk) 08:52, 11 May 2011 (UTC)


 * Not sure what some of those terms mean. Is couple used in the sense of a tuple with two members or as a system of forces producing a moment? What do the terms gay and heterosexual mean? The 'adopting children' sounds more like computing to me. :) Dmcq (talk) 09:54, 11 May 2011 (UTC)
 * By "couple" he means an electric dipole. "Gay" is a configuration dominated by gravitational forces, "heterosexual" one dominated by electrostatic forces. "Legal right" is referring to the right-hand law. This question should be in the science desk. -- Meni Rosenfeld (talk) 10:12, 11 May 2011 (UTC)

Markov chain mapping tools
Assume that I have a file that describes a Markov chain. I can format it in any way necessary. Are there any standard tools for mapping or graphing the Markov chain in an image file? I have done many small ones by hand, but I have some chains with about 5,000 states that I want to map just to demonstrate that they are mostly disjoint. -- k a i n a w ™ 12:29, 11 May 2011 (UTC)


 * Right now, I'm using graphviz. I have been waiting on one graph for about 4 hours and there is no sign it will be produced anytime soon. Perhaps I need to look into tricks for making graphviz run faster. -- k a i n a w ™ 15:24, 11 May 2011 (UTC)


 * It seems like you're looking for Markov clustering. Take a look at that link. It gives "a fast and scalable unsupervised cluster algorithm for networks (also known as graphs) based on simulation of (stochastic) flow in graphs." That algorithm will pick out the network clusters, i.e. sets of vertices with higher than average mutual edge connections. I hope that helps. — Fly by Night (talk) 23:34, 11 May 2011 (UTC)


 * Thanks. I killed the graphviz process after letting it run for 24 hours on a single chain. Hopefully this works better. -- k a i n a w ™ 12:08, 12 May 2011 (UTC)
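If the goal is only to demonstrate that the chains are mostly disjoint, that can be checked without any graph layout at all. A minimal sketch, assuming a hypothetical edge-list format (one "src dst" pair per transition), that counts weakly connected components with union-find:

```python
from collections import Counter

# Count the disjoint pieces of a chain without drawing anything.
# Edges are (src, dst) pairs; weakly connected components via union-find.

def component_sizes(edges):
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb  # merge the two components

    return Counter(find(x) for x in parent)

# Tiny example: two 2-state chains plus one absorbing state.
edges = [("s0", "s1"), ("s1", "s0"), ("s2", "s3"), ("s4", "s4")]
sizes = sorted(component_sizes(edges).values(), reverse=True)
print(sizes)  # [2, 2, 1] - three mutually disjoint pieces
```

This is near-linear in the number of transitions, so 5,000 states is instantaneous; graphviz's time goes into computing a layout, not into connectivity.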

Another representation theory question: misunderstanding of the way irreducible representations decompose
Hello everyone, I think I'm misunderstanding something in representation theory and I was hoping you could explain: I've scoured the internet but couldn't find anything relevant.

I've just proved that if a finite group G with centre Z has a faithful complex irreducible representation, then Z is cyclic. However, I'm now trying to show a partial converse: the rest of the problem I am stuck on is this:

"Now assume the order of G is a power of the prime p, and Z is cyclic. If $$\rho$$ is a faithful representation of G, show that some irreducible component of $$\rho$$ is faithful. You may use without proof the fact that, since G is a p-group, Z is non-trivial and any non-trivial normal subgroup of G intersects Z non-trivially."

Now, I believe that since we're working over $$\mathbb{C}$$ with finite groups and finite-dimensional representations, we can always decompose $$\rho$$ as a direct sum $$\oplus_i \rho_i$$ of irreducible representations, which combine to give a sort of 'block diagonal' action of $$\rho$$ on the various G-invariant subspaces of the vector space V on which our group acts - I think this follows from Maschke's theorem, though I may be misunderstanding the way a finite-dimensional representation of a finite group over a field of characteristic 0 is 'reducible'. However, this clearly cannot be the case: if the action were just 'block diagonal' with zeros in the off-diagonal blocks, then every representation $$\rho_i$$ would have a very much non-trivial kernel, so none of them would be faithful.

What am I misunderstanding? The question seems intended to lead me in a very specific direction, but I don't think I'll be able to appreciate it until I grasp what I am misunderstanding about the problem and what I should be thinking instead. Help welcomed and appreciated! Many thanks as always :) Otherlobby17 (talk) 17:41, 11 May 2011 (UTC)
 * Not sure where you're getting "every representation $$\rho_i$$ has a very much non-trivial kernel, so none of these are faithful." Each block is an irreducible representation which may or may not be faithful.--RDBury (talk) 20:23, 11 May 2011 (UTC)
 * "Any non-trivial normal subgroup intersects Z non-trivially" basically says that if a representation is faithful on Z then it is faithful on G (the kernel is a normal subgroup, so if it were non-trivial it would meet Z non-trivially). I don't have a solution to the stated problem off the top of my head, but I'm convinced that you should be looking at the eigenspace decomposition as well, particularly the eigenspace decomposition of a generator of Z.--RDBury (talk) 20:44, 11 May 2011 (UTC)
 * I see the bit that I missed now. Look at the eigenvalues of a generator of Z in ρ. Let Z have order p^n. If none of the eigenvalues has order p^n (in C), then they all have order dividing p^(n−1), and that means ρ isn't faithful on Z, much less on G. So one of the eigenvalues has order p^n. In the decomposition of ρ into the ρ_i the eigenvalues don't change, they just get rearranged into groups. So the eigenvalue we're interested in must be an eigenvalue of one of the ρ_i. This ρ_i is then faithful on Z and therefore faithful on G.--RDBury (talk) 21:24, 11 May 2011 (UTC)


 * That's brilliant: god only knows how long it would have taken me to figure that out myself! Looks like I need more practice. I think my mistake above was thinking that each $$\rho_i$$ gives a representation on the entire space V on which G acts. Viewing $$\rho(g)$$ as a block matrix, the rows corresponding to the $$\rho_i(g)$$ block are zero outside a small number of columns, so I imagined $$\rho_i$$ sending almost everything except those columns to zero. In fact, $$\rho_i(g)$$ only acts on a smaller-dimensional subspace in the first place, so those zero entries in the matrix of $$\rho(g)$$ aren't really part of the subrepresentation at all. I don't know if that made any sense, but it is clearer in my head, so thank you!

I greatly appreciate you helping me out, you've alleviated a lot of my confusion! Thanks :-) Otherlobby17 (talk) 12:07, 12 May 2011 (UTC)
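RDBury's eigenvalue argument can also be seen on a concrete toy case (this example is an illustration of mine, not from the thread): G = Q8, the quaternion group of order 8, is a 2-group whose centre Z = {1, −1} is cyclic of order 2, generated by z = −1, and the representation trivial ⊕ (2-dim irrep) is faithful. Since ρ(z) is diagonal here, the eigenvalues can be read off block by block:

```python
# Toy case of the eigenvalue argument: G = Q8, Z = {1, -1} of order 2,
# generator z = -1, and rho = trivial ⊕ (2-dim irrep), which is faithful.
# Each block lists the eigenvalues of that component evaluated at z.

blocks = {
    "trivial":     [1],        # rho_1(z) = (1): eigenvalue of order 1 only
    "2-dim irrep": [-1, -1],   # rho_2(z) = -I: eigenvalues of order 2 = |Z|
}

# A block is faithful on Z exactly when an eigenvalue of full order
# |Z| = 2 (i.e. -1) appears among its eigenvalues at z.
faithful_on_Z = {name: -1 in eigs for name, eigs in blocks.items()}
print(faithful_on_Z)  # {'trivial': False, '2-dim irrep': True}
```

The eigenvalue of full order has to land in some block of any faithful ρ; that block is then faithful on Z, hence on all of G, exactly as RDBury says.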

Probability problem.
Hi. I have a question about probability.

Imagine we toss a coin 100 times. The probability for each side is 1/2, so each side should come up 50 times. Now, if one side comes up 51 times and the other 49 times, which conclusion is correct:

1. The probability doesn't tell us exactly how many times each side comes up.

2. The conditions under which the experiment was done led to this result (for example, one side of the coin has a slightly larger area).

3. None of the above.

thanks. —Preceding unsigned comment added by Irrational number (talk • contribs) 18:56, 11 May 2011 (UTC)


 * The probability is 1/2 (or 50%) per toss. It is not "50% of the time you will get heads and 50% of the time you will get tails." -- k a i n a w ™ 19:00, 11 May 2011 (UTC)
 * If you are surprised by this result, then you should conclude that you do not yet understand probability, and need to study it more if you wish to gain a stronger understanding. --COVIZAPIBETEFOKY (talk) 19:13, 11 May 2011 (UTC)
 * Since they didn't say it, the answer is #1. The probability being 1/2 doesn't mean you'll get exactly 50 heads and 50 tails out of 100 throws. Staecker (talk) 22:17, 11 May 2011 (UTC)
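Answer #1 can be made quantitative with the binomial distribution: each of the 2^100 head/tail sequences is equally likely, so the chance of exactly k heads is C(100, k)/2^100. A quick sketch, using exact arithmetic via Python's math.comb:

```python
from math import comb

n = 100  # fair-coin tosses; each of the 2**n sequences is equally likely
p_50 = comb(n, 50) / 2**n   # probability of exactly 50 heads
p_51 = comb(n, 51) / 2**n   # probability of exactly 51 heads
print(round(p_50, 4))  # 0.0796 - even a perfect 50-50 split is unlikely
print(round(p_51, 4))  # 0.078  - and 51-49 is almost as likely as 50-50
# ...but the count does concentrate near 50:
p_near = sum(comb(n, k) for k in range(45, 56)) / 2**n
print(round(p_near, 3))  # 0.729 - probability of between 45 and 55 heads
```

So a 51-49 split is entirely ordinary; nothing about the coin needs explaining.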


 * 1. If you toss a coin once it doesn't land half on one side and half on the other; it either lands on one side or the other. Dmcq (talk) 23:36, 11 May 2011 (UTC)


 * The question reflects a lack of understanding of the law of large numbers, and perhaps the best way to gain an understanding would be to read that article (or at least the first part of it; the later parts are technical). Looie496 (talk) 02:02, 12 May 2011 (UTC)


 * Also, the probability is 50-50 for a fair coin, but most coins are NOT fair. "Since the images on the two sides of actual coins are made of raised metal, the toss is likely to slightly favor one face or the other if the coin is allowed to roll on one edge upon landing." (see coin toss). --Mgm|(talk) 08:38, 12 May 2011 (UTC)


 * There is a saying "Dice have no memory" (the article it links to may be too technical for you). If you threw the coin 99 times and got 49 heads and 50 tails, how can a dumb coin "know" what to get in the last throw? It can't. Regarding the exact numbers, statisticians can come up with confidence intervals; in this context it would be, for example, "if the result is 60 vs. 40 or more skewed, it is more than 95% likely that the probability is not 1/2" (I made up the numbers, but it is straightforward to calculate if you know the method). Jørgen (talk) 10:48, 12 May 2011 (UTC)
 * If p is the probability of heads, then the probability that p ≠ 1/2 is probably 1 whatever results are observed. 86.179.117.4 (talk) 11:29, 12 May 2011 (UTC)
 * You're right, of course. I should have said "probability is less than 1/2". Or maybe even that's wrong and I didn't think about it enough and should strike everything. Sorry. Jørgen (talk) 14:02, 12 May 2011 (UTC)
 * This is a classic confusion in statistics. The real question isn't whether or not the actual probability is exactly 1/2, but whether it's close enough to 1/2 for one's purposes. A test of the sort that you describe would conclude that assigning a probability of 1/2 to the coin toss results in a bad model of the coin (or that the result of the experiment was unusual). --COVIZAPIBETEFOKY (talk) 16:00, 12 May 2011 (UTC)
 * I find even the calculation of the probability that p is close to 1/2 (say 1/2 - t <= p <= 1/2 + t) a troublesome one. I don't see how to do it without assuming some prior distribution of p (prior to observing the results), which is kind of begging the question. Unless I am missing something...? 86.181.203.129 (talk) 17:30, 12 May 2011 (UTC)
 * Bayesians use a prior distribution on p, and update on the data to find a posterior. Frequentists say "if p was 1/2, the probability of finding a result as surprising or more would be x". -- Meni Rosenfeld (talk) 06:05, 13 May 2011 (UTC)
 * From probabilities follow probabilities, not certainties. Assuming that p is 1/2, and getting heads every time out of a hundred times, you can only say that the result is improbable, not impossible. Knowing a population you can say something about a sample. This is called deduction and this is what frequentists do. Knowing a sample you can say something about the population. This is called induction and this is what Bayesians do. Bo Jacoby (talk) 06:47, 13 May 2011 (UTC).
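Jørgen's made-up numbers can be replaced by real ones in the frequentist framing Meni describes: for a 60-40 result, ask how likely a split at least that skewed would be if p really were 1/2. A sketch of the two-sided binomial test:

```python
from math import comb

n = 100
# P(at least 60 heads) under p = 1/2, doubled for a two-sided test
# (a 40-60 split would be equally surprising in the other direction).
tail = sum(comb(n, k) for k in range(60, n + 1)) / 2**n
p_value = 2 * tail
print(round(p_value, 4))  # 0.0569 - surprising, but not quite past
                          # the conventional 5% threshold
```

A 51-49 split, by contrast, gives a p-value near 1: it is no evidence at all against p = 1/2.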