Wikipedia:Reference desk/Archives/Mathematics/2009 June 11

= June 11 =

Pooling related binomial statistics
So, the motivation is this: in an online game I play, when you attack an enemy you get a line-by-line summary of the battle ("You hit for X. Enemy hit for X. You missed." etc.). I know that each enemy has a particular miss chance, p, which is known only to the developers. However, various effects can adjust p by a known amount that differs in each battle, giving a modified miss chance p'. Given any single battle log, I can estimate p' (and by subtracting the modifier, p) and construct a confidence interval around it by counting the number of hits and misses, but generally these are too few to construct a particularly tight CI (usually hits + misses < 20).

If I have a large dataset of the number of enemy hits and misses, and the modifier for each battle, can I pool that data to get a good estimator for p? Confusing Manifestation (Say hi!) 01:05, 11 June 2009 (UTC)
 * You are saying there are enemies E1,E2,... shooting at you, and you get messages like "enemy hit" but it doesn't tell you which enemy hit you? If you have separate stats for each enemy then you should be able to add them up for all the battles. 207.241.239.70 (talk) 01:49, 11 June 2009 (UTC)
 * There's only one enemy at a time, and in battle you basically trade blows with them until one of you runs out of health. So for each battle, I have a log that's something like this:
 * Enemy miss chance + 20%
 * You did x damage to enemy.
 * Enemy missed.
 * You did x damage to enemy.
 * Enemy did x damage to you.
 * So if there weren't that change to the miss chance each time (I've seen it as low as 0 and as high as 30), I could just add everything up. If the battles went on a bit longer I'd probably be able to justify a normal approximation which would make everything additive, but instead I've basically got a bunch of $$B(n_i, p_i')$$ where the $$n_i$$, like I said, are typically less than 20. Confusing Manifestation (Say hi!) 05:48, 11 June 2009 (UTC)
 * Hi. Suppose you have a 20% increase in the miss chance, and the base probability of missing is p.  And you get H hits and M misses.  Then the likelihood function for p is $$(p+0.2)^M(0.8-p)^H$$, assuming the hits and misses are independent of one another (also assuming that I've interpreted "+20%" correctly).   Then, suppose that the 20% turns into 30% and you have H2 hits and M2 misses.  Then we would have $$(p+0.2)^M(0.8-p)^H\cdot (p+0.3)^{M2}(0.7-p)^{H2}$$.  Maximizing this gives the maximum likelihood estimate.  HTH, Robinh (talk) 07:07, 12 June 2009 (UTC)
 * Aha, maximum likelihood. Of course! Thanks for the help, I'll see if I can follow it all. Confusing Manifestation (Say hi!) 08:26, 14 June 2009 (UTC)
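Robinh's pooled likelihood generalizes to any number of battles: multiply the per-battle binomial likelihoods $$(p+m_i)^{M_i}(1-p-m_i)^{H_i}$$, where $$m_i$$ is the known modifier in battle i, and maximize over p. A minimal sketch of that in Python (the battle data, the grid-search approach, and every function name here are illustrative, not from the game):

```python
import math

# Hypothetical battle logs: (misses, hits, modifier) per battle, where
# "modifier" is the known adjustment to the base miss chance p.
# These numbers are made up for illustration.
battles = [
    (4, 10, 0.20),   # miss chance was p + 0.20 in this battle
    (6, 12, 0.30),
    (2, 15, 0.00),
]

def neg_log_likelihood(p, battles):
    """Negative log of the pooled likelihood prod_i q_i^M_i (1 - q_i)^H_i,
    where q_i = p + modifier_i is the modified miss chance in battle i."""
    total = 0.0
    for misses, hits, mod in battles:
        q = p + mod
        if not 0.0 < q < 1.0:
            return float("inf")   # this p puts some battle outside (0, 1)
        total -= misses * math.log(q) + hits * math.log(1.0 - q)
    return total

def mle(battles, grid=10000):
    """Maximum-likelihood estimate of p by a simple grid search on (0, 1)."""
    return min((i / grid for i in range(1, grid)),
               key=lambda p: neg_log_likelihood(p, battles))

p_hat = mle(battles)
print("estimated base miss chance:", round(p_hat, 4))
```

A grid search is crude but transparent; since the log-likelihood is concave in p, any one-dimensional optimizer would do the same job faster.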

Graphing complex functions online
Does anyone know of a good online resource for graphing complex valued functions defined on the complex plane? I'm looking for something that can make images like File:Complex zeta.jpg, but for simple functions like polynomials and Möbius transformations. Thanks in advance. -GTBacchus(talk) 13:52, 11 June 2009 (UTC)
 * Hi. The R programming language is free and includes methods for plotting complex numbers in the elliptic package, which deals with elliptic functions.  The examples in the online help files include  Möbius transforms (I think; I'll add some if not).  HTH, Robinh (talk) 06:57, 12 June 2009 (UTC)
 * Sadly, I'm looking for something I can use on a computer where I haven't got admin rights to install anything. At home, I've got a good graphing program, but I'm obliged to work in a lab at school where I'm stuck with no ability to install software. Thanks, though. -GTBacchus(talk) 15:11, 12 June 2009 (UTC)
 * If you have a hex editor and understand how to use it to edit a BMP file, then you have what you need to put together an image by specifying the colour of each pixel. I suggest trying this with a very small image that you can store as a .bmp file using the MS PAINT program found in every Windows OS. The PAINT program is convenient for drawing axes and text. Then you have a lot of pixels to paint... That will be hard work using only the hex editor, so good luck. BTW memory sticks are inconspicuous and can easily carry software, but you didn't hear that from us. Cuddlyable3 (talk) 17:19, 12 June 2009 (UTC)
 * I know how to write code, and I've done it, but if that's the hurdle, then this ain't gonna happen. At home, I just type in: $$f(z) = z^{1+i}$$, and boom: I get a graph where hue represents argument and saturation represents modulus. It would be cool if someone made an applet that did that, too. It only seems slightly more complicated than I'm motivated. -GTBacchus(talk) 20:07, 12 June 2009 (UTC)
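The hue-for-argument, saturation-for-modulus scheme GTBacchus describes is usually called domain coloring, and if a Python interpreter is available it takes only a few lines of standard library. A minimal sketch (the Möbius transformation, the image size, and the modulus-squashing formula are my own illustrative choices) that writes a plain-text PPM image, which many image viewers can open:

```python
import cmath
import colorsys

def f(z):
    # The function to plot; a Mobius transformation as an example.
    return (z - 1) / (z + 1)

def domain_color(width=200, height=200, span=2.0):
    """Render f over the square [-span, span]^2 as rows of RGB triples.
    Hue encodes the argument of f(z); saturation encodes the modulus,
    squashed into [0, 1) by |w| / (1 + |w|)."""
    rows = []
    for j in range(height):
        y = span - 2 * span * j / (height - 1)
        row = []
        for i in range(width):
            x = -span + 2 * span * i / (width - 1)
            try:
                w = f(complex(x, y))
                hue = (cmath.phase(w) / (2 * cmath.pi)) % 1.0
                sat = abs(w) / (1.0 + abs(w))
                r, g, b = colorsys.hsv_to_rgb(hue, sat, 1.0)
            except ZeroDivisionError:
                r = g = b = 1.0   # white at a pole of f
            row.append((int(255 * r), int(255 * g), int(255 * b)))
        rows.append(row)
    return rows

def write_ppm(rows, path="plot.ppm"):
    """Write the pixel rows as a plain-text (P3) PPM image."""
    with open(path, "w") as fh:
        fh.write("P3\n%d %d\n255\n" % (len(rows[0]), len(rows)))
        for row in rows:
            fh.write(" ".join("%d %d %d" % px for px in row) + "\n")

pixels = domain_color()
write_ppm(pixels)
```

With this encoding, zeros of f come out white (zero saturation) and the hue winds around them once per order of the zero, which is the same visual behaviour as in File:Complex zeta.jpg.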

Proof that the axiom of choice implies the well-ordering principle?
Proven math provides a proof of the fact in the title, but I can't for the life of me find a key to the notation used there. Can someone point me to such a key, or even save me the trouble of wading through all that heavy notation, and spell out the proof right here? Also, is there a reason that the proof cannot be found on wikipedia?

I appreciate it. --69.91.95.139 (talk) 20:43, 11 June 2009 (UTC)
 * Are you familiar with ordinal numbers? The simplest way to prove the well-ordering principle from the axiom of choice is via transfinite recursion over the ordinals. Algebraist 20:51, 11 June 2009 (UTC)
 * Yes, though I should probably hasten to add that I only know about them from what I've read here on wikipedia. Can you explain how one goes about proving that, for any set X, there exists a sufficiently large ordinal number α such that the members of X can be enumerated by all the ordinal numbers less than or equal to α? --69.91.95.139 (talk) 21:26, 11 June 2009 (UTC)
 * That is the well-ordering principle. You prove it by transfinite recursion: when you get to the ordinal β, choose an element x of X which you haven't already mapped anything less than β to and map β to x. Of course, you need the axiom of choice to make all those choices. The process must terminate at some ordinal α less than the Hartogs number of X. Algebraist 21:34, 11 June 2009 (UTC)
 * Thanks for the link. I myself was worried no such upper bound would exist. The proof in the Hartogs number article was very illuminating. JackSchmidt (talk) 23:30, 11 June 2009 (UTC)

Ok, can I try a different angle here? Because unless there's something I'm missing with this transfinite recursion, I'm still not sure I accept the existence of sufficiently large ordinal numbers (though I promise I will as soon as I accept the well-ordering principle).

Suppose we are setting out to develop a well-ordering ≤ on the set X, using the choice function f on the power set of X. Let Y be a variable which we will use to keep track of our progress. To begin with, we define Y = X. At each step, we take the element f(Y) of Y. We define x ≤ f(Y) for all x in X but not in Y, and similarly f(Y) ≤ x for all x in Y (and yes, I realize that just one of these definitions is sufficient; the other is redundant). Finally, define the new Y to be the old Y minus the chosen element f(old Y). Repeat the process. Now, in order to prove the well-ordering principle, we need only show that this process will somehow eventually end, with Y = ∅. What guarantee is there of that happening? --69.91.95.139 (talk) 22:08, 11 June 2009 (UTC)
 * The guarantee is the existence of Hartogs numbers. Algebraist 22:12, 11 June 2009 (UTC)
 * At a slightly more conceptual level, the way these things work is that if the process never terminated, it would mean you could get a (class) injection of the ordinals into a set. Then you pull it back and see that there has to be a set containing all the ordinals.  But that can't be, because of the Burali-Forti paradox. --Trovatore (talk) 22:18, 11 June 2009 (UTC)
 * Ah, I got it. I should have guessed there was a proof by contradiction just for that purpose.
 * Many thanks, --69.91.95.139 (talk) 23:27, 11 June 2009 (UTC)
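To put the recursion above in symbols (this is just a restatement of Algebraist's argument; the name $$a$$ for the enumeration is mine): define, by transfinite recursion, $$a_\beta = f\left(X \setminus \{a_\gamma : \gamma < \beta\}\right)$$ for as long as the set being fed to the choice function f is nonempty. The $$a_\beta$$ are pairwise distinct, so if the recursion never halted, $$\beta \mapsto a_\beta$$ would inject the class of all ordinals into the set X, which is impossible by the Burali-Forti argument (equivalently, the recursion must halt before the Hartogs number of X). If $$\alpha$$ is the first ordinal at which it halts, then $$\beta \mapsto a_\beta$$ is a bijection from $$\alpha$$ onto X, and declaring $$a_\gamma \le a_\beta$$ whenever $$\gamma \le \beta$$ gives the desired well-ordering of X.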

There's a proof in Erich Kamke's book on set theory, reprinted by Dover. Michael Hardy (talk) 21:14, 11 June 2009 (UTC)