Wikipedia:Reference desk/Archives/Mathematics/2011 August 26

= August 26 =

Limit to infinity
If $$\lim_{x \rightarrow \infty}f(x)$$ exists, is it always true that $$\lim_{x \rightarrow \infty} f(x) = \lim_{x \rightarrow \infty} f(x+1)$$? Intuitively I would think so, but I'm not sure how I would prove it. Widener (talk) 06:43, 26 August 2011 (UTC)
 * Yes. This is easy to show using the definition of limit. -- Meni Rosenfeld (talk) 07:16, 26 August 2011 (UTC)
 * Same N ? Widener (talk) 08:12, 26 August 2011 (UTC)
 * That depends on what exactly you mean by N (that is, which limit it applies to). You can certainly use N_2 = N_1 - 1, but x > N ⇒ x > N - 1. -- 110.49.248.138 (talk) 20:02, 26 August 2011 (UTC)
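Written out, the argument sketched in the replies above is one line (the same N works for both limits):

```latex
Let $L = \lim_{x \to \infty} f(x)$ and let $\varepsilon > 0$. Choose $N$ so that
$x > N \implies |f(x) - L| < \varepsilon$. For any $x > N$ we also have
$x + 1 > N$, hence $|f(x+1) - L| < \varepsilon$. Thus
$\lim_{x \to \infty} f(x+1) = L$.
```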

FLAW in Cauchy's proof for the inequality of arithmetic and geometric means?!
The article inequality of arithmetic and geometric means gives the following proof by Cauchy, for the subcase when n = 2.

$$ \begin{align} x_1 & \ne x_2 \\[3pt] x_1 - x_2 & \ne 0 \\[3pt] \left( x_1 - x_2 \right) ^2 & > 0 \\[3pt] x_1^2 - 2 x_1 x_2 + x_2^2 & > 0 \\[3pt] x_1^2 + 2 x_1 x_2 + x_2^2 & > 4 x_1 x_2 \\[3pt] \left( x_1 + x_2 \right) ^2& > 4 x_1 x_2 \\[3pt] \Bigl( \frac{x_1 + x_2}{2} \Bigr)^2 & > x_1 x_2 \\[3pt] \frac{x_1 + x_2}{2} & > \sqrt{x_1 x_2} \end{align} $$

How is the last step in this proof valid? Widener (talk) 08:18, 26 August 2011 (UTC)
 * By assumption, $$x_1$$ and $$x_2$$ are non-negative, so $$x_1x_2$$ and $$\frac{x_1+x_2}{2}$$ are non-negative. Square-root is an increasing function on the non-negative reals, so it preserves order.--121.74.96.240 (talk) 08:26, 26 August 2011 (UTC)
 * Okay, so generally, what is the proof that $$x^2 > y \Rightarrow x > \sqrt{y}$$ for $$x,y \ge 0$$ ? Widener (talk) 08:38, 26 August 2011 (UTC)
 * Instead of showing that, I'll show $$ x^2 > y^2 \Rightarrow x > y$$. Replace $$y$$ with $$\sqrt{y}$$ for what you want.
 * Now, suppose not. Then $$x \le y$$. So $$x^2 \le xy \le y^2$$.  Contradiction.--Antendren (talk) 08:55, 26 August 2011 (UTC)

Cumulative position probabilities as a result of tossing dice
Let's say you roll a die or dice and move a token forward by that number of positions. What's the probability of hitting each position, if multiple throws are allowed ? Let me give you an example with one die:

Chance of reaching position 1 = 1/6 (the chance of rolling a 1)

Chance of reaching position 2 = 1/6 (the chance of rolling a 2) + 1/36 (the chance of rolling a 1, twice in a row)

So, the chances of hitting each spot should go up with increasing distance from the starting spot, approaching a limit. Note that the chances, when added up, will be more than 100%, since the token will land on multiple spots. So, do we have a chart of these probabilities, for one die and for when rolling a pair of dice ? Also, what is the theoretical limit probability for the cases of one die or two dice ? StuRat (talk) 18:26, 26 August 2011 (UTC)


 * With one standard six-sided die, your token advances an average of 3.5 positions per roll, so the limit of its probability of landing on any one position far enough along is 2/7. -- 110.49.248.138 (talk) 20:20, 26 August 2011 (UTC)


 * Also note (again for one die) that the probability of the token landing on any one position is 1/6 times the sum of the probabilities of landing on the previous six positions (which, as you pointed out, will sum to greater than unity). This yields an easy-to-calculate algorithm.  For my HP-48G, I wrote the program << 6 DUPN + + + + + 6 / >>, seeded the stack with 0 0 0 0 0 1, and ran the program repeatedly, yielding:

1: 0.166667 2: 0.194444  3: 0.226852  4: 0.264660  5: 0.308771 HI  6: 0.360232 7: 0.253604 LO 8: 0.268094 9: 0.280369 10: 0.289288 11: 0.293393 HI 12: 0.290830 ... 38: 0.28571632 39: 0.28571480 40: 0.28571365 ...
 * This agrees with your probabilities for positions 1 and 2, and approaches the limit of 2/7 ≈ 0.28571429. Note that 6 is the most likely position to land on. For two dice, you will have to weight your average appropriately, but it is easy to see that the limit is 1/7. -- 110.49.227.131 (talk) 20:59, 26 August 2011 (UTC)
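The HP-48G recurrence above translates directly into a few lines of Python (a minimal sketch; the function name is mine): keep a running list of landing probabilities and average the last six.

```python
# One-die case: q[k] = (1/6) * (q[k-1] + ... + q[k-6]), seeded with q[0] = 1
# (the token certainly occupies the start square). Mirrors the HP-48G program.
def one_die_probs(n_positions):
    q = [1.0]
    for k in range(1, n_positions + 1):
        window = q[max(0, k - 6):k]   # last (up to) six landing probabilities
        q.append(sum(window) / 6.0)
    return q

q = one_die_probs(40)
print(q[1], q[2], q[40])   # ~0.166667, ~0.194444, ~0.285714 (approaching 2/7)
```

The printed values match positions 1, 2, and 40 in the table above.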


 * Interesting. I was wondering if it would behave like this, first rising above the limit, then dropping below, and continuing to bounce back and forth at ever reducing magnitudes as it approaches the limit.  If you'd be willing to do the same for 2 dice, I'd like to see those results, too. StuRat (talk) 21:30, 26 August 2011 (UTC)


 * Sure. For two dice the odds of throwing a 1, 2, ..., 12 are, in thirty-sixths, 0, 1, 2, 3, 4, 5, 6, 5, 4, 3, 2, 1.  So my quick and dirty HP-48G program becomes << 12 DUPN 0 * SWAP 1 * + SWAP 2 * + SWAP 3 * + SWAP 4 * + SWAP 5 * + SWAP 6 * + SWAP 5 * + SWAP 4 * + SWAP 3 * + SWAP 2 * + SWAP 1 * + 36 / >> (I suppose that I really should be using a loop here), and the stack is initialized to 0 0 0 0 0 0 0 0 0 0 0 1.

1: 0  2: 0.027778   3: 0.055556   4: 0.084105   5: 0.114198   6: 0.146626   7: 0.182227 HI   8: 0.166346 9: 0.155526 10: 0.147784  11: 0.141275  12: 0.134199  13: 0.124704 LO  14: 0.138567 15: 0.145771 16: 0.148355 HI  17: 0.147890 18: 0.145672 19: 0.142886  20: 0.140764  21: 0.140745 LO  22: 0.141558 ... 48: 0.142850  49: 0.142855  50: 0.142859  ...
 * peaking on Lucky Number 7 and with the limit of 1/7 = 0.142857... . I am curious about the period of the oscillations about the limit -- whether and how it changes as the values approach the limit.  My sense is that the period remains fairly steady at between 8 and 9. I'm sure that this type of hunting behavior is well studied in numerical analysis. -- 110.49.235.126 (talk)  —Preceding undated comment added 02:35, 27 August 2011 (UTC).
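The two-dice run, and the quasi-period question, can be checked the same way (again a sketch with names of my choosing; the period estimate simply measures the spacing of successive local maxima of q):

```python
# Two-dice case: q[k] = sum_i W[i-1] * q[k-i], q[0] = 1, where
# W[i-1] = P(advance exactly i positions) = (ways to roll i on 2d6) / 36.
W = [w / 36.0 for w in [0, 1, 2, 3, 4, 5, 6, 5, 4, 3, 2, 1]]

def two_dice_probs(n_positions):
    q = [1.0]  # the token certainly starts at position 0
    for k in range(1, n_positions + 1):
        q.append(sum(W[i - 1] * q[k - i] for i in range(1, 13) if k - i >= 0))
    return q

q = two_dice_probs(80)

# Estimate the quasi-period from the gaps between successive local maxima.
peaks = [k for k in range(2, 80) if q[k - 1] < q[k] > q[k + 1]]
gaps = [b - a for a, b in zip(peaks, peaks[1:])]
print(q[7], q[50])     # ~0.182227, ~0.142859 (limit 1/7 = 0.142857...)
print(peaks[:4], gaps[:6])
```

The first peak lands on position 7, and the peak-to-peak gaps hover around 8-9, consistent with the casual observation above.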


 * Thanks. I hope you don't mind, but I went ahead and labelled the HI and LO points above.  The distances between them seem rather chaotic.  I wonder if they smooth out more later on. StuRat (talk) 05:50, 27 August 2011 (UTC)


 * (Edit Conflict) I don't mind at all. I've added a couple of extra points to the lists so that they end one point after an extremum, moving both of your "questioned" labels, but they are still too short to gather more than an impression on initial behavior, and they are probably approaching the point of being too long to post on this ref desk. My statement about the 8-9 period was based on casual observation of positions out in the range of 40 - 60, but I really need to run this on a computer to get better analysis -- and to avoid the hassle of copying figures over from a calculator. I'll do a bit more work on this tonight, and contact you via your talk page. -- 110.49.225.10 (talk) 07:12, 27 August 2011 (UTC)


 * Generating functions can often be used to investigate such questions. In the first case, if the denominator is taken to be $$6^n$$, then a generating function for the numerator is given by $$\frac{1}{1 - x - 6x^2 - 6^2x^3 - 6^3x^4 - 6^4x^5 - 6^5x^6}$$. The denominator of this function has a factor of $$1-6x$$. The limiting probability can be found using partial fractions: the generating function can be written $$\frac{2/7}{1-6x} + (\text{another term})$$, and as long as the roots (real and complex) of the denominator of the other term have absolute value greater than 1/6, the $$1-6x$$ term dominates. I don't see an obvious way of proving this, though it's apparently the case given the table above. Note that the anon's proof above that the limit is 2/7 has a subtle flaw in that it assumes the limit exists in the first place. The two-dice case can be analyzed similarly but is more complex.--RDBury (talk) 07:01, 27 August 2011 (UTC)
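That numerator generating function is easy to sanity-check numerically (a sketch of mine; it compares the series coefficients of 1/(1 - x - 6x^2 - ... - 6^5 x^6), generated by the corresponding linear recurrence, against 6^k times the landing probabilities):

```python
from fractions import Fraction

# Series coefficients a_k of 1/(1 - x - 6x^2 - 6^2 x^3 - ... - 6^5 x^6):
# a_k = a_{k-1} + 6 a_{k-2} + ... + 6^5 a_{k-6}, with a_0 = 1.
coeffs = [6 ** (i - 1) for i in range(1, 7)]   # 1, 6, 36, 216, 1296, 7776
a = [1]
for k in range(1, 30):
    a.append(sum(c * a[k - i] for i, c in enumerate(coeffs, start=1) if k - i >= 0))

# Landing probabilities from the one-die recurrence, as exact fractions.
q = [Fraction(1)]
for k in range(1, 30):
    q.append(sum(q[max(0, k - 6):k], Fraction(0)) / 6)

# The claim: with denominator 6^k, the numerator of q_k is a_k.
print(all(a[k] == 6 ** k * q[k] for k in range(30)))  # True
```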
 * Wouldn't another approach be to consider the numbers in batches of 6 (or 12 for the two dice case), and construct a matrix A that calculates the probabilities of the next 6 given the probabilities of the current 6 (this matrix would be fairly complex, since you'd have to consider up to 6 tosses of the die)? Then hope it's diagonalizable, and if so, compute the limit.--Antendren (talk) 08:32, 27 August 2011 (UTC)
 * That's a great idea! We can even simplify the matrix by instead calculating the 6 position probabilities counting back from the very next position, given the 6 counting back from the current position. Thus, your A is my M6. For the one die case:

    [ 1/6 1/6 1/6 1/6 1/6 1/6 ]      [ 1 ]
    [  1   0   0   0   0   0  ]      [ 0 ]
    [  0   1   0   0   0   0  ]      [ 0 ]
M = [  0   0   1   0   0   0  ], S = [ 0 ].
    [  0   0   0   1   0   0  ]      [ 0 ]
    [  0   0   0   0   1   0  ]      [ 0 ]
 * And for the two dice case:

    [  0   1/36 2/36 3/36 4/36 5/36 6/36 5/36 4/36 3/36 2/36 1/36 ]      [ 1 ]
    [  1    0    0    0    0    0    0    0    0    0    0    0   ]      [ 0 ]
    [  0    1    0    0    0    0    0    0    0    0    0    0   ]      [ 0 ]
    [  0    0    1    0    0    0    0    0    0    0    0    0   ]      [ 0 ]
    [  0    0    0    1    0    0    0    0    0    0    0    0   ]      [ 0 ]
    [  0    0    0    0    1    0    0    0    0    0    0    0   ]      [ 0 ]
M = [  0    0    0    0    0    1    0    0    0    0    0    0   ], S = [ 0 ].
    [  0    0    0    0    0    0    1    0    0    0    0    0   ]      [ 0 ]
    [  0    0    0    0    0    0    0    1    0    0    0    0   ]      [ 0 ]
    [  0    0    0    0    0    0    0    0    1    0    0    0   ]      [ 0 ]
    [  0    0    0    0    0    0    0    0    0    1    0    0   ]      [ 0 ]
    [  0    0    0    0    0    0    0    0    0    0    1    0   ]      [ 0 ]
 * Then the probability of landing on position n is the top value of Mn S, which, given S, is the top left value of Mn. Note that the column vectors:

[ 2/7; 2/7;  2/7;  2/7;  2/7;  2/7]
 * and

[ 1/7; 1/7;  1/7;  1/7;  1/7;  1/7; 1/7;  1/7;  1/7;  1/7;  1/7;  1/7]
 * are fixed points under their respective M -- that is, they are eigenvectors with eigenvalue 1. Does that help? (That would seem to be a necessary condition for them to be limits, but not necessarily sufficient.) -- 110.49.242.138 (talk) 12:17, 27 August 2011 (UTC)
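The fixed-point claim can be verified in a few lines of pure Python (a sketch; the helper names are mine):

```python
def step(M, v):
    # One matrix-vector product, in pure Python (no numpy needed).
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def dice_matrix(weights):
    """Top row = move-length probabilities p_1..p_n; below, a shifted identity."""
    n = len(weights)
    M = [list(weights)]
    for r in range(n - 1):
        M.append([1.0 if c == r else 0.0 for c in range(n)])
    return M

# Constant vectors are fixed under M (eigenvectors with eigenvalue 1).
v6 = step(dice_matrix([1 / 6] * 6), [2 / 7] * 6)
w12 = [0, 1, 2, 3, 4, 5, 6, 5, 4, 3, 2, 1]
v12 = step(dice_matrix([w / 36 for w in w12]), [1 / 7] * 12)
print(v6)   # every entry ~2/7
print(v12)  # every entry ~1/7
```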


 * The eigenvectors above are overspecific, given that any scalar multiple of an eigenvector is also an eigenvector. So what I should have pointed out is that the "one-vector" (sometimes called the "summing vector")

[1; 1; ...1]
 * is an eigenvector with eigenvalue 1 for any such matrix with a finitely supported probability distribution, given in block form:

    [ p_1  ...  p_{n-1} | p_n ]
M = [ ------------------+---- ]
    [         I         |  0  ]


 * where sum(p_i) = 1. So the point should have been that fixed points exist, not that the specific limit alone is a fixed point. -- 110.49.241.90 (talk) 23:44, 28 August 2011 (UTC)

(outdent) In response to a user talk page question, I'll explain the above matrix here, as it is just a formalization of the method of calculating the landing probability on the next square given the probabilities for the previous squares. Using RDBury's notation from the follow-up question, let {p_i} with i > 0 be a probability distribution where p_i is the likelihood of advancing i positions on any one move. If p has finite support, then there is an n > 0 such that p_n > 0, p_i = 0 whenever i > n, and ∑_{i=1}^{n} p_i = 1. For the one-die case, p_i = 1/6 for 1 ≤ i ≤ 6 and 0 otherwise, with n = 6. Let q_i be the probability of landing on position i. q_0 = 1 indicates that there is a single starting position. If q_i are known for all i < k, then q_k may be computed by summing the probabilities of moving directly to that position from all previous positions. The probability of moving directly to position k from position k-i (i positions back) is q_{k-i} ⋅ p_i, the product of the probability of landing on position k-i in the first place and the probability of moving i positions in one turn. Since the largest possible move is n, q_k = ∑_{i=1}^{n} q_{k-i} ⋅ p_i. As Antendren pointed out, knowledge of the landing probabilities of only the previous n locations is required for the computation, so if these are represented in a column vector [q_{k-1}; q_{k-2}; ...; q_{k-n}], we can multiply (on the left) by the matrix M, given in block form above, to yield [q_k; q_{k-1}; ...; q_{k-n+1}]. Note that multiplying against the probabilities in the top row of the matrix yields the above formula for calculating q_k, and the offset identity block shifts the previous landing probabilities down one position in the resulting vector, ready for the next calculation. Whether or not mathematical analysis of the matrix is useful, it does describe what is done numerically in calculating the landing probabilities, even if a program takes advantage of the structure of the matrix and doesn't do the full-blown computation. 
-- 110.49.241.63 (talk) 02:00, 31 August 2011 (UTC)
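That matrix action can be sketched in code (a minimal version of mine, exploiting the block structure rather than doing a full multiply):

```python
def advance(weights, n):
    """Apply the block matrix M to the state vector n times, by hand:
    the new top entry is sum_i p_i * q_{k-i}, and the offset identity
    block shifts the previous probabilities down one slot."""
    v = [0.0] * len(weights)
    v[0] = 1.0               # S: the token surely occupies the start position
    for _ in range(n):
        top = sum(p * x for p, x in zip(weights, v))
        v = [top] + v[:-1]
    return v[0]              # q_n, the top value of M^n S

print(advance([1 / 6] * 6, 40))   # ~0.285714, approaching 2/7
```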


 * That the roots of the denominator of the second term have absolute value at least 1/6 can be proved with winding numbers. If z traces the circle |z| = 1/6, given by z = (1/6)(cos t + i sin t), then the curve traced by the denominator is given by w = 1 + (5/6)cos t + (4/6)cos 2t + (3/6)cos 3t + (2/6)cos 4t + (1/6)cos 5t + i((5/6)sin t + (4/6)sin 2t + (3/6)sin 3t + (2/6)sin 4t + (1/6)sin 5t). I claim that the real part of w is always at least 1/2; in fact 1/2 + (5/6)cos t + (4/6)cos 2t + (3/6)cos 3t + (2/6)cos 4t + (1/6)cos 5t = (1/3)(cos(5t/2) + cos(3t/2) + cos(t/2))², as can be shown using various trigonometric identities. Then the curve traced by w stays to the right of the imaginary axis and therefore its winding number around 0 is 0, implying that there are no roots of the polynomial inside the original circle.--RDBury (talk) 08:25, 27 August 2011 (UTC)
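Both parts of that claim are easy to spot-check numerically (a sketch of mine; note that when checked this way, the constant in front of the squared-cosine expression works out to 1/3):

```python
import math

def re_w(t):
    # Real part of w, the image of z = (1/6)(cos t + i sin t) under the
    # cofactor of (1 - 6x) in the denominator 1 - x - 6x^2 - ... - 6^5 x^6.
    return 1 + sum((6 - k) / 6 * math.cos(k * t) for k in range(1, 6))

def square_form(t):
    # 1/2 + (1/3)(cos(5t/2) + cos(3t/2) + cos(t/2))^2
    c = math.cos(2.5 * t) + math.cos(1.5 * t) + math.cos(0.5 * t)
    return 0.5 + c * c / 3

ts = [2 * math.pi * i / 1000 for i in range(1000)]
print(min(re_w(t) for t in ts))                        # stays >= 1/2
print(max(abs(re_w(t) - square_form(t)) for t in ts))  # ~0: identity holds
```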

OK, thanks for the feedback, everyone. 110.49.225.10 placed 100 lines of output for the 1 die case (and 150 lines of output for the 2 dice case) on my talk page. StuRat (talk) 07:21, 28 August 2011 (UTC)
 * For what it's worth, you can get an asymptotic solution using generating functions. Let
 * $$\phi(x) = \tfrac{1}{6} (x + x^2 + x^3 + x^4 + x^5 + x^6)$$
 * Then, for the 1-die case, the generating function is:
 * $$\frac{1}{1-\phi(x)}$$
 * For the 2-die case, the generating function is:
 * $$\frac{1}{1-\phi(x)^2}$$
 * In any case, if F(x) is a generating function of a probability distribution with positive integer values, then the generating function in the general case is
 * $$\frac{1}{1-F(x)}$$
 * If F(r) = 1 has no solutions with |r| = 1 other than r = 1 (i.e., if gcd({n | p_n > 0}) = 1), then the leading term in partial fractions is:
 * $$\frac {1}{F'(1)} \frac {1}{1-x}$$,
 * which means the limit is:
 * $$\frac{1}{F'(1)} = \frac{1}{E(X)}$$
 * Clearly, if F is the generating function, then 1 - F has no roots with absolute value less than 1. Consider the absolute values of the roots of 1 - F.  If there is a unique root with the next smallest absolute value and it's negative (-r), then the next term in the asymptotic expansion of the hit probabilities q_n is:
 * $$\frac{1}{F'(-r)} {(-r)}^{-n}$$
 * and the series eventually alternates with values above and below.
 * If the next root is a complex pair $$r e^{\pm i \theta}$$, then a similar calculation (which I will not go into in detail) shows the next term in the expansion of q_n would be
 * $$C r^{-n} \cos(\alpha + n \theta)$$
 * with a quasi-period likely to be
 * $$\frac{2 \pi}{\theta}.$$
 * I don't have access to my C library at this time, nor to a full algebraic calculation system, so the application of this to the specific 1- and 2-die systems is left as an exercise to the reader. — Arthur Rubin  (talk) 00:37, 29 August 2011 (UTC)
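The 1/E(X) limit in the bullets above is easy to spot-check for an arbitrary step distribution (a sketch; the distribution below is made up purely for illustration):

```python
# Hypothetical step distribution: advance 1, 2 or 3 positions
# with probabilities 0.5, 0.3, 0.2.
p = [0.5, 0.3, 0.2]
mean = sum(i * pi for i, pi in enumerate(p, start=1))   # E(X) = 1.7

q = [1.0]  # q[0] = 1: the start square
for k in range(1, 200):
    q.append(sum(pi * q[k - i] for i, pi in enumerate(p, start=1) if k - i >= 0))

print(mean, q[199], 1 / mean)   # q_n -> 1/E(X) = 0.588235...
```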

See also the followup problem. -- 110.49.234.163 (talk) 00:01, 2 September 2011 (UTC)

I have a question about fractals.
My question is: does anyone know whether fractals have found any applications in psychology? I ask because I took a random system of thought (Theology, actually), and it seems to be based on a fractal. Please let me know. Thanks!  Lighthead  þ 22:49, 26 August 2011 (UTC)

By the way I realize that Theology is not exactly quantifiable. Roughly speaking.  Lighthead  þ 22:51, 26 August 2011 (UTC)

Let me explain so that I don't look like a complete idiot. Let's take Judeo-Christian Philosophy. God (1). Angels (.5) Humans (.25) Animals, perhaps (.125) And so on.  Lighthead  þ 23:00, 26 August 2011 (UTC)
 * Keep explaining- I still don't see anything like fractals in what you're describing. (And I don't know if fractals have applications in psychology- I doubt it.) Staecker (talk) 23:21, 26 August 2011 (UTC)


 * Well maybe I don't understand fractals then. I'm not mathematical. And thanks for doubting; I'm sure that if people had doubted in the first place, the idea of fractals never would've gotten off the ground. Mandelbrot was very popular. Especially in his views.  Lighthead  þ 23:52, 26 August 2011 (UTC)


 * The concept of a fractal is that a part of the system has the same structure as the whole system, but at a smaller scale. For your example to represent a fractal, humans would have to be parts of God and also have the same structure as God -- in Christian theology humans may be considered to have the same form as God, but they are not considered parts of God.  So that isn't really a fractal in any sense that I understand.  I'm not aware of any applications of fractals to psychology.  (There are claims to have found fractal structure in brain electrical activity, if that can be counted as psychology.) Looie496 (talk) 00:16, 27 August 2011 (UTC)


 * Well, as for religions, perhaps the "repeating the same pattern at different scales" part may apply to some time scales in the Hindu and the native Central and South American religions, in particular. I believe they have the "cycles within cycles" concept as a central part of their religions. StuRat (talk) 00:39, 27 August 2011 (UTC)


 * As to the people that thought that I was saying nonsense; I meant Theology from the socio-psychological standpoint. That's actually why I gave the disclaimer that I realize that God is not quantifiable. I don't believe that God is a construct, but I was attacking it from the standpoint that those from the scientific standpoint at least objectively see it as a construct. I think that StuRat came close to understanding what I'm talking about. And by the way it's not too far fetched to see psychological processes as mathematical. How do you explain game theory, and how they've found the actions of terrorist cells and dictatorial governments to follow mathematical principles? People like Staecker should avoid the knee-jerk reaction of treating everything that people say as nonsense. I might not be very mathematical, but I do have a capacity to think theoretically.  Lighthead  þ 03:00, 27 August 2011 (UTC)


 * I actually don't want to get too far into Theology because that was just an example; but even under the Judeo-Christian perspective, God is said to be a part of everything and therefore a part of us (from what Looie496 was saying). And also to what Looie496 was saying, yeah, electrical brain activity would count from what I'm saying. Electrical brain activity has been researched as perhaps an indicator of what people are thinking. That seems to answer most if not all of what I'm saying.  Lighthead  þ 04:12, 27 August 2011 (UTC)