Wikipedia:Reference desk/Archives/Mathematics/2011 April 2

= April 2 =

Lagrangian problem
Dear Wikipedians:

I have been working on an interesting problem using Lagrangians:

Does the function f(x,y,z) = x² + y² + z² + $$\frac{1}{x^2 + y^2 + z^2}$$ have any extrema? If so, what are they and where are they located?

In order to use Lagrangians, I have used the following constraint:

x² + y² + z² = c

where c > 0 is a real constant. So basically the constraint allows a spherical surface to sweep over the entire solid Euclidean space.

I then set up the following Lagrangian:

$$F(x,y,z,\lambda) = c + \frac{1}{c} - \lambda(x^2 + y^2 + z^2 - c)$$

Setting $$F_x = F_y = F_z = F_\lambda = 0$$, we have:

$$\begin{cases} -2\lambda x = 0 \\ -2\lambda y = 0 \\ -2\lambda z = 0 \\ x^2 + y^2 + z^2 - c = 0 \end{cases} $$

which yields the solution x = y = z = 0

but f(0, 0, 0) is undefined.

So I conclude that there is no extremum of any kind.

Would my solution above look reasonable?

Thanks,

70.31.159.218 (talk) 16:12, 2 April 2011 (UTC)


 * This would look like homework (an "interesting problem" -- interesting how?) were it not that there seems to be no reason at all to use Lagrange multipliers for the original problem. It is immediately clear from the definition that f depends only on the distance from the origin, assumes a minimum value of 2 when the radius is 1, and can have any larger value. Since f is constant on any origin-centered sphere, when you constrain the problem to such a sphere (with which justification and for which purpose?), of course it collapses into nothingness.
 * Why do you want to use Lagrange here? Even if we ignore the pulling-a-constraint-out-of-a-hat aspect, all you achieve is to replace a critical-point search in 3 variables with one in 4 variables. That's not progress! –Henning Makholm (talk) 17:42, 2 April 2011 (UTC)
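For reference, the direct 3-variable search mentioned above is short. Writing $$s = x^2 + y^2 + z^2$$, so that $$f = s + 1/s$$, the partial derivatives are

$$\frac{\partial f}{\partial x} = 2x\left(1 - \frac{1}{s^2}\right), \qquad \frac{\partial f}{\partial y} = 2y\left(1 - \frac{1}{s^2}\right), \qquad \frac{\partial f}{\partial z} = 2z\left(1 - \frac{1}{s^2}\right).$$

All three vanish only at x = y = z = 0 (where f is undefined) or where s = 1, i.e. on the unit sphere, where f = 1 + 1 = 2, matching the minimum found by inspection.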


 * Thanks Henning for all your help. So by inspection we see the minimum. But what I am wondering is if there is a systematic way of arriving at the same conclusion, through the use of perhaps Hessians? Somehow I feel "by inspection" is not a very convincing argument to use in mathematics. L33th4x0r (talk) 22:17, 3 April 2011 (UTC)
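One way to make the "by inspection" step systematic: since f depends only on the radius r, restrict it to one variable, g(r) = r² + 1/r², and apply the ordinary second-derivative test. A minimal stdlib-only numerical sketch (the helper names `d1` and `d2` are made up for illustration):

```python
def g(r):
    # f restricted to the sphere of radius r: g(r) = r^2 + 1/r^2
    return r * r + 1.0 / (r * r)

def d1(fn, x, h=1e-5):
    # central-difference approximation to the first derivative
    return (fn(x + h) - fn(x - h)) / (2 * h)

def d2(fn, x, h=1e-4):
    # central-difference approximation to the second derivative
    return (fn(x + h) - 2 * fn(x) + fn(x - h)) / (h * h)

# at r = 1: g'(1) = 0 and g''(1) = 8 > 0, so g(1) = 2 is a minimum
```

The exact values fall out of g'(r) = 2r − 2/r³ = 0 at r = 1 and g''(r) = 2 + 6/r⁴ = 8 > 0; the numerics just confirm them.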


 * Inspection is a perfectly cromulent argument to use in mathematics. If the thing you claim is true, it is true independently of whether you found it by divine revelation, inspired guesswork, or a systematic procedure. It's only in particular educational situations you may be asked to use a specific procedure, and then always because getting experience with that procedure was the real goal of the exercise and finding the answer is just a macguffin. Out in the real world where answering the question is the goal, anything goes so long as you can convince yourself and others somehow that it is correct.
 * I'm not sure what would count as a fully systematic procedure. When you apply your Lagrange multiplier, you end up having to solve 4 equations in 4 variables. How did you do that? In your postings so far, you've just quoted a result. Why not use that same method to solve the 3 equations in 3 variables to find critical points of the original function directly? –Henning Makholm (talk) 22:39, 3 April 2011 (UTC)
 * Thanks Henning for all your help. I have learned much from you. L33th4x0r (talk) 22:33, 4 April 2011 (UTC)


 * What exactly do you mean by "Lagrangians"? The mathematician Joseph Louis Lagrange has lent his name to many areas of mathematics. One of the most powerful and fashionable areas is symplectic geometry and symplectic topology. Lagrangian multipliers are just a small contribution of a great mathematician. Fly by Night on Tour (talk) 02:09, 3 April 2011 (UTC)

Isomorphic graphs
My question is whether the two graphs are isomorphic. In general, how do we find the isomorphism if it exists? I tried writing the adjacency matrix and then permuting the rows but it is too big to keep track of. Thanks- Shahab (talk) 16:16, 2 April 2011 (UTC)


 * Graph isomorphism seems to be a hard problem—nobody knows an efficient algorithm that works for all graphs. There are some algorithms that work pretty well in practice, though. One software package that implements such an algorithm is called nauty (No AUTomorphisms Yet?), though nauty is a little cryptic to figure out. —Bkell (talk) 16:57, 2 April 2011 (UTC)


 * "seems to be" a hard problem? So no one has proved that it's NP-hard or NP-complete or something like that? Michael Hardy (talk) 03:01, 3 April 2011 (UTC)


 * Yes. Our article on the Graph isomorphism problem says a bit about this. In particular, it is not known to be solvable in polynomial time, but it is also not known to be NP-hard. Invrnc (talk) 03:27, 3 April 2011 (UTC)


 * There's a lot of edges there. It may be easier to get a handle on the problem if you consider the complement graphs instead. –Henning Makholm (talk) 17:00, 2 April 2011 (UTC)


 * Here's a hint for your particular example, Shahab: Consider 5-cycles in those graphs. —Bkell (talk) 17:09, 2 April 2011 (UTC)
 * … Or use Henning's suggestion—that's actually a lot easier. —Bkell (talk) 17:11, 2 April 2011 (UTC)


 * I don't know if this qualifies under easier or whatnot, but for this case, the cleanest proof (imo) is just to note the automorphism groups of the two graphs are not isomorphic (consider cyclic permutations of the vertices around the outer rings). Invrnc (talk) 03:31, 3 April 2011 (UTC)
 * I lied, didn't actually look at the graphs for very long. Silly me. Invrnc (talk) 03:33, 3 April 2011 (UTC)


 * Thanks. I see that the first graph's complement is two 4-cycles and the second graph's complement is an 8-cycle. -Shahab (talk) 04:45, 3 April 2011 (UTC)
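Two graphs are isomorphic exactly when their complements are, so the check reduces to comparing two disjoint 4-cycles with a single 8-cycle. At 8 vertices a brute-force search over all 8! relabelings is instant; a stdlib-only sketch, with the complements hard-coded from the description above:

```python
from itertools import permutations

def cycle_edges(vertices):
    # edge set of the cycle visiting `vertices` in order
    k = len(vertices)
    return {frozenset((vertices[i], vertices[(i + 1) % k])) for i in range(k)}

# complement of the first graph: two disjoint 4-cycles on vertices 0..7
two_c4 = cycle_edges([0, 1, 2, 3]) | cycle_edges([4, 5, 6, 7])
# complement of the second graph: a single 8-cycle
one_c8 = cycle_edges([0, 1, 2, 3, 4, 5, 6, 7])

def isomorphic(e1, e2, n=8):
    # brute force: try every relabeling p of the n vertices
    if len(e1) != len(e2):
        return False
    return any(
        {frozenset((p[a], p[b])) for a, b in map(tuple, e1)} == e2
        for p in permutations(range(n))
    )
```

In this particular case no search is needed: both complements are 2-regular, but one is connected and the other is not, which already rules out an isomorphism.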

Rotation of a plane
How do I rotate a general plane ax+by+cz=d? --70.244.234.128 (talk) 19:24, 2 April 2011 (UTC)


 * Switch to a coordinate system where the plane has a simpler equation, such as px=q, rotate around the x-axis there, and switch back to your original coordinates. You can find a suitable basis by starting with the normal vector (a,b,c) and extending it to an orthonormal basis by the Gram-Schmidt process (or more elementarily, just normalize (a,b,c), take the cross product with a random non-parallel vector to get something at right angles to that, then normalize the result and let the third basis vector be the cross product of the first two). Afterwards, you can multiply the two basis change matrices together with the rotation matrix to get a single matrix that will do the rotation for you in one go. –Henning Makholm (talk) 05:39, 3 April 2011 (UTC)
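The cross-product recipe above can be sketched in code (stdlib only; the choice of helper vector is one arbitrary option for the "random non-parallel vector"):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(p, q):
    return (p[1] * q[2] - p[2] * q[1],
            p[2] * q[0] - p[0] * q[2],
            p[0] * q[1] - p[1] * q[0])

def basis_from_normal(a, b, c):
    # first basis vector: the unit normal of the plane
    u = normalize((a, b, c))
    # any vector not parallel to u works as the helper
    helper = (1.0, 0.0, 0.0) if abs(u[0]) < 0.9 else (0.0, 1.0, 0.0)
    # second vector: perpendicular to u via the cross product
    v = normalize(cross(u, helper))
    # third vector: cross product of the first two (already unit length,
    # since u and v are orthogonal unit vectors)
    w = cross(u, v)
    return u, v, w
```

The columns u, v, w then form the basis-change matrix referred to above.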

First focus on the case when the plane passes through the origin (so $$d=0$$). Let $$\mathbf u=(a,b,c)/\sqrt{a^2+b^2+c^2}$$ be the unit normal to your plane. Let $$R(\theta)$$ be the operator that rotates a vector in the plane through an angle $$\theta$$ in radians (when viewed from the tip of u), and let x be a vector on the plane. As $$\theta$$ varies, the curve $$R(\theta)\mathbf x$$ sweeps out the circle through x centered at the origin, with unit angular speed. Hence the tangent vector to $$R(\theta)\mathbf x$$ has the same magnitude as x, and is in a direction perpendicular to $$R(\theta)\mathbf x$$. Thus $$R(\theta)$$ satisfies the differential equation $$R'(\theta)\mathbf x = \mathbf u\times R(\theta)\mathbf x$$, the choice of sign on the right side being justified by the right-hand rule. We can solve this by a series (and verify by inspection that the solution is correct), giving
 * $$R(\theta)\mathbf x = (\cos\theta)\mathbf x + (\sin\theta)\mathbf u\times \mathbf x.$$

For the general case, when $$d\not=0$$, let x be on the plane and $$R(\theta)$$ as above. Note that the position vector $$\frac{d}{\sqrt{a^2+b^2+c^2}}\mathbf u$$ lies on the plane and is parallel to u. Thus the required rotation is
 * $$R(\theta)\left(\mathbf x - \frac{d}{\sqrt{a^2+b^2+c^2}}\mathbf u\right) + \frac{d}{\sqrt{a^2+b^2+c^2}}\mathbf u.$$

-- Sławomir Biały (talk) 11:55, 3 April 2011 (UTC)
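The formulas above translate directly to code; a stdlib-only sketch, taking the plane as ax+by+cz=d with (a,b,c) nonzero and the point x assumed to lie on the plane:

```python
import math

def cross(p, q):
    return (p[1] * q[2] - p[2] * q[1],
            p[2] * q[0] - p[0] * q[2],
            p[0] * q[1] - p[1] * q[0])

def rotate_in_plane(x, a, b, c, d, theta):
    """Rotate the point x (assumed to lie on the plane ax+by+cz=d)
    by theta radians about the normal axis through d*u/|(a,b,c)|."""
    n = math.sqrt(a * a + b * b + c * c)
    u = (a / n, b / n, c / n)                 # unit normal
    p0 = tuple(d / n * ui for ui in u)        # point of the plane nearest the origin
    y = tuple(xi - pi for xi, pi in zip(x, p0))  # shift so the plane passes through 0
    uxy = cross(u, y)
    ct, st = math.cos(theta), math.sin(theta)
    r = tuple(ct * yi + st * ci for yi, ci in zip(y, uxy))  # R(theta) applied to y
    return tuple(ri + pi for ri, pi in zip(r, p0))          # shift back
```

For example, on the plane x + 2y + 2z = 3 the point (3, 0, 0) stays on the plane under any such rotation, and keeps its distance from the center of rotation p0 = (1/3, 2/3, 2/3).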