Talk:Intermediate treatment of tensors

Comments following initial split-out as a separate article
Sorry to be annoying, but this article has several problems: the entry for invariant says: "An invariant is something that does not change under a set of transformations". If we expand this, it would give something like "A tensor is a multi-dimensional transformation which doesn't change under a set of transformations" !@?
 * 1) The sentence "A tensor is an invariant multi-dimensional transformation" is very unclear.
 * 2) It is not true that tensors are always transformations (e.g. classical mass, a tensor of rank (0/0); how can this be considered a transformation?)

What I think this article should be (P.S. I'm not a tensor specialist myself, so...):
 * 1) Start by explaining that a tensor is a generalization of the concept of vector and matrices.
 * 2) Then explain that tensors allow physical laws to be expressed in a form that applies in any coordinate system.
 * 3) Say that tensors are heavily used in Continuum mechanics and Theory of relativity (because of the previous point)
 * 4) Introduce the two species: contravariant/covariant, introduce notation and ranks (~= number of indices).
 * 5) Define the contravariant/covariant component by showing how they transform under a change of coordinate system.
 * 6) Special cases:
 * 7) tensors of rank(0/0) => scalars,
 * 8) rank(1/0) => vectors in differential geometry or contravariant vectors in tensor analysis,
 * 9) rank(0/1) => one-forms in differential geometry or covariant vectors in tensor analysis.
 * 10) Give some example: Curvature tensor, Metric tensor, Stress-energy tensor

This article is confusing. First statement: tensors generalise vectors and matrices. Fine: a vector is a 1-dimensional array of numbers, a matrix is a 2-dimensional array of numbers. The only obvious generalisation is to an n-dimensional array of numbers, n integral, possibly > 2. Is that what a tensor is? No.

Second statement: "A tensor is an invariant multi-dimensional transformation". So, it's a transformation? Well, as far as I know, a vector and a matrix can represent a transformation, for example, of coordinates. I'm not sure that representation is identity; however, perhaps this is just quibbling. Provided the representation is unique, that may do for practical purposes.

Definitions: To define $${T}^i$$, we get an equation relating $$\bar{T}^i$$ and $$\bar{T}_i$$. Where is the definiend? Supposing that was meant to be $$\bar{T}^i$$, the term $$\bar{T}_i$$ on the right is undefined. And why must a transformation necessarily take the form of a partial derivative? Similar problems afflict the second definition.
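For reference, a sketch of the transformation rules the definitions were presumably aiming at (with $$x$$ the old and $$\bar{x}$$ the new coordinates, and summation over the repeated index $$j$$ implied):

 * $$\bar{T}^i = \frac{\partial \bar{x}^i}{\partial x^j}\, T^j, \qquad \bar{T}_i = \frac{\partial x^j}{\partial \bar{x}^i}\, T_j$$

The partial derivatives enter because a general coordinate change is nonlinear, and its Jacobian supplies the linear transformation of components at each point; the transformation itself need not "be" a partial derivative.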

Yahya Abdal-Aziz - 2003/04/29. -

I agree with the two sentiments expressed above, this article is very unclear. I think it is best to merge it with tensor, and redirect. AxelBoldt 17:18 May 1, 2003 (UTC)

The reason it was split from 'tensor', is that it is a different treatment. Merging the two would be like merging 'Marxism' with 'Plato's Republic'. It would be very confusing, and also would make for an excessively long article. If this article is unclear, then the last thing it needs is to be entangled in a mass of ideas related to another procedure altogether, and especially one which is currently far more unclear.

Instead of trying to push a political agenda, maybe we should attempt to clearly present the information, with a focus on developing concepts (as opposed to constructing a rigorous, self-referential mathematical soup). Kevin Baas

I don't understand where your reference to a political agenda comes from. The modern and the classical approach are really talking about the same thing, but from different angles. They need to be explained in the same article so that the reader sees how and why they are about the same thing. By the way, the coordinate approach is already mentioned in tensor, just not very prominently. AxelBoldt 23:20 May 1, 2003 (UTC)

The political agenda reference was not pointed at you. Since the classical method is being deprecated, there are some who think that it should be abandoned altogether. It is very difficult, however, to learn the modern approach without a developed geometric intuition, which the classical approach does a good job of establishing. For this and other reasons, and especially since this is an encyclopedia, information on the classical approach should not be withheld or repressed, even if its usage is 'deferred'. There used to be a classical approach presented in the tensor section, but it was 'phased out'; replaced.

You obviously are in agreement with me that both methods must be presented. We are, then, only in disagreement on a more subtle point: whether the two treatments (which are, of course, about the same thing) should be presented simultaneously or in parallel. I would imagine that an encyclopedia in book form would present one in whole, followed by the other in whole, and would not entangle them. If the reader cannot clearly identify their equivalency, which is explicitly stated, clearly marked, and geometrically necessary, then it is clear that they have not geometrically comprehended the material, and thus the manner of presentation is failing. It is of little practicality for one to recognize the equivalence of two things that they cannot understand. However, once they understand them, the equivalence is obvious and trivial.

--Kevin Baas 2003.05.02

Import from PlanetMath
This article text has mostly been replaced by material adapted from the PlanetMath GFDL article on tensors.

Credit: An earlier version of this article was adapted from the GFDL article on tensors at http://planetmath.org/encyclopedia/Tensor.html from PlanetMath, written by Robert Milson and others

Here are some PlanetMath to Wikipedia TeX transformations:


 * \cU should be $$ \mathcal{U} $$
 * \bv should be $$ \mathbf{v} $$
 * \halpha should be $$ \hat{\alpha} $$
 * \ve should be $$ \varepsilon $$
 * \hve should be $$ \hat{\varepsilon} $$

Broken equation
This equation does not work at the moment, so I will paraphrase it for now. Moving it here for future reference when \overbrace is fixed:


 * $$\mathcal{U}^{p,q} = \overbrace{\mathcal{U}\otimes\ldots\otimes\mathcal{U}}^{p\mbox{ times}} \otimes \overbrace{\mathcal{U}^*\otimes\ldots\otimes\mathcal{U}^*}^{q\mbox{ times}}.$$

-

Note: bold alphas do not seem to render properly. The Anome 12:13 2 Jul 2003 (UTC)

Rewritten
ACK! What happened!?! This is unreadable now! The page used to be very clean. Now not only is it bloated, and poorly formatted, but the modern treatment is mixed up in it. No!!! This is completely inaccessible! Who did this!? It must be undone. I'm sure there may be some good pieces in this new part, but that doesn't justify all the other crap. Whoever it was, please revert, or someone else will have to go through the trouble of doing so. And be a little more considerate and less self-righteous next time. Pay more attention to presentation, rather than brute information. Kevin Baas

Also, keep in mind that this is the starting page for the classical treatment, not the final page. You write as if they already knew everything about tensors, in which case, why the hell would they be reading this? Kevin Baas


 * This page is now renamed Intermediate treatment of tensors, and your article is back at Classical treatment of tensors -- please hack away. -- Anon.

Rank of a tensor
It can't be true to say that a rank n tensor is 'simply' a tensor product of rank 1 tensors. In some or other sense it may be a linear combination of that kind of product.

Charles Matthews 09:41, 5 Nov 2003 (UTC)

I think the point is not that an n-rank tensor is necessarily a tensor product of rank 1 tensors, but rather that one can construct an n-rank tensor in this fashion. It should be intuitively clear that, since a tensor product increases the number of necessary dimensions to specify the resultant tensor, the relation does not, in general, work the other way around. If this is not intuitive to the general reader, however, then perhaps a clarification should be appended.

-- Kevin Baas 07:59, 24 Mar 2004 (UTC)

Superscripts and subscripts
There is a small, recurrent contradiction in this article. If a (p,q) tensor has contravariant rank p and covariant rank q, then the subscript and superscript indices have been mixed up. It should be [i1, ..., ip] in the superscript and [j1, ... , jq] in the subscript, right?

--Anon 19:00, 10 Apr 2004 (EST)


 * I've always understood the prefixes contra- and co- to refer to a relationship between two things. I've understood contravariant to be a relationship between two tensors (not an absolute property of one), such that one of the tensors is superscripted and the other is subscripted, resulting in an inner product operation (which would involve a metric tensor in Riemannian geometry). Covariant, on the other hand, would be where both tensors are superscripted or both are subscripted, resulting in something like a diffeomorphism. Or an outer product, depending on whether indices are shared between (or among) the tensors. Kevin Baas 18:05, 11 May 2004 (UTC)
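For reference, the convention the anonymous question assumes is the standard one: for a type-(p, q) tensor, the p contravariant indices are written as superscripts and the q covariant indices as subscripts,

 * $$T^{i_1 \ldots i_p}{}_{j_1 \ldots j_q}$$

so superscripting the j's and subscripting the i's would indeed be a mix-up.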

Page copied
This page has been stolen by

http://encyclopedia.thefreedictionary.com/Tensor-classical

- is there anything the wikipedia community can do about this? —Preceding unsigned comment added by 203.10.231.231 (talk • contribs) 01:15, 11 May 2004
 * No, information here can be freely copied.

Degrees of freedom of a tensor
Is "the total number of degrees of freedom required for the specification of a particular tensor"
 * "the dimension of the tensor raised to the power of the tensor's rank."

or
 * "the dimension of the tensor times the tensor's rank."?

Surely if the tensor is the tensor product of lots of rank-1 tensors, each with d.o.f. = dimension, it's the latter? On the other hand, if we think about the representing array, it's the former, but am I missing something here? -- The Anome 16:22, Jun 2, 2005 (UTC)

Rereading the article, it surely must be the latter whichever way we view it, that is, $$n(p+q)$$, not $$n^{(p+q)}$$? Or have I completely gone round the twist? -- The Anome 16:35, Jun 2, 2005 (UTC)


 * So, in relativity say, dimension = 4 (space-time dimensions). A curvature tensor has four indices. What does that mean? It means we have 4x4x4x4 components to account for. The exponential is right, therefore. Charles Matthews 16:52, 2 Jun 2005 (UTC)
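The counting can be checked with a quick sketch (the helper name `components` is purely illustrative):

```python
# Number of independent components of a type-(p, q) tensor in n dimensions.
# Each of the p + q indices independently ranges over n values, so the
# component count is n**(p + q), not n*(p + q).
def components(n, p, q):
    return n ** (p + q)

# Curvature tensor in 4-dimensional spacetime: four indices,
# 4 x 4 x 4 x 4 = 256 components, as Charles Matthews says.
print(components(4, 1, 3))  # 256
# The additive guess would give only:
print(4 * (1 + 3))          # 16
```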

discussion at Wikipedia talk:WikiProject Mathematics/related articles
This article is part of a series of closely related articles for which I would like to clarify the interrelations. Please contribute your ideas at Wikipedia talk:WikiProject Mathematics/related articles. --MarSch 14:11, 12 Jun 2005 (UTC)

"Beginners Mind"
I suspect that I am the target audience for this (and similar) articles. So, I thought I would share my perspective with you.

Before I get into all of that, let me say how much I appreciate that "you people" (the smart guys who are writing all of this stuff) have taken the time to share their knowledge in this forum. This is truly a fantastic resource, and demonstrates how the Internet can be used for a great and constructive purpose.

So that you can understand where I am coming from, I will give you a little background on myself. Many years ago, I got a BS in engineering, but I don't work professionally as an engineer. Since I was a kid, I have been fascinated by science (primarily physics) and have read many popular accounts of physical theories (relativity, QED, etc). In fact, I am currently reading Roger Penrose's book (Road to Reality). Trying to understand some of the concepts in that book is what brought me to this page.

In any event, I have a basic background in math and science (a couple of years of calculus, diff. eq., classical physics, etc.) but nothing very advanced. For example, while I have heard the terms, I never studied manifolds, eigenvalues, tensors, groups, etc. So, that is where I am. Would I not be your target audience?

Now, on to my comments (which apply to all of the entries on tensors):

(1) I would urge you to present some very simple and "intuitive" example of tensors (or whatever concept you are trying to explain) early in the non-technical introduction. For example, the page on eigenvectors presents a marvelous and very intuitive example (the "face") early in the presentation. If a tensor is "like" a matrix, show me a tensor, and the corresponding matrix. How do I get from the tensor to the matrix, etc.

(2) I would suggest that you gently introduce other highly technical supporting subject matter by limiting the number of cross links early in the discussion. Of course, when the subject itself is complex (like tensors), there are obviously going to be certain prerequisite areas of knowledge. Perhaps it would be best to explicitly state that the reader should be familiar with x, y, and z (with supporting links) in order to understand the following discussion. I would keep this list as short as possible.

(3) I know that this is a somewhat nebulous concept, but I would try to cultivate a "beginners mind" when sharing your knowledge. In other words, keep in mind that a lot of us have no idea of what you are talking about once you get beyond basic algebra (especially in reference to mathematical formalisms). For example, the "dual vector space U*" is introduced with no explanation and no cross-link. I still don't know what this is.

(4) Flesh out the sections on notation. Half the battle is just understanding the notation. For example, the main tensor page has a section on the various notations. I assume that the "Abstract Index Notation" is the "main" notation used nowadays, because it is listed first. But when you go to that stub, it doesn't actually show you the notation at all. It just says that Penrose invented it and it is "better." Similarly, if you follow the next link to the Einstein notation, you learn about Einstein's summation convention, but nothing about the basic tensor notation. Finally, after a lot of jumping around, I found the "intermediate" page, which gives the best explanation of the basic notation (although I am still scratching my head after following the links to the covariant and contravariant links).

(5) I do think that the whole tensor section has gotten a little too diffuse and disconnected (with the whole "classical," "intermediate" and "modern" approaches). Perhaps this is necessary, but it makes things rather confusing when you are first approaching the subject like I am.

I know that these must be difficult concepts to convey, and I appreciate your efforts. —The preceding unsigned comment was added by Kyletownsend (talk • contribs) 05:10, 2006 November 2.
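Point (4) above mentions Einstein's summation convention; as a quick illustration (toy numbers, names purely illustrative): a repeated upper/lower index pair implies a sum, so $$v_i = g_{ij} v^j$$ abbreviates $$v_i = \sum_j g_{ij} v^j$$. Written out as explicit loops:

```python
# Lowering an index with a metric: v_i = g_{ij} v^j, with the sum over the
# repeated index j written out explicitly.
def lower_index(g, v):
    n = len(v)
    return [sum(g[i][j] * v[j] for j in range(n)) for i in range(n)]

g = [[1, 0], [0, -1]]     # toy 2D "metric" with signature (+, -)
v = [3, 5]                # contravariant components v^j
print(lower_index(g, v))  # [3, -5]
```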


 * There was plenty of noisy discussion years ago. This and the other pages formed would now be called 'POV forks', a somewhat deprecated notion. We might revisit the whole issue. (Please sign with ~ ). Charles Matthews 10:10, 2 November 2006 (UTC)


 * This page (the Classical Treatment of Tensors) is impossible to understand from the perspective of someone (me) with a B.S. in Math but who does not already know what a tensor is. Also, why does the discussion button for the "classical treatment of tensors" redirect to a page titled "talk:intermediate treatment of tensors"? Halberdo (talk) 03:37, 16 January 2009 (UTC)

Brackets in indices
I notice the indices are enclosed in brackets. In the classical notation, square brackets mean skew-symmetrize and round brackets mean symmetrize. Why are they here? Billlion 16:01, 13 May 2007 (UTC)


 * I imagine it's meant to be meta-syntactical. I suppose they can just be taken out, because people probably get that they can put anything there. A better question: why does the classical treatment of tensors's talk page redirect to this one? Kevin Baastalk 16:56, 27 January 2009 (UTC)
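For reference, the classical conventions Billlion alludes to are, for a rank-2 covariant tensor,

 * $$T_{(ij)} = \tfrac{1}{2}\left(T_{ij} + T_{ji}\right), \qquad T_{[ij]} = \tfrac{1}{2}\left(T_{ij} - T_{ji}\right)$$

with a factor $$\tfrac{1}{p!}$$ and a (signed, for square brackets) sum over permutations in the general p-index case.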

General tensor
It seems to me there is a mistake here, a general tensor is not always the product of rank 1 tensors. Ylebru 11:23, 27 July 2007 (UTC)
 * Really?? I suppose a tensor of rank 0 would not be the product of rank 1 tensors, but can you name any other examples?   Saying it's not always the product is like saying that a natural number is not always the sum of adding 1 with itself multiple times.  And that directly contradicts the definition of the set of natural numbers. Kevin Baastalk 16:53, 27 January 2009 (UTC)
 * It could be the sum of such tensors, couldn't it (as opposed to being simply an outer product of vectors and 1-forms)?Dependent Variable (talk) 17:13, 17 September 2009 (UTC)
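Ylebru's point can be made concrete with a small sketch (pure Python, names illustrative). For a 2×2 matrix, being a single outer product $$u \otimes v$$ is equivalent to having zero determinant (rank at most 1); the identity matrix is the sum $$e_1 \otimes e_1 + e_2 \otimes e_2$$ but is not itself a single outer product:

```python
# A 2x2 matrix T is a single outer product u (x) v, i.e. T[i][j] = u[i]*v[j],
# exactly when its determinant vanishes (rank <= 1).
def is_outer_product_2x2(T):
    return T[0][0] * T[1][1] - T[0][1] * T[1][0] == 0

outer = [[u * v for v in (3, 4)] for u in (1, 2)]  # (1, 2) (x) (3, 4)
identity = [[1, 0], [0, 1]]                        # e1 (x) e1 + e2 (x) e2

print(is_outer_product_2x2(outer))     # True
print(is_outer_product_2x2(identity))  # False
```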

gobbledy-gook

 * "''In general, the value of a tensor field at an event in spacetime is an element of a vector space which is the tensor product of multiple copies of the tangent space (contravariant vectors) and multiple copies of the cotangent space (covariant vectors). As such, it is a smooth (C∞) mapping from the base space of a vector bundle to the total space which when projected back onto the base space has returned to its starting point."

Umm, yeah... what he said.

I think what he means to say is that you look nice today. (Forgive him, he doesn't know the dialect very well.)

Can we put things in english or not at all please? Thank you. Kevin Baastalk 17:00, 27 January 2009 (UTC)

Typo in indices of equation?
Shouldn't the equation in the section Transformation Rules


 * $$\hat{T}^{i_1\ldots i_p}_{\,j_1\ldots j_q} = A^{i_1} {}_{k_1}\cdots A^{i_q} {}_{k_q} B^{l_1} {}_{j_1}\cdots B^{l_p} {}_{j_p} T^{k_1\ldots k_p}_{l_1\ldots l_q}$$

include a typo in the indices? Shouldn't it rather be


 * $$\hat{T}^{i_1\ldots i_p}_{\,j_1\ldots j_q} = A^{i_1} {}_{k_1}\cdots A^{i_p} {}_{k_p} B^{l_1} {}_{j_1}\cdots B^{l_q} {}_{j_q} T^{k_1\ldots k_p}_{l_1\ldots l_q}?$$

Jamesmelody (talk) 09:30, 26 March 2009 (UTC)


 * Also $$A$$ and $$B$$ are swapped. Has been corrected. -- Theowoll (talk) 18:35, 9 January 2010 (UTC)