Talk:Tensor product/Archive 1

Ack! Too hard!!! My brain hurts! Can someone rewrite this so that a mere physicist like myself can understand it? -- Tim Starling 04:15 Mar 4, 2003 (UTC)

Hmm... there seems to be a section under matrix multiplication on Kronecker product/direct product, which is the same thing as tensor product as far as I know, with a slightly different definition (the rank is neglected). I know that my definition is correct (the rank is relevant). The question is, how do we remedy this duplication? Kevin Baas 2003.03.14

How's that for a start? See people, like this. Why are mathematicians bad prose writers? Sigh... I don't mean to invade on this. It just needs to be written clearly. Kevin Baas 2003.03.14


 * Thank you, yes it's certainly a good start. I like the split format ("pretentious part starts here") - it's ugly enough to encourage contributors to properly integrate the article.


 * As for duplicity, personally I like repetition in Wikipedia. After all, there's essentially no space limit and a huge pool of contributors, so you may as well give the reader the most specific, tailor made information possible. I think it would be enough to explain the connection in both articles, with links of course. Since matrix multiplication is a more general subject, the Kronecker product section should be kept brief, perhaps indicating that more information is to be found here. See Consolidating v/s breaking up. -- Tim Starling 06:57 Mar 15, 2003 (UTC)


 * Certainly, the way the article was a year ago (all defs, results; no explanation or motivation) was inadequate. But just because an adequate introduction or motivation is missing doesn't mean the formal presentation is necessarily "esoteric" or "pretentious". The "theoretical" presentation part of the article is a pretty common style for math writing (although it could be improved), and it lays out the basic defs of tensor products. Maybe the category theory part (the "universal property" stuff, etc.) could be separated out; category theory is common in mathematical presentation, but maybe not to physicists or others. Revolver 20:45, 19 Feb 2004 (UTC)

The statement about the rank can't possibly be true:
 * $$\begin{bmatrix}a\end{bmatrix} \otimes \begin{bmatrix}b\end{bmatrix} = \begin{bmatrix}ab\end{bmatrix}$$

or still worse:
 * $$\begin{bmatrix}0\end{bmatrix} \otimes \begin{bmatrix}b\end{bmatrix} = \begin{bmatrix}0\end{bmatrix}$$

Also, I think the example should be given as a tensor product of matrices from which the vector case can be derived easily. -- looxix 14:12 Mar 15, 2003 (UTC)

---

Maybe a tensor with 1 dimension has undefined rank? This is no disproof. I have very authoritative sources that verify the fact that the ranks are summed by a tensor product. Can you tell me whether [0] has rank 1 or 2, or even 7? It's a 1x1x1x1x1x1.... tensor. But isn't 1 also $$e^{i \cdot 2\pi}$$?

The 'rank' of a tensor is actually not defined by the columns, rows, etc., but by an equation such as:

$$\bar{T}^{ij} = T^{rs}\frac{\partial \bar{x}^i}{\partial x^r}\frac{\partial \bar{x}^j}{\partial x^s}$$

where the rank of such T is 2 because it has order 2, regardless of how many $$x^i$$'s or $$x^j$$'s there are.

kevin -2003.03.15


 * Nothing is wrong if the rank is the rank_(tensor), but it is linked to Rank of a matrix, which is (loosely speaking) the degree of linear dependency of a matrix and was what I talked about in my remark. -- looxix 17:25 Mar 15, 2003 (UTC)


 * That's my fault -- Kevin linked to rank and I disambiguated. I haven't studied tensors before, so I wouldn't know one from the other. -- Tim Starling 23:02 Mar 15, 2003 (UTC)


 * I fixed that. looxix, I don't understand what you mean by "giving the example as a tensor product of matrices from which the vector result can be derived easily" - how would I express a tensor with rank > 2 without using embedded matrices, which would be potentially confusing? -- Kevin Baas -2003.03.15
 * What I meant was an example like:
 * $$\begin{bmatrix}a_{11} & a_{12} & a_{13}\\a_{21} & a_{22} & a_{23}\end{bmatrix} \otimes B = \begin{bmatrix}a_{11}B & a_{12}B & a_{13}B\\a_{21}B & a_{22}B & a_{23}B \end{bmatrix} = \dots$$
 * But this is in fact what is called matrix direct product, sometimes also called matrix tensor product.
 * Better would be a general formula such as:
 * $$A_{ij\dots}^{k\dots} \otimes B_{m\dots}^{npq\dots} = C_{ijm\dots}^{knpq\dots} \Rightarrow c_{ijm\dots}^{knpq\dots} = a_{ij\dots}^{k\dots} \times b_{m\dots}^{npq\dots}$$

This "matrix direct product" is also called a "Kronecker product". Michael Hardy 01:22 Mar 16, 2003 (UTC)

---

The way it is defined in matrix multiplication, there is a slight difference between the matrix direct product/kronecker product and the tensor product, namely regarding the rank as previously discussed.

looxix's idea above, at least the one with the matrix, may be a good idea. What I'm most concerned about is visualization. I don't want just a simple formula that means nothing. Of course, they should know the formula and be able to work with it fluently, but they should also be able to forget it and then reconstruct it from the visualization. The general formula seems like it would be a bit confusing to someone not comfortable with tensor calculus. It is important and should be on the page, but I think the visualization should take higher priority.

In any case, Tensor-classical has rewrite suggestions in the talk. I think this should be carried through with first, as it is the groundwork/pinnacle of the related pages.

Kevin Baas -2003.03.15

In the sentence "Universal property of tensor product: The space of all multilinear maps from V × W to R is naturally isomorphic to the space of all linear maps from V ⊗ W to R"...can "naturally" be removed? Kingturtle 05:15 May 5, 2003 (UTC)


 * Not really. "Naturally" here has a precise meaning in terms of category theory; it's a technical term, not a subjective description. Revolver 20:45, 19 Feb 2004 (UTC)

I've added something about the need to complete with tensor products of Hilbert spaces - but this is tentative since it isn't really one of my strong fields.

Charles Matthews 15:45, 1 Mar 2004 (UTC)

I'm just a computer programmer, not a mathematician. Will someone please tell me how to compute the values


 * $$( S^{\left[i_1,i_2,\dots,i_p\right]}_{\left[j_1,j_2,\dots,j_q\right]} \otimes T^{\left[k_1,k_2,\dots,k_r\right]}_{\left[l_1,l_2,\dots,l_s\right]} )^{whatever}_{whatever}$$

Do S and T have to be of the same order? I.e., if $$i_1, \dots, j_1, \dots$$ can vary between 1 and 3 (0 and 2 for us programmers), does that mean that $$k_1, \dots, l_1, \dots$$ etc. must be the same?


 * Computationally, there should be no problem taking two arrays A and B and forming all products a[i]b[j], indexed by a pair [i,j] where i is an index from A and j an index from B. One can do this with any arrays containing the same kind of scalars. You could say that the whole point of tensor theory is to understand what this operation does for you. Charles Matthews 05:53, 11 May 2004 (UTC)

So we are saying that


 * $$( S \otimes T)^{\left[i_1,i_2,\dots,i_p, k_1,k_2,\dots,k_r\right]}_{\left[j_1,j_2,\dots,j_q, l_1,l_2,\dots,l_s\right]} = S^{\left[i_1,i_2,\dots,i_p\right]}_{\left[j_1,j_2,\dots,j_q\right]} \times T^{\left[k_1,k_2,\dots,k_r\right]}_{\left[l_1,l_2,\dots,l_s\right]}$$

Right? Does this mean that ordinary matrix multiplication is not the same as the tensor product of two square matrices?


 * Matrix multiplication is a 'contraction' of a 'selection'. If I have two 10x10 matrices to multiply, out of 10000 possible products, only 1000 are relevant, and they are summed 10 at a time to get the entries of the product. Charles Matthews 07:01, 19 May 2004 (UTC)

If a, b, and c are rank-one tensors (i.e. one-dimensional arrays), with indices i, j, k, respectively, then the tensor product of them is a rank-three tensor (i.e. a three-dimensional array):

 for (int i = 0; i < i_dim; i++)
     for (int j = 0; j < j_dim; j++)
         for (int k = 0; k < k_dim; k++)
             result[i][j][k] = a[i]*b[j]*c[k];

If a is a rank-two tensor and b is a rank-one tensor, with indices i & j, and k, respectively, then the tensor product of them is a rank-three tensor:

 for (int i = 0; i < i_dim; i++)
     for (int j = 0; j < j_dim; j++)
         for (int k = 0; k < k_dim; k++)
             result[i][j][k] = a[i][j]*b[k];

Kevin Baas 20:16, 19 May 2004 (UTC)

The upper and the lower indices operate like Einstein summation: when the same index shows up on top and on bottom, that index can be summed out via an inner product. For example, $$a_{ki} b^{jk}$$ =

 for (int i = 0; i < i_dim; i++)
     for (int j = 0; j < j_dim; j++) {
         result[i][j] = 0;
         for (int k = 0; k < k_dim; k++)
             result[i][j] += a[i][k]*b[j][k];
     }

However, one might have to take into account curvature. In Riemannian geometry, this might be done with a metric. Kevin Baas 20:50, 19 May 2004 (UTC)

No curvature involved. Charles Matthews 21:41, 19 May 2004 (UTC)

Would anyone object to me putting the above explanation of tensor product for a programmer into the article? Seeing that a computer, i.e. a Turing machine, is theoretically the embodiment of all possible mathematical structures in Principia Mathematica, I would say it's quite a precise mechanism of description. Also, it's a concise way to be perfectly clear and precise to anyone familiar with programming: they read the code once and say "I get it!". Would anyone object to me making a small section called something like "Tensors in computer programming", wherein would be the above two code snippets? Kevin Baas 21:52, 2004 Jun 21 (UTC)