Wikipedia:Articles for deletion/Matrix representation of tensors


 * The following discussion is an archived debate of the proposed deletion of the article below. Please do not modify it. Subsequent comments should be made on the appropriate discussion page (such as the article's talk page or in a deletion review).  No further edits should be made to this page.

The result was delete. Yunshui 雲 水 09:51, 26 February 2020 (UTC)

Matrix representation of tensors


The sourcing of this article is quite poor (one of the sources is a link to a submission on viXra, of all things, and the other is not even by a career mathematician). The writing contains significant spelling and grammar errors and would require a significant overhaul.

It is even wrong in many cases. For example, we are not required to work in a fixed orthonormal basis. If I apply a change of basis in $$\mathbb{R}^n$$, then the metric tensor simply changes accordingly and we get a correct form of the metric tensor for the new basis. In fact, the point being missed by both cited authors is emphatically that a linear transformation is a (1, 1) tensor while a bilinear form (in particular, including metric tensors) is a type (0, 2) tensor that operates by taking the transpose of one vector before multiplying the result of that by the matrix and the other argument (in this order). I am not that proficient in differential geometry but, given how the rest of our texts on the metric tensor, based on scholarly works, reject the notion that there is something "wrong" with the matrix notation, this article comes off as a WP:COATRACK, specifically a thinly-veiled pushing of this WP:FRINGE view (as evidenced by the unreliability of these two sources). In other words, the whole article appears to be a gross violation of WP:DUE. This is aside from the fact that neither the Christoffel symbols nor the Levi-Civita symbol are tensors.
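The change-of-basis behaviour described above can be sketched in a few lines of Python (a minimal illustration with an assumed 2×2 Minkowski-style metric and an invented basis-change matrix; this example is not taken from the cited sources):

```python
# Under a change of basis with matrix B (new basis vectors as columns),
# the metric's matrix transforms as G' = B^T G B, while coordinate
# columns transform as u' = B^{-1} u. The scalar g(u, v) is unchanged.

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def bilinear(G, u, v):
    """g(u, v) = u^T G v for coordinate columns u, v given as flat lists."""
    return sum(u[i] * G[i][j] * v[j]
               for i in range(len(u)) for j in range(len(v)))

G = [[-1, 0], [0, 1]]        # Minkowski-style metric in 2D
B = [[1, 1], [0, 1]]         # invented (non-orthonormal) change of basis
B_inv = [[1, -1], [0, 1]]    # inverse of B (det B = 1)

G_new = matmul(transpose(B), matmul(G, B))   # G' = B^T G B

u, v = [2, 3], [1, 4]
u_new = [sum(B_inv[i][j] * u[j] for j in range(2)) for i in range(2)]
v_new = [sum(B_inv[i][j] * v[j] for j in range(2)) for i in range(2)]

assert bilinear(G, u, v) == bilinear(G_new, u_new, v_new)  # same scalar
```

The metric's matrix changes, but the value of the bilinear form does not, which is the sense in which the matrix form remains "correct" in the new basis.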

On a broader note, this topic likely is notable, but the problems with the article are so severe that a complete rewrite is needed. Jasper Deng (talk) 08:21, 19 February 2020 (UTC)
 * Note: This discussion has been included in the list of Mathematics-related deletion discussions. Jasper Deng (talk) 08:21, 19 February 2020 (UTC)


 * Delete This article is essentially a content fork of tensor, based on the standard fact that tensors form a vector space and matrices can be identified with (1, 1) tensors. At most, this could be summarized in one or two lines in tensor, if it is not already there (I have not read this article in detail). D.Lazard (talk) 09:09, 19 February 2020 (UTC)


 * Not delete Reply to D.Lazard: Please first read the source articles in detail, and then point out exactly where the error/problem is (it's best to use an example). In the first source (a web page), the author clearly shows where the problem is, using the metric tensor from relativity theory as an example; please study this and point out where the problem with the author's logic is. Kamil Kielczewski (talk) 11:02, 19 February 2020 (UTC)


 * Comment Reply to Jasper Deng: You write:
 * (...) while a bilinear form (in particular, including metric tensors) is a type (0, 2) tensor that operates by taking the transpose of one vector before multiplying the result of that by the matrix and the other argument (in this order). So I understand that $$[T_{ij}]\cdot[v^i]$$ is "changed" to $$[T_{ij}]\cdot[v_i]$$ (there is still a problem with matrix multiplication, since it is forbidden to multiply a matrix on the right by a row vector, and also a problem with the variance of the result in this approach), and some implicit transposition is imposed. I don't think this is true; can you provide a source which shows/proves your words (or give an explicit proof)? If you have not read my sources in detail, please do so: they point out (in the example with the metric tensor from relativity theory) that the problem with the "wrong matrix notation" is that it gives the wrong result (variance) AFTER multiplication. As far as I know, something like "implicit transposition" doesn't exist (unless you want to introduce it as some kind of complication). The sources are simple and clear and they show examples; you can verify them using simple logic even if you don't feel, in your words, "that proficient in differential geometry". Kamil Kielczewski (talk) 11:02, 19 February 2020 (UTC)


 * I still stand by my statements, especially as more reputable sources universally agree on this expression with the transpose. I taught undergraduate multivariable calculus and have a strong understanding of matrix math, and all our textbooks on linear algebra use this convention for bilinear forms. Of course you can left-multiply a square matrix by a row vector of the same length: it’s the same as taking the transpose of the matrix, multiplying by the original column vector, and taking the transpose of the result. A general bilinear form can be given as $$\mathbf{x}^T \mathbf{A}\mathbf{y}$$, and the matrix you see as the "metric tensor" serves the role of A here. See first fundamental form for an explicit example. Remember, we're not looking for a row vector per se, but rather, a member of the dual space of the original vector space. The expression obtained by holding y constant here is scalar-valued and linear in x, therefore it is a covector; taking the transpose of $$\mathbf{Ay}$$ yields the desired row vector representation. If you cannot understand something as basic as this, then you really are in no position to be assessing correctness of content in this field. But that aside, you have an even more fundamental problem. These sources you cited are not reliable, and unjustifiably assume that all tensors operate by simple one-sided matrix multiplication, and somehow that this is the sole matrix representation of a tensor. This is wrong: the metric tensor is a clear example of where both the row and column indices are of the same kind. You also failed to address the vast number of spelling and grammar mistakes, which I am not going out of my way to fix.--Jasper Deng (talk) 11:57, 19 February 2020 (UTC)
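The transpose convention described in this comment can be sketched as follows (a minimal Python illustration; the 2×2 matrix and vectors are invented for the example):

```python
# A bilinear form x^T A y: first form the column vector A y, then pair it
# with x. Read as a row vector, (A y)^T is the covector x |-> g(x, y).
# Left-multiplying A by the row vector x^T gives the same pairing.

def matvec(A, v):
    """Matrix times column vector, both as nested/flat lists."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A = [[1, 2], [3, 4]]          # invented bilinear form
x, y = [5, 6], [7, 8]

Ay = matvec(A, y)             # column vector A y
value = dot(x, Ay)            # x^T (A y): scalar, linear in x

# Row-vector-on-the-left form: x^T A = (A^T x)^T, then pair with y.
A_T = [list(col) for col in zip(*A)]
xTA = matvec(A_T, x)
assert dot(xTA, y) == value   # same scalar either way
```

This is the sense in which "left-multiplying by a row vector" and "transposing the matrix, multiplying, and transposing back" are the same operation.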
 * I agree with 's comment. It is not a task for a Wikipedia editor to verify correctness and value of cited articles. It is the work of journal editors and reviewers. Here both sources have never been reliably published, nor cited in WP:secondary sources. So they are original research (Wikipedia meaning), and the content of the article must be deleted per Wikipedia policy WP:NOR. The title of the article must also be deleted (that is the article must not be transformed into a redirect) since it is confusing for non experts because of the implicit confusion between arrays and matrices, which are different concepts, although related. Thanks to  whose "replays" make clearer that the content of the article is WP:OR. D.Lazard (talk) 12:04, 19 February 2020 (UTC)


 * The problem with the "standard" but wrong matrix representation (or array representation) is very simple; let's look at the rank-2 (2D array) case:
 * $$\begin{bmatrix} -1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
 * For this representation you cannot determine the tensor variance (it loses information): you don't know whether this is $$g_{ij}$$, $$g^{ij}$$ or $$g_i^{\ j}$$. This error occurs in many literature sources. The notation presented in the article solves this problem.
 * I don't care about this perceived nonstarter of a problem. True, a bare matrix could represent type (2, 0), (1, 1) (linear maps), or (0, 2) (bilinear forms). But remember that matrices are always with respect to a (set of) bases. In this case, it is always clear from context what kind of tensor is meant here (and super- and sub-script notation will always be used in complicated cases); if the metric tensor is meant, you know its output is a scalar, and therefore it is of type (0, 2), and thus the inverse of the tensor is of type (2, 0). It really isn't that complicated. There's no rule saying that we must encode the information of tensor type in the dimensions of a matrix representation. I perceive this issue as more you simply not having learned enough about tensors, and thus being confused. Ask at WP:RDMA if this still isn't clear to you, but repeating your point will do nothing to sway us, and cannot substitute for actual reliable sources.--Jasper Deng (talk) 12:41, 19 February 2020 (UTC)
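The point made above, that the tensor type is fixed by context rather than by the bare array, can be illustrated with the transformation laws themselves (an assumed 2×2 example with invented matrices; the same numbers transform differently depending on which type is meant):

```python
# The same array of numbers, under a change of basis B, transforms as
#   B^{-1} A B  if read as a (1,1) tensor (a linear map), but as
#   B^T A B     if read as a (0,2) tensor (a bilinear form).
# The two results generally differ, so the type must come from context.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

A = [[-1, 0], [0, 1]]        # one bare array of numbers
B = [[1, 1], [0, 1]]         # invented change of basis
B_inv = [[1, -1], [0, 1]]    # inverse of B

as_linear_map = matmul(B_inv, matmul(A, B))            # (1,1) law
as_bilinear_form = matmul(transpose(B), matmul(A, B))  # (0,2) law

assert as_linear_map != as_bilinear_form
```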
 * I think that a representation which loses information about the object it represents is a bad representation (and often leads to confusion)
 * That's your opinion, you are entitled to it, but it will not be Wikipedia's voice since you have failed to satisfy the verifiability requirements, and the standard form has been used without confusion by greats such as Stephen Hawking, so I doubt it causes any significant confusion. We are not here to right the great wrongs you perceive.--Jasper Deng (talk) 13:47, 19 February 2020 (UTC)


 * Draftify and then later send through AfC. This is a brand-new article, not yet ready for mainspace, and any issues can be hashed out through usual editing processes and discussion. --JBL (talk) 13:09, 19 February 2020 (UTC)
 * Although the fact that Vixra is being used is a very bad sign. --JBL (talk) 13:11, 19 February 2020 (UTC)
 * (Hence why I deemed the article unsalvageable).—Jasper Deng (talk) 13:15, 19 February 2020 (UTC)
 * Delete Relies upon viXra -> beyond recovery. XOR'easter (talk) 13:36, 19 February 2020 (UTC)
 * Because the words in the title are frequently used in conjunction with one another, a casual Google/Google Scholar search might give the impression that the topic is wiki-notable. But this article violates policy, there is literally nothing in it that can be salvaged, and we don't need an article with this title when tensor exists. XOR'easter (talk) 17:05, 19 February 2020 (UTC)


 * Delete The writing is sub-draft-level, it presents a POV "basic principle" which it then violates for tensors of order >= 3, etc. Even though this article exists to go into specifics, the treatment in Tensor is better. Mgnbar (talk) 14:56, 19 February 2020 (UTC)
 * Delete. I see nothing here that would potentially be salvageable by allowing incubation as a draft. -- Kinu t/c 15:34, 19 February 2020 (UTC)
 * Merge or Delete I agree there isn't much in the way of reliable sources given; a cursory Google search did turn up an article in International Journal of Mathematical Education in Science and Technology, but it seems to be locked behind a $50 paywall. In any case, I didn't see any secondary sources. D.Lazard suggested that if anything is salvageable here it could be added to the Tensor article, and I tend to agree, but, reference issues aside, most of the material seems too WP:TEXTBOOKy for WP. --RDBury (talk) 16:42, 19 February 2020 (UTC)
 * I can't get through the paywall from where I am at the moment, but that paper has accumulated a grand total of 2 citations since 1984, suggesting that it was either wrong, boring, or both. If nobody else cared, we shouldn't either. XOR'easter (talk) 17:01, 19 February 2020 (UTC)
 * I was able to, and while this is a promising text in terms of reliability, it does not at all support what the article author has been trying to insert into the article and appears to be an uncommon (minority) formulation. Email me (Special:EmailUser/Jasper Deng) if you'd like a copy. One source does not notability make.--Jasper Deng (talk) 23:10, 19 February 2020 (UTC)


 * DELETE. A (0,2) tensor can be represented as a matrix.  The rest of the article is unsourced, trivial, or misleading.  (Invited from WT:MATH.)  — Arthur Rubin  (talk) 18:11, 19 February 2020 (UTC)
 * Hands down. — Kamil Kielczewski (talk) 18:46, 19 February 2020 (UTC)
 * viXra as a source? Delete. Headbomb {t · c · p · b} 22:45, 23 February 2020 (UTC)


 * Delete. Appears to be a young student's sudden realization that a rank-3 tensor is a cube of numbers, and an awkward attempted articulation thereof. The author appears unaware of the use of $$e_i$$ as a standard notation for basis vectors, which he appears to be re-inventing, de novo, with non-standard and awkward notation. From what I can tell, after a quick glance, most of what is written there is conceptually correct; it just doesn't connect with standard textbooks (taught in sophomore-year college, last I looked). As such, it doesn't convey the "a-hah" moment that students need to have. At any rate, in the first non-trivial example, the Christoffel symbols aren't even a tensor, as is well-known. 67.198.37.16 (talk) 00:57, 25 February 2020 (UTC)
 * Actually, using a cube of numbers to represent a rank-3 tensor is not good because it loses information about tensor variance; e.g., if you have a cube of numbers then you cannot deduce whether you are dealing with $$T_{ijk}$$, $$T^i_{\ jk}$$, $$T^{ijk}$$, ... This kind of representation is just wrong (however, in a Cartesian coordinate system it is valid because we do not distinguish co-/contra-variance there). — Preceding unsigned comment added by Kamil Kielczewski (talk • contribs) 23:29, 25 February 2020 (UTC)


 * Delete- viXra articles as sources? No, this goes against the principle that Wikipedia articles need to be based on information in reliable, secondary sources. Reyk YO! 06:26, 25 February 2020 (UTC)


 * The above discussion is preserved as an archive of the debate. Please do not modify it. Subsequent comments should be made on the appropriate discussion page (such as the article's talk page or in a deletion review). No further edits should be made to this page.