Talk:Determinant

Right-handed coordinate system
The following sentence is not clear: "The determinant of a set of vectors is positive if the vectors form a right-handed coordinate system, and negative if left-handed." What does "right-handed coordinate system" mean? The "coordinate system" article does not mention it. amit man

linear algebra/analytic geometry
linear independence/collinearity, Gram determinant, tensor, positive definite matrix (Sylvester's criterion), defining a plane, Line-line intersection, Cayley–Hamilton_theorem, cross product, Matrix representation of conic sections, adjugate matrix, similar matrices have the same determinant (Similarity invariance), Cauchy–Binet formula, Trilinear_coordinates, Trace diagram, Pfaffian

types of matrices
special linear group, special orthogonal group, special unitary group, indefinite special orthogonal group, modular group, unimodular matrix, matrices with multidimensional indices

number theory/algebra
Pell's equation/continued fraction?, discriminant, Minkowski's theorem/lattice, Partition_(number_theory), resultant, field norm, Dirichlet's_unit_theorem, discriminant of an algebraic number field

geometry, analysis
conformal map?, Gauss curvature, orientability, Integration by substitution, Wronskian, invariant theory, Monge–Ampère equation, Brascamp–Lieb_inequality, Liouville's formula, absolute value of complex numbers and quaternions (see 3-sphere), distance geometry (Cayley–Menger determinant), Delaunay_triangulation

open questions
Jacobian conjecture, Hadamard's maximal determinant problem

algorithms
polar decomposition, QR decomposition, Dodgson_condensation, Matrix_determinant_lemma, eigendecomposition. A few papers: Monte Carlo for sparse matrices, approximation of determinants of large matrices, The Permutation Algorithm for Non-Sparse Matrix Determinant in Symbolic Computation, Determinant Approximations

examples
reflection matrix, Rotation matrix, Vandermonde matrix, Circulant matrix, Hessian matrix (Blob_detection), block matrix, Gram determinant, Elementary_matrix, Orr–Sommerfeld_equation, det of Cartan matrix

generalizations
Hyperdeterminant, Quasideterminant, Continuant (mathematics), Immanant of a matrix, permanent, Pseudo-determinant, determinants of infinite matrices / regularized determinant / functional determinant (see also operator theory), Fredholm determinant, superdeterminant

other
Determinantal point process, Kirchhoff's theorem

Precise definition in the introduction?
Sorry for having attempted this substantial edit without prior discussion! My main desire is to add to the introduction at least one definition that is uniquely true of the determinant.

Right now, the introduction doesn't define the determinant, though precise definitions do exist. Instead it just makes some statements which are true of many objects:

- It is a scalar function of a square matrix

- It characterizes some properties of that matrix. (This is a bit vague and contentless)

- It is nonzero only on invertible matrices and distributes over matrix multiplication (also true of any multiplicative function of the determinant, such as the square)

Can I lobby for at least one crisp, technical, honest-to-goodness *definition* of the determinant? For example, "the determinant is the product of the full set of complex eigenvalues of a matrix, with multiplicity." What do you think? Cooljeff3000 (talk) 12:09, 2 July 2024 (UTC)
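For what it's worth, the proposed eigenvalue characterization is easy to check numerically in the 2×2 case, where the eigenvalues are the roots of $t^2 - (a+d)t + (ad-bc) = 0$ and so their product is exactly $ad - bc$ (a minimal pure-Python sketch; the matrix entries are arbitrary illustrative values):

```python
import math

# Arbitrary illustrative 2x2 matrix [[a, b], [c, d]].
a, b, c, d = 4.0, 2.0, 1.0, 3.0

det = a * d - b * c  # determinant of [[a, b], [c, d]]
trace = a + d

# Eigenvalues are the roots of t^2 - trace*t + det = 0 (quadratic formula).
disc = math.sqrt(trace * trace - 4.0 * det)
lam1 = (trace + disc) / 2.0
lam2 = (trace - disc) / 2.0

print(det, lam1 * lam2)  # both 10.0: det equals the product of the eigenvalues
```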


 * As you need the determinant to define "the full set of complex eigenvalues of a matrix, with multiplicity", such a circular definition does not belong in the lead. Per WP:TECHNICAL, a definition is suitable for a lead only if it can be understood by non-specialists.
 * The least technical definition of a determinant that I know is the following: the determinant is the unique function of the coefficients of a square matrix such that the determinant of a product of matrices is the product of their determinants, and the determinant of a triangular matrix is the product of its diagonal entries.
 * This definition implicitly uses the fact that every matrix is similar to a triangular matrix. As these diagonal entries are clearly the eigenvalues of the initial matrix, your definition follows immediately.
 * Personally, I do not find this definition suitable for the lead, as there are many other equivalent definitions, and this equivalence clearly does not belong in the lead.
 * So the best thing seems to be not to change the structure of the lead. D.Lazard (talk) 13:16, 2 July 2024 (UTC)
 * Finally, the determinant is completely characterized by the fact that the determinant of a product of matrices is the product of the determinants and that the determinant of a triangular matrix is the product of its diagonal entries. I have added this, with a footnote explaining that this results from Gaussian elimination. D.Lazard (talk) 18:12, 2 July 2024 (UTC)
 * The Gaussian elimination section seems redundant with the following section, since it is essentially equivalent to LU decomposition of the matrix. –jacobolus (t) 20:00, 2 July 2024 (UTC)
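To illustrate the connection to Gaussian elimination mentioned above: reducing the matrix to upper-triangular form and multiplying the diagonal entries (with a sign flip per row swap) does compute the determinant (a minimal pure-Python sketch; the 3×3 test matrix is an arbitrary example):

```python
def det_gauss(m):
    """Determinant via Gaussian elimination: reduce to upper-triangular
    form, then multiply the diagonal entries, flipping the sign on each
    row swap (partial pivoting)."""
    a = [row[:] for row in m]  # work on a copy
    n = len(a)
    sign = 1.0
    for k in range(n):
        # Choose the largest pivot in column k for numerical stability.
        p = max(range(k, n), key=lambda i: abs(a[i][k]))
        if a[p][k] == 0.0:
            return 0.0  # singular matrix
        if p != k:
            a[k], a[p] = a[p], a[k]
            sign = -sign
        for i in range(k + 1, n):
            f = a[i][k] / a[k][k]
            for j in range(k, n):
                a[i][j] -= f * a[k][j]
    prod = sign
    for k in range(n):
        prod *= a[k][k]
    return prod

# [[2, 1, 0], [1, 3, 1], [0, 1, 2]] has determinant 2*5 - 1*2 + 0 = 8.
print(det_gauss([[2.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]))
```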

Please write in English (if for the English Wikipedia)
The section Sum contains this passage:

"Conversely, if $$A$$ and $$B$$ are Hermitian, positive-definite, and size $$n\times n$$, then the determinant has concave $$n$$th root;"

This statement makes no sense in either English or mathematics.

I hope that someone knowledgeable about this subject will fix this.
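For context, the quoted sentence appears to be an attempt to state the Minkowski determinant inequality, i.e. that $A \mapsto \det(A)^{1/n}$ is concave on positive-definite Hermitian matrices, which implies $\det(A+B)^{1/n} \ge \det(A)^{1/n} + \det(B)^{1/n}$. A quick 2×2 numerical sanity check (pure Python; the matrices are arbitrary positive-definite examples):

```python
import math

def det2(m):
    """Determinant of a 2x2 matrix [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# Two arbitrary symmetric positive-definite 2x2 matrices.
A = [[2.0, 1.0], [1.0, 2.0]]   # det 3, eigenvalues 1 and 3
B = [[3.0, 0.0], [0.0, 1.0]]   # det 3

S = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

lhs = math.sqrt(det2(S))                       # det(A+B)^(1/2)
rhs = math.sqrt(det2(A)) + math.sqrt(det2(B))  # det(A)^(1/2) + det(B)^(1/2)
print(lhs >= rhs)  # True: consistent with the Minkowski determinant inequality
```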

— Preceding unsigned comment added by 2601:204:f181:9410:d8dc:6178:320e:f4d5 (talk) 01:11, 3 July 2024 (UTC)


 * I have fixed the paragraph. D.Lazard (talk) 09:04, 3 July 2024 (UTC)

Using column vectors to represent points
I believe that there was a time when geometric points were often represented by row vectors, but now they are usually represented by column vectors. I do not have any evidence for the first part of that statement, but for the second part I have found:
 * A source written in 1993 which says "Recent mathematical treatments of linear algebra and related fields invariably treat vectors as columns".
 * A source which says "The general convention seems to be that the coordinates are listed in the format known as a column vector".
 * Olver and Shakiban (Applied Linear Algebra, 2018), who say that the term "vector" without qualification means a column vector.
 * A discussion where a comment says "we typically write the coordinates of our points as columns".
 * The article transformation matrix, which uses column vectors.

If most people learn that points are represented by column vectors, then the 2D example at Determinant would be easier to understand if it just used the columns of $A$. It would talk about the vertices at $(0, 0)$, $(a, c)$, $(a + b, c + d)$, and $(b, d)$.

It would need a new image to replace File:Area parallellogram as determinant.svg. Also, I think the proof about the signed area would need to use $v^{⊥}$ instead of $u^{⊥}$, although I have not completely worked that out. The section could still mention that the determinant of the transpose gives the same result.
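As a sanity check on the column-vector reading: taking the columns $(a, c)$ and $(b, d)$ of $A$ as the parallelogram's edge vectors, the shoelace formula over the vertices $(0,0)$, $(a,c)$, $(a+b,c+d)$, $(b,d)$ gives the same signed area as $\det A$ (pure Python; the numbers are arbitrary illustrative values):

```python
def det2(a, b, c, d):
    """det of [[a, b], [c, d]], whose columns are (a, c) and (b, d)."""
    return a * d - b * c

def shoelace(pts):
    """Signed area of a polygon from its vertices, taken in order."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        s += x1 * y2 - x2 * y1
    return s / 2.0

a, b, c, d = 3.0, 1.0, 1.0, 2.0  # columns (3, 1) and (1, 2)
area = shoelace([(0.0, 0.0), (a, c), (a + b, c + d), (b, d)])
print(area, det2(a, b, c, d))  # both 5.0
```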

Also, please note that the 3D example in that section already uses the columns of $A$. JonH (talk) 01:20, 10 July 2024 (UTC)


 * You must distinguish between vectors of $$\R^n$$, which are tuples and are commonly written in a row between parentheses, such as $$(a_1,\ldots, a_n),$$ and the corresponding row and column vectors, which are matrices and are written between square brackets. In other words, a vector is an $n$-tuple that can be represented either by an $$n\times 1$$ matrix (column vector) or by a $$1\times n$$ matrix (row vector). You are right that the common convention in matrix computation is to represent vectors by their associated column matrices.
 * I did not find anything in the linked section that goes against these common conventions. However, the wording is rather confusing, and could certainly be improved. D.Lazard (talk) 09:02, 10 July 2024 (UTC)
 * On second thought, the main confusion of this paragraph is that it confuses points, the tuples of their coordinates, and the corresponding row and column vectors. D.Lazard (talk) 09:12, 10 July 2024 (UTC)
 * Tuples and row vectors (or column vectors, depending on the source) are so commonly conflated in both use and notation that any pedantic clarification here needs to be written very carefully. Notation here is also far from standardized (tuples can be written with square, round, or angle brackets; matrices can be written with square or round brackets). Also points in Euclidean space (or geometric vectors in a Euclidean vector space) are not tuples per se, but can be represented as tuples relative to an arbitrary Cartesian coordinate system. The object and its representation as numerical data are also commonly conflated. –jacobolus (t) 13:04, 10 July 2024 (UTC)