Talk:Jacobi's formula

Simplifying notation
How about using a specific notation for the cofactors in the proof, such as Cij, as in the cofactor and the Laplace expansion articles? I think it would make the proof more readable. The relation to the adjugate matrix could be stated at the beginning and/or at the end of the proof.

Eroblar 17:40, 1 September 2007 (UTC)

What field?
It is not clear what kind of numbers the entries of the matrix may be for the formula to hold. Does it also hold for complex matrices? It would be great if someone could add this information to the statement of the theorem. —Preceding unsigned comment added by 128.232.241.65 (talk) 14:44, 26 October 2007 (UTC)

Finite matrices only?
The form in which the theorem is given could logically apply to infinite matrices also; but the details of the proof are for matrices of "the same dimension n". 78.32.103.197 (talk) 20:33, 24 May 2009 (UTC)

Symmetric Case
If the matrix (whose derivative is being computed) is constrained to be symmetric, does this affect the result? Because a change in the (i,j)th entry will necessarily be felt in the (j,i)th entry.... therefore the derivative must be taken "simultaneously". Right? — Preceding unsigned comment added by 2.123.253.142 (talk) 02:10, 3 January 2012 (UTC)

Faster proof?
The proof currently on the page is okay, but it proceeds in many small steps and seems overly complicated and unnecessarily long. Isn't it possible to just give a one-line proof using the chain rule? We have

$$\frac{d}{dt}\mbox{Det}\left(A\left(t\right)\right)=\left(\nabla\mbox{Det}\left(A\left(t\right)\right)\right):\left(\frac{d}{dt}A\left(t\right)\right)=\mbox{Tr}\left(\mbox{adj}\left(A\left(t\right)\right)\frac{d}{dt}A\left(t\right)\right) $$

where $$:$$ denotes tensor double-contraction. I'll tack this onto the page as an "alternative proof", and if anybody objects, we can discuss it.

18.251.7.218 (talk) 12:17, 9 August 2013 (UTC)MathDoobler
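A quick numerical sanity check of the chain-rule identity above (a sketch using NumPy; the curve $$A(t)$$ below is an arbitrary example, and the adjugate is computed as det(A)·A⁻¹, which assumes $$A(t)$$ is invertible at the test point):

```python
# Check d/dt det(A(t)) = tr(adj(A) dA/dt) by central finite differences.
import numpy as np

def A(t):
    # An arbitrary smooth curve through 3x3 matrices (hypothetical example).
    return np.array([[np.cos(t), t,    1.0],
                     [t**2,      2.0,  np.sin(t)],
                     [1.0,       t,    3.0]])

t, h = 0.7, 1e-6

# Left-hand side: finite-difference derivative of det(A(t)).
lhs = (np.linalg.det(A(t + h)) - np.linalg.det(A(t - h))) / (2 * h)

# Right-hand side: tr(adj(A) dA/dt), with adj(A) = det(A) * inv(A)
# (valid only because A(t) is invertible here).
dA = (A(t + h) - A(t - h)) / (2 * h)
adj = np.linalg.det(A(t)) * np.linalg.inv(A(t))
rhs = np.trace(adj @ dA)

print(lhs, rhs)  # should agree up to finite-difference error
```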

You need to define a few things and set the context to get this to work (e.g. think of $$A(t)$$ as a curve through the manifold of matrices); I doubt anyone who isn't you can follow it at present.

124.148.29.82 (talk) 02:37, 3 January 2014 (UTC)

My proposal (from Magnus & Neudecker, §8.3):

Theorem. (Jacobi's formula) For any differentiable map A from the real numbers to n × n matrices,


 * $$d \det (A) = \mathrm{tr} (\mathrm{adj}(A) \, dA).$$

Proof. Laplace's formula for the determinant of a matrix A can be stated as


 * $$\det(A)=\sum_j C_{ij}A_{ij} \qquad \text{(expansion along any fixed row } i\text{)}$$

where $$C_{ij}$$ is the cofactor of $$A_{ij}$$. Note that the cofactors $$C_{i1}, \ldots, C_{in}$$ do not involve any entry of row $$i$$.

The determinant of A can be considered to be a function of the elements of A:


 * $$\det(A) = F\,(A_{11}, A_{12}, \ldots, A_{21}, A_{22}, \ldots , A_{nn})$$

and, since the cofactors in row $$i$$ do not involve the entries of row $$i$$, $$\frac{\partial F}{\partial A_{ij}}=C_{ij}$$. By the chain rule, its differential is therefore:


 * $$d\det(A)=\sum_i\sum_j C_{ij}dA_{ij}$$

Since:


 * $$\mathrm{tr} (C^{\rm T} A) = \sum_j (C^{\rm T} A)_{jj} = \sum_j \sum_i C_{ij} A_{ij} = \sum_i \sum_j C_{ij} A_{ij}$$

and the adjugate is the transpose of the cofactor matrix, the differential is:


 * $$d\det(A)=\mathrm{tr}(C^{\rm T}dA)=\mathrm{tr} (\mathrm{adj}(A) \, dA).\ \square$$

--Leitfaden (talk) 23:09, 14 February 2014 (UTC)
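For what it's worth, the statement can be checked symbolically in the 2 × 2 case (a sketch using SymPy; the entries a(t), b(t), c(t), d(t) are arbitrary differentiable functions):

```python
# Symbolic check of d det(A) = tr(adj(A) dA) along a curve of 2x2 matrices.
import sympy as sp

t = sp.symbols('t')
a, b, c, d = (sp.Function(name)(t) for name in 'abcd')
A = sp.Matrix([[a, b], [c, d]])

lhs = sp.diff(A.det(), t)                  # d/dt det(A(t))
rhs = (A.adjugate() * A.diff(t)).trace()   # tr(adj(A) dA/dt)

print(sp.simplify(lhs - rhs))  # 0
```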

Gobbledygook "proof" Jacobi's formula by User: Li Han
I begged said user to discuss and defend here the meaningless "proof" I reverted once, but he restored it, so I'm letting it stand hoping the reader will shrug it off and move on. I think edit warring is a terrible way for said user to start his WP career, assuming he is not a sockpuppet. In any case, the idiosyncratic inconsistent notation introduced is more trouble than otherwise; he ignores the mainstream notation and logic of the main proof explaining it to himself in impenetrable eccentric notation, without a trace of embarrassment. He is evidently expecting A to be invertible. I suspect he has just noticed that $$ A(t+\epsilon)=A + \epsilon ~dA/dt + O(\epsilon^2),$$ and assumed A is invertible, so that $$\frac{d}{d\epsilon} \det A(t+\epsilon)= \frac{d}{d\epsilon} \det \left(A( I+\epsilon A^{-1}\frac{dA}{dt} +O(\epsilon^2))\right)=$$ $$ \det A ~\frac{d}{d\epsilon} \det \left( I+\epsilon A^{-1}\frac{dA}{dt} +O(\epsilon^2)\right) =\det A (\operatorname{tr} A^{-1} dA/dt) + \!O(\epsilon) ,$$ as taught in elementary matrix reviews, and is anxious to stick this in, as a "proof", above the level of a formal wisecrack footnote, but the gobbledygook is so thick one can never be sure. He is still invited to propose something useful on this page, in mainstream notation. Cuzkatzimhut (talk) 14:50, 11 July 2017 (UTC)

Possible error
There is possibly an error in the initial section, where the special case says that the partial derivative of det(A) with respect to Aij is adj(A)ij. This seems to be in contrast with one of the steps of the proof, after the line "where δ is the Kronecker delta, so", where the same partial derivative is equal to adjT(A)ij. By looking at the proof, I believe it is correct and it's the special case statement that needs to be corrected. 87.5.237.63 (talk) 17:58, 5 February 2023 (UTC)


 * In the initial section it is said that the partial derivative of det(A) with respect to Aij is adj(A)ji, which is equal to adjT(A)ij. Saung Tadashi (talk) 18:17, 5 February 2023 (UTC)
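This reading can be checked symbolically (a sketch using SymPy for the 3 × 3 case): the partial derivative of det(A) with respect to A_ij should match adj(A)_ji, i.e. the (i, j) entry of the transposed adjugate.

```python
# Verify that d(det A)/dA_ij equals the (i, j) entry of adj(A)^T,
# entry by entry, for a fully symbolic 3x3 matrix.
import sympy as sp

A = sp.Matrix(3, 3, lambda i, j: sp.Symbol(f'a{i}{j}'))
adjT = A.adjugate().T

ok = all(sp.simplify(sp.diff(A.det(), A[i, j]) - adjT[i, j]) == 0
         for i in range(3) for j in range(3))
print(ok)  # True
```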