Quadratic eigenvalue problem

In mathematics, the quadratic eigenvalue problem (QEP) is to find scalar eigenvalues $$\lambda$$, left eigenvectors $$y$$, and right eigenvectors $$x$$ such that


 * $$ Q(\lambda)x = 0 ~ \text{ and } ~ y^\ast Q(\lambda) = 0,$$

where $$Q(\lambda)=\lambda^2 M + \lambda C + K$$, with matrix coefficients $$M, \, C, K \in \mathbb{C}^{n \times n}$$, and we require that $$M \neq 0$$ (so that the leading coefficient is nonzero). There are $$2n$$ eigenvalues, which may be finite or infinite, and possibly zero. The QEP is a special case of a nonlinear eigenproblem. $$Q(\lambda)$$ is also known as a quadratic polynomial matrix.

Spectral theory
A QEP is said to be regular if $$\det (Q(\lambda)) \not \equiv 0$$. The coefficient of the $$\lambda^{2n}$$ term in $$\det(Q(\lambda))$$ is $$\det(M)$$, so the QEP is regular whenever $$M$$ is nonsingular.
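This leading-coefficient fact is easy to check numerically. A minimal sketch with made-up $$2\times 2$$ coefficients: since $$\det(Q(\lambda))$$ is a polynomial of degree $$2n$$ in $$\lambda$$, interpolating its values at $$2n+1$$ points recovers its coefficients exactly, and the leading one should equal $$\det(M)$$.

```python
import numpy as np

# Made-up 2x2 coefficient matrices (illustration only).
M = np.array([[1.0, 2.0], [0.0, 3.0]])
C = np.array([[0.1, 0.0], [0.0, 0.2]])
K = np.array([[4.0, 1.0], [1.0, 5.0]])
n = 2

# det(Q(lambda)) has degree 2n; interpolate it at 2n+1 points.
pts = np.linspace(-2.0, 2.0, 2 * n + 1)
vals = [np.linalg.det(l**2 * M + l * C + K) for l in pts]
coeffs = np.polyfit(pts, vals, 2 * n)   # highest-degree coefficient first

# Leading coefficient of det(Q(lambda)) equals det(M).
assert np.isclose(coeffs[0], np.linalg.det(M))
```

With exactly $$2n+1$$ sample points, `np.polyfit` solves the interpolation problem exactly (up to rounding), so the comparison is not sensitive to the choice of points.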

Eigenvalues at infinity and eigenvalues at 0 may be exchanged by considering the reversed polynomial, $$ \lambda^2 Q(\lambda^{-1}) = \lambda^2 K + \lambda C + M $$. As there are $$ 2n$$ eigenvectors in an $$n$$-dimensional space, the eigenvectors cannot all be linearly independent, let alone orthogonal. The same eigenvector may also be attached to different eigenvalues.
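The reversal property can be seen already in the scalar case $$n = 1$$, where the eigenvalues of $$Q$$ are just polynomial roots. A small sketch (made-up coefficients): the nonzero finite eigenvalues of the reversed polynomial are the reciprocals of those of $$Q$$.

```python
import numpy as np

# Scalar case n = 1: Q(lambda) = m*lambda^2 + c*lambda + k.
m, c, k = 2.0, 3.0, 1.0

roots_Q   = np.sort(np.roots([m, c, k]))   # eigenvalues of Q
roots_rev = np.sort(np.roots([k, c, m]))   # eigenvalues of lambda^2 * Q(1/lambda)

# Each eigenvalue of the reversal is 1/lambda for an eigenvalue of Q.
recips = np.sort(1.0 / roots_Q)
assert np.allclose(roots_rev, recips)
```

Here $$Q$$ has eigenvalues $$-1$$ and $$-1/2$$, and the reversal has $$-1$$ and $$-2$$, their reciprocals. When $$m = 0$$ the quadratic has an eigenvalue at infinity, which appears as a zero eigenvalue of the reversal.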

Systems of differential equations
Quadratic eigenvalue problems arise naturally in the solution of systems of second order linear differential equations without forcing:


 * $$ M q''(t) +C q'(t) + K q(t) = 0 $$

where $$ q(t) \in \mathbb{R}^n $$ and $$ M, C, K \in \mathbb{R}^{n\times n}$$. If all $$2n$$ quadratic eigenvalues of $$ Q(\lambda) = \lambda^2 M + \lambda C + K $$ are distinct, then the solution can be written in terms of the quadratic eigenvalues and right quadratic eigenvectors as



 * $$ q(t) = \sum_{j=1}^{2n} \alpha_j x_j e^{\lambda_j t} = X e^{\Lambda t} \alpha, $$

where $$\Lambda = \text{Diag}([\lambda_1, \ldots, \lambda_{2n}]) \in \mathbb{C}^{2n \times 2n} $$ holds the quadratic eigenvalues, $$ X = [x_1, \ldots, x_{2n}] \in \mathbb{C}^{n \times 2n} $$ holds the $$ 2n$$ right quadratic eigenvectors, and $$ \alpha = [\alpha_1, \ldots, \alpha_{2n}]^\top \in \mathbb{C}^{2n}$$ is a parameter vector determined from the initial conditions on $$ q$$ and $$ q'$$. Note that the eigenvalues and eigenvectors may be complex even when $$M$$, $$C$$, and $$K$$ are real. Stability theory for linear systems can now be applied, as the behavior of a solution depends explicitly on the (quadratic) eigenvalues.
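This modal construction can be verified numerically. A minimal sketch with made-up $$2\times 2$$ matrices (scipy assumed available): the quadratic eigenpairs are obtained via a companion linearization, $$\alpha$$ is fitted to the initial conditions, and the residual $$Mq'' + Cq' + Kq$$ is checked at a sample time.

```python
import numpy as np
from scipy.linalg import eig

# Made-up 2x2 mass, damping, and stiffness matrices (distinct eigenvalues).
M = np.array([[2.0, 0.0], [0.0, 1.0]])
C = np.array([[0.3, 0.0], [0.0, 0.1]])
K = np.array([[4.0, -1.0], [-1.0, 3.0]])
n = 2

# Quadratic eigenpairs via the first companion linearization of Q.
A = np.block([[np.zeros((n, n)), np.eye(n)], [-K, -C]])
B = np.block([[np.eye(n), np.zeros((n, n))], [np.zeros((n, n)), M]])
lam, Z = eig(A, B)
X = Z[:n, :]                      # right quadratic eigenvectors (top block)

# Fit alpha to the initial conditions: [q(0); q'(0)] = [X; X*Lambda] alpha.
q0, v0 = np.array([1.0, 0.0]), np.zeros(2)
alpha = np.linalg.solve(np.vstack([X, X * lam]), np.concatenate([q0, v0]))

# q(t) = X e^{Lambda t} alpha satisfies the ODE; check the residual at t = 0.7.
t = 0.7
qt   = (X * np.exp(lam * t)) @ alpha
qdt  = (X * (lam * np.exp(lam * t))) @ alpha
qddt = (X * (lam ** 2 * np.exp(lam * t))) @ alpha
assert np.allclose(M @ qddt + C @ qdt + K @ qt, 0, atol=1e-8)
assert np.allclose((X @ alpha).real, q0, atol=1e-8)   # q(0) = q0
```

The residual vanishes (up to rounding) because each mode satisfies $$ (\lambda_j^2 M + \lambda_j C + K)x_j = 0 $$ exactly, so the modal sum solves the differential equation term by term.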

Finite element methods
A QEP can arise as part of the dynamic analysis of structures discretized by the finite element method. In this case the quadratic $$Q(\lambda)$$ has the form $$Q(\lambda)=\lambda^2 M + \lambda C + K$$, where $$M$$ is the mass matrix, $$C$$ is the damping matrix, and $$K$$ is the stiffness matrix. Other applications include vibro-acoustics and fluid dynamics.

Methods of solution
Direct methods for solving the standard or generalized eigenvalue problems $$ Ax = \lambda x$$ and $$ Ax = \lambda B x $$ are based on transforming the problem to Schur or generalized Schur form. However, there is no analogous form for quadratic matrix polynomials. One approach is to transform the quadratic matrix polynomial to a linear matrix pencil $$ A-\lambda B$$ and solve a generalized eigenvalue problem. Once the eigenvalues and eigenvectors of the linear problem have been determined, the eigenvectors and eigenvalues of the quadratic can be recovered.

The most common linearization is the first companion linearization

 * $$ L_1(\lambda) = \begin{bmatrix} 0 & N \\ -K & -C \end{bmatrix} - \lambda\begin{bmatrix} N & 0 \\ 0 & M \end{bmatrix}, $$

with corresponding eigenvector

 * $$ z = \begin{bmatrix} x \\ \lambda x \end{bmatrix}. $$

For convenience, one often takes $$N$$ to be the $$n\times n$$ identity matrix. We solve $$ L_1(\lambda) z = 0 $$ for $$ \lambda $$ and $$z$$, for example by computing the generalized Schur form. We can then take the first $$n$$ components of $$z$$ as the eigenvector $$x$$ of the original quadratic $$Q(\lambda)$$.
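This procedure can be sketched in a few lines (made-up $$2\times 2$$ coefficients; scipy assumed available). `scipy.linalg.eig` solves the generalized problem via the QZ (generalized Schur) algorithm, and the top $$n$$ components of each computed $$z$$ are checked as quadratic eigenvectors.

```python
import numpy as np
from scipy.linalg import eig

# Made-up 2x2 coefficient matrices for illustration.
M = np.array([[1.0, 0.0], [0.0, 3.0]])
C = np.array([[0.5, 0.2], [0.2, 0.4]])
K = np.array([[2.0, -1.0], [-1.0, 2.0]])
n = 2

# First companion linearization with N = I: solve A z = lambda B z.
A = np.block([[np.zeros((n, n)), np.eye(n)], [-K, -C]])
B = np.block([[np.eye(n), np.zeros((n, n))], [np.zeros((n, n)), M]])
lam, Z = eig(A, B)                # 2n eigenvalues of the linearized pencil

# Recover each quadratic eigenpair and verify Q(lambda) x = 0.
for lj, zj in zip(lam, Z.T):
    x = zj[:n]                    # first n components of z
    residual = (lj**2 * M + lj * C + K) @ x
    assert np.linalg.norm(residual) < 1e-8 * np.linalg.norm(x)
```

Since $$M$$ is nonsingular here, all $$2n$$ eigenvalues are finite; with singular $$M$$, some computed eigenvalues would be infinite and the residual test would need to use the reversed polynomial for those.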

Another common linearization is given by

 * $$ L_2(\lambda)= \begin{bmatrix} -K & 0 \\ 0 & N \end{bmatrix} - \lambda\begin{bmatrix} C & M \\ N & 0 \end{bmatrix}. $$
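Both companion forms are linearizations of the same $$Q(\lambda)$$, so their pencils share the same $$2n$$ eigenvalues. A quick numerical check on random (made-up) data, taking $$N = I$$ and keeping $$M$$ well-conditioned so all eigenvalues are finite:

```python
import numpy as np
from scipy.linalg import eigvals

rng = np.random.default_rng(0)
n = 3
M = np.eye(n) + 0.1 * rng.standard_normal((n, n))   # well-conditioned mass matrix
C = rng.standard_normal((n, n))
K = rng.standard_normal((n, n))
I, Z = np.eye(n), np.zeros((n, n))

# First companion form: L1(lambda) = A1 - lambda*B1.
A1 = np.block([[Z, I], [-K, -C]])
B1 = np.block([[I, Z], [Z, M]])
# Second companion form: L2(lambda) = A2 - lambda*B2.
A2 = np.block([[-K, Z], [Z, I]])
B2 = np.block([[C, M], [I, Z]])

e1 = eigvals(A1, B1)
e2 = eigvals(A2, B2)

# Every eigenvalue of L1 matches one of L2 (and both sets have size 2n).
for l in e1:
    assert np.min(np.abs(e2 - l)) < 1e-8
```

Although the spectra agree, the two linearizations differ in how they propagate rounding errors and in the conditioning of their eigenvectors, which is one reason the choice of linearization matters in practice.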

In the case when one of the pencil matrices $$A$$ or $$B$$ is a Hamiltonian matrix and the other is a skew-Hamiltonian matrix, the following linearizations can be used.

 * $$ L_3(\lambda)= \begin{bmatrix} K & 0 \\ C & K \end{bmatrix} - \lambda\begin{bmatrix} 0 & K \\ -M & 0 \end{bmatrix}. $$

 * $$ L_4(\lambda)= \begin{bmatrix} 0 & -K \\ M & 0 \end{bmatrix} - \lambda\begin{bmatrix} M & C \\ 0 & M \end{bmatrix}. $$