Quick facts on eigenvalues and eigenvectors

The following facts hold for the eigenvalues and eigenvectors of matrices and of linear maps in general:

  1. The eigenspace of an eigenvalue, which is the set of all eigenvectors corresponding to that eigenvalue together with the zero vector, is a subspace – that is, it is closed under linear combinations.
  2. For each eigenvalue, there could be one or more linearly independent eigenvectors.
  3. If there are n distinct eigenvalues, the set of eigenvectors, with one eigenvector picked for each eigenvalue, is linearly independent.
  4. For a linear map t:V\to V where V is n-dimensional, the map t can be diagonalized if and only if there are n linearly independent eigenvectors.
  5. For an n\times n matrix A, let there be n linearly independent eigenvectors \mathbf{u_1},\mathbf{u_2},\ldots,\mathbf{u_n} corresponding to eigenvalues \lambda_1,\lambda_2,\ldots,\lambda_n (not necessarily all distinct). Let U be the matrix \begin{bmatrix}\mathbf{u_1}\,\,\mathbf{u_2}\,\,\cdots\,\,\mathbf{u_n}\end{bmatrix} formed of eigenvectors arranged in columns and let \Lambda be the diagonal matrix \begin{bmatrix} \lambda_1 & 0 & 0 & \cdots & 0 \\ 0 & \lambda_2 & 0 & \cdots & 0 \\ 0 & 0 & \lambda_3 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & \lambda_n \end{bmatrix} formed of corresponding eigenvalues. Then, A is similar to \Lambda and A = U\Lambda U^{-1}.
  6. For a linear map t:V\to V where V is n-dimensional, if there are n linearly independent eigenvectors corresponding to eigenvalues \lambda_1,\lambda_2,\ldots,\lambda_n (not necessarily all distinct), then the representation of t with respect to the basis formed of the eigenvectors is the diagonal matrix \begin{bmatrix} \lambda_1 & 0 & 0 & \cdots & 0 \\ 0 & \lambda_2 & 0 & \cdots & 0 \\ 0 & 0 & \lambda_3 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & \lambda_n \end{bmatrix}.
  7. Say, a linear map t:V\to V where V is n-dimensional can be diagonalized – that is, it has n linearly independent eigenvectors corresponding to eigenvalues \lambda_1,\lambda_2,\ldots,\lambda_n (not necessarily all distinct). Then, the linear map t is an isomorphism if and only if none of the eigenvalues is 0.
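Fact 5 can be checked numerically. The sketch below uses NumPy with a hypothetical 2\times 2 matrix whose eigenvalues are distinct (5 and 2), so its eigenvectors are linearly independent and U is invertible:

```python
import numpy as np

# A small diagonalizable matrix (hypothetical example);
# its eigenvalues are 5 and 2, which are distinct.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix U whose
# columns are the corresponding eigenvectors.
eigvals, U = np.linalg.eig(A)
Lambda = np.diag(eigvals)

# Fact 5: A = U Lambda U^{-1} when the eigenvectors are
# linearly independent (U is then nonsingular).
A_reconstructed = U @ Lambda @ np.linalg.inv(U)
print(np.allclose(A, A_reconstructed))  # True
```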



We digress to state two facts about the ‘field’ used for scalars in defining vector spaces:

  • Defining vector spaces over complex scalars allows the coefficients of vectors to be complex numbers.
  • The natural way to analyze spaces with complex coefficients (for example, polynomials with complex coefficients, \mathbb{C}^n) is to allow complex scalars for the vector spaces. For example, if we allow only real scalars when analyzing \mathbb{C}^2, then \mathbb{C}^2 becomes a 4-dimensional space with (for example) the basis \langle (1,0),(i,0),(0,1),(0,i)\rangle. On the other hand, if we allow scalars to be complex, then \mathbb{C}^2 becomes a 2-dimensional space with (for example) the basis \langle (1,0),(0,1)\rangle.

The facts above for eigenvalues and eigenvectors hold independent of whether the chosen ‘field’ of scalars is the set of real numbers or the set of complex numbers. Choosing complex scalars gives the following additional facts for eigenvalues and eigenvectors:

  1. Every matrix has at least one eigenvalue. Counting the multiplicity of eigenvalues, an n\times n matrix has n eigenvalues.
  2. For a matrix whose elements are all real, if an eigenvalue is complex, then its conjugate is also an eigenvalue for the matrix.
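Fact 2 can be illustrated with a real rotation matrix (a hypothetical choice), whose eigenvalues are the complex numbers \cos\theta \pm i\sin\theta and therefore form a conjugate pair:

```python
import numpy as np

# A real rotation matrix; its eigenvalues are complex,
# so by fact 2 they come in a conjugate pair.
theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigvals = np.linalg.eigvals(R)
print(eigvals)  # cos(theta) + i sin(theta) and its conjugate
print(np.allclose(eigvals[0], np.conj(eigvals[1])))  # True
```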



We now ask the reader to recall the definitions and properties related to transpose and conjugation of matrices. We use A^* to denote the conjugate transpose of a matrix A. The following facts on eigenvalues and eigenvectors are in the context of these definitions:

  1. The eigenvalues of a Hermitian matrix are all real.
  2. An n\times n Hermitian matrix has n orthonormal eigenvectors. This is true even if the Hermitian matrix does not have n distinct eigenvalues. Corresponding to an eigenvalue with multiplicity k, there are k orthonormal eigenvectors for the Hermitian matrix.
  3. Since the eigenvectors form an orthonormal basis for a Hermitian matrix, the matrix formed of the eigenvectors arranged in columns is a unitary matrix. In other words, let \mathbf{u_1},\mathbf{u_2},\ldots,\mathbf{u_n} be the eigenvectors of a Hermitian matrix H and let U denote \begin{bmatrix}\,\mathbf{u_1}\,\Bigg |\,\mathbf{u_2}\,\Bigg |\, \dots\, \Bigg |\,\mathbf{u_n}\, \end{bmatrix}. Then, UU^* = U^*U = I.
  4. A Hermitian matrix H can be diagonalized using the above matrix U. Let \lambda_1, \lambda_2,\ldots,\lambda_n be the eigenvalues for the orthonormal eigenvectors \mathbf{u_1}, \mathbf{u_2},\ldots,\mathbf{u_n} of H and let \Lambda denote the diagonal matrix \begin{bmatrix} \lambda_1 & 0 & 0 & \cdots & 0 \\ 0 & \lambda_2 & 0 & \cdots & 0 \\ 0 & 0 & \lambda_3 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & \lambda_n \end{bmatrix}. Then, H = U\Lambda U^*, or equivalently, U^*HU = \Lambda. This fact follows from the previous fact and the eigenvalue-eigenvector matrix equation HU = U\Lambda with U being nonsingular.
  5. Since real symmetric matrices are Hermitian, the above facts for Hermitian matrices hold for real symmetric matrices as well.
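Facts 1, 3, and 4 above can be verified numerically for a hypothetical 2\times 2 Hermitian matrix. The sketch below uses NumPy's eigh routine, which is specialized for Hermitian input:

```python
import numpy as np

# A Hermitian matrix (hypothetical example): H equals its conjugate transpose.
H = np.array([[2.0,        1.0 - 1.0j],
              [1.0 + 1.0j, 3.0       ]])

# np.linalg.eigh returns real eigenvalues and orthonormal
# eigenvectors (the columns of U) for Hermitian input.
eigvals, U = np.linalg.eigh(H)

print(np.allclose(eigvals.imag, 0))                       # True: eigenvalues are real (fact 1)
print(np.allclose(U @ U.conj().T, np.eye(2)))             # True: U is unitary (fact 3)
print(np.allclose(H, U @ np.diag(eigvals) @ U.conj().T))  # True: H = U Lambda U* (fact 4)
```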



We now ask the reader to recall the following definitions regarding the “definiteness” of a matrix:

  • An n\times n matrix A is positive definite if for all vectors \mathbf{x}\in \mathbb{C}^n with \mathbf{x}\ne \mathbf{0}, \langle A\mathbf{x}, \mathbf{x}\rangle > 0. Here, \langle A\mathbf{x}, \mathbf{x}\rangle = \mathbf{x}^*A\mathbf{x} denotes the inner product of the column vectors A\mathbf{x} and \mathbf{x}.
  • An n\times n matrix A is negative definite if for all vectors \mathbf{x}\in \mathbb{C}^n with \mathbf{x}\ne \mathbf{0}, \langle A\mathbf{x}, \mathbf{x}\rangle < 0.
  • An n\times n matrix A is positive semi-definite if for all vectors \mathbf{x}\in \mathbb{C}^n, \langle A\mathbf{x}, \mathbf{x}\rangle \ge 0.
  • An n\times n matrix A is negative semi-definite if for all vectors \mathbf{x}\in \mathbb{C}^n, \langle A\mathbf{x}, \mathbf{x}\rangle \le 0.
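The quantity \mathbf{x}^*A\mathbf{x} in these definitions can be computed directly. The sketch below evaluates it for a hypothetical symmetric matrix (eigenvalues 3 and 1) and a few sample vectors; the result is positive every time, consistent with positive definiteness:

```python
import numpy as np

# Evaluating <Ax, x> = x* A x for a sample symmetric matrix
# and a few sample vectors (all hypothetical choices).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

for x in [np.array([1.0, 0.0]),
          np.array([1.0, -1.0]),
          np.array([1.0 + 1.0j, 2.0])]:
    q = np.vdot(x, A @ x)  # x* A x; np.vdot conjugates its first argument
    print(q.real)          # positive for each nonzero x: A is positive definite
```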

The following facts state that the signs of the eigenvalues of a Hermitian matrix match the “definiteness” of the matrix:

  1. A Hermitian matrix is positive definite if and only if all its eigenvalues are positive.
  2. A Hermitian matrix is negative definite if and only if all its eigenvalues are negative.
  3. A Hermitian matrix is positive semi-definite if and only if all its eigenvalues are non-negative.
  4. A Hermitian matrix is negative semi-definite if and only if all its eigenvalues are non-positive.
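These equivalences make eigenvalue signs a practical definiteness test. The sketch below checks two hypothetical real symmetric (hence Hermitian) matrices with NumPy's eigvalsh:

```python
import numpy as np

# Two hypothetical symmetric matrices:
pos_def  = np.array([[2.0, -1.0], [-1.0, 2.0]])  # eigenvalues 1 and 3
pos_semi = np.array([[1.0,  1.0], [ 1.0, 1.0]])  # eigenvalues 0 and 2

# Fact 1: all eigenvalues positive <=> positive definite.
print(np.all(np.linalg.eigvalsh(pos_def) > 0))        # True

# Fact 3: all eigenvalues non-negative <=> positive semi-definite
# (a small tolerance absorbs floating-point rounding around 0).
print(np.all(np.linalg.eigvalsh(pos_semi) >= -1e-12))  # True
```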

Proofs and justifications for the facts related to Hermitian matrices can be found here.
