The following facts hold for eigenvalues and eigenvectors of matrices and linear maps in general:
- The eigenspace corresponding to a specific eigenvalue (the set of eigenvectors for that eigenvalue, together with the zero vector) is a subspace; that is, it is closed under linear combinations.
- For each eigenvalue, there can be one or more linearly independent eigenvectors.
- If there are $k$ distinct eigenvalues, the set of eigenvectors formed by picking one eigenvector for each eigenvalue is linearly independent.
- For a linear map $T : V \to V$ where $V$ is $n$-dimensional, the map $T$ can be diagonalized if and only if there are $n$ linearly independent eigenvectors.
- For an $n \times n$ matrix $A$, let there be $n$ linearly independent eigenvectors $x_1, \ldots, x_n$ corresponding to eigenvalues $\lambda_1, \ldots, \lambda_n$ (not necessarily all distinct). Let $S$ be the matrix $[\, x_1 \;\; x_2 \;\; \cdots \;\; x_n \,]$ formed of eigenvectors arranged in columns and let $\Lambda$ be the diagonal matrix $\operatorname{diag}(\lambda_1, \ldots, \lambda_n)$ formed of corresponding eigenvalues. Then, $A$ is similar to $\Lambda$ and $A = S \Lambda S^{-1}$.
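As a quick illustration (not from the source), the diagonalization $A = S \Lambda S^{-1}$ can be checked numerically with NumPy; the matrix $A$ below is a hypothetical example chosen to have two distinct eigenvalues:

```python
# Sketch: verifying A = S Λ S⁻¹ for a diagonalizable matrix (example matrix is assumed).
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors as columns of S.
eigvals, S = np.linalg.eig(A)
Lam = np.diag(eigvals)  # diagonal matrix Λ of eigenvalues

# A is similar to Λ: reconstruct A as S Λ S⁻¹.
print(np.allclose(A, S @ Lam @ np.linalg.inv(S)))  # True
```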
- For a linear map $T : V \to V$ where $V$ is $n$-dimensional, if there are $n$ linearly independent eigenvectors corresponding to eigenvalues $\lambda_1, \ldots, \lambda_n$ (not necessarily all distinct), then the representation of $T$ with respect to the basis formed of the eigenvectors is the diagonal matrix $\Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$.
- Say a linear map $T : V \to V$ where $V$ is $n$-dimensional can be diagonalized; that is, it has $n$ linearly independent eigenvectors corresponding to eigenvalues $\lambda_1, \ldots, \lambda_n$ (not necessarily all distinct). Then, the linear map $T$ is an isomorphism if and only if none of the eigenvalues is $0$.
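As an illustration (not from the source), a diagonalizable matrix fails to be invertible exactly when $0$ is one of its eigenvalues, since the determinant is the product of the eigenvalues; the matrix below is a hypothetical example:

```python
# Sketch: a zero eigenvalue means the matrix (and hence the map) is not invertible.
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 0.0]])   # eigenvalues are 2 and 0 (example matrix is assumed)

eigvals = np.linalg.eigvals(A)
print(np.any(np.isclose(eigvals, 0)))   # True: 0 is an eigenvalue
print(np.isclose(np.linalg.det(A), 0))  # True: A is singular, so the map is not an isomorphism
```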
We digress to state two facts about the ‘field’ used for scalars in defining vector spaces:
- Defining vector spaces over complex scalars allows the coefficients in linear combinations of vectors to be complex numbers.
- The natural way to analyze spaces with complex coefficients (for example, the space $\mathcal{P}_n(\mathbb{C})$ of polynomials of degree at most $n$ with complex coefficients) is to allow complex scalars for the vector spaces. For example, if we allow only real scalars when analyzing $\mathcal{P}_n(\mathbb{C})$, then $\mathcal{P}_n(\mathbb{C})$ becomes a $2(n+1)$-dimensional space with (for example) the basis $\{1, i, x, ix, \ldots, x^n, ix^n\}$. On the other hand, if we allow scalars to be complex, then $\mathcal{P}_n(\mathbb{C})$ becomes an $(n+1)$-dimensional space with (for example) the basis $\{1, x, \ldots, x^n\}$.
The facts above for eigenvalues and eigenvectors hold independent of whether the chosen ‘field’ of scalars is the set of real numbers or the set of complex numbers. Choosing complex scalars gives the following additional facts for eigenvalues and eigenvectors:
- Every matrix has at least one eigenvalue. Counting the multiplicity of eigenvalues, an $n \times n$ matrix has $n$ eigenvalues.
- For a matrix whose elements are all real, if an eigenvalue is complex, then its conjugate is also an eigenvalue for the matrix.
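As an illustration (not from the source), a real rotation matrix has no real eigenvalues, and its complex eigenvalues come in a conjugate pair; the matrix below is a hypothetical example:

```python
# Sketch: for a real matrix, complex eigenvalues occur in conjugate pairs.
import numpy as np

# Rotation by 90 degrees in the plane; its eigenvalues are i and -i.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigvals = np.linalg.eigvals(R)
# The set of eigenvalues is closed under complex conjugation.
print(np.allclose(np.sort_complex(eigvals), np.sort_complex(np.conj(eigvals))))  # True
```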
We now ask the reader to recall the definitions and properties related to transpose and conjugation of matrices. We use $A^H$ to denote the conjugate transpose of a matrix $A$. The following facts on eigenvalues and eigenvectors are in the context of these definitions:
- The eigenvalues of a Hermitian matrix are all real.
- An $n \times n$ Hermitian matrix has $n$ orthonormal eigenvectors. This is true even if the Hermitian matrix does not have $n$ distinct eigenvalues. Corresponding to an eigenvalue with multiplicity $k$, there are $k$ orthonormal eigenvectors for the Hermitian matrix.
- Since the eigenvectors form an orthonormal basis for a Hermitian matrix, the matrix formed of the eigenvectors arranged in columns is a unitary matrix. In other words, let $q_1, \ldots, q_n$ be the eigenvectors of a Hermitian matrix $A$ and let $Q$ denote $[\, q_1 \;\; q_2 \;\; \cdots \;\; q_n \,]$. Then, $Q^H Q = I$, or equivalently, $Q^{-1} = Q^H$.
- A Hermitian matrix $A$ can be diagonalized using the above matrix $Q$. Let $\lambda_1, \ldots, \lambda_n$ be the eigenvalues for the orthonormal eigenvectors $q_1, \ldots, q_n$ of $A$ and let $\Lambda$ denote the diagonal matrix $\operatorname{diag}(\lambda_1, \ldots, \lambda_n)$. Then, $A = Q \Lambda Q^{-1}$, or equivalently, $A = Q \Lambda Q^H$. This fact follows from the previous fact and the eigenvalue-eigenvector matrix equation $A Q = Q \Lambda$ with $Q$ being nonsingular.
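As an illustration (not from the source), NumPy's `np.linalg.eigh` routine computes exactly this decomposition for a Hermitian matrix; the matrix below is a hypothetical example:

```python
# Sketch: spectral decomposition A = Q Λ Q^H of a Hermitian matrix via np.linalg.eigh.
import numpy as np

A = np.array([[2.0, 1.0 + 1.0j],
              [1.0 - 1.0j, 3.0]])   # Hermitian: A equals its conjugate transpose

eigvals, Q = np.linalg.eigh(A)       # eigh is specialized for Hermitian matrices

print(np.allclose(eigvals.imag, 0))                        # True: eigenvalues are real
print(np.allclose(Q.conj().T @ Q, np.eye(2)))              # True: columns of Q are orthonormal
print(np.allclose(A, Q @ np.diag(eigvals) @ Q.conj().T))   # True: A = Q Λ Q^H
```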
- Since real symmetric matrices are Hermitian, the above facts for Hermitian matrices hold for real symmetric matrices as well.
We now ask the reader to recall the following definitions regarding the “definiteness” of a matrix:
- An $n \times n$ matrix $A$ is positive definite if for all vectors $x$ with $x \neq 0$, $\langle Ax, x \rangle > 0$. Here, $\langle u, v \rangle$ denotes the inner product of column vectors $u$ and $v$.
- An $n \times n$ matrix $A$ is negative definite if for all vectors $x$ with $x \neq 0$, $\langle Ax, x \rangle < 0$.
- An $n \times n$ matrix $A$ is positive semi-definite if for all vectors $x$, $\langle Ax, x \rangle \geq 0$.
- An $n \times n$ matrix $A$ is negative semi-definite if for all vectors $x$, $\langle Ax, x \rangle \leq 0$.
The following facts state that the signs of the eigenvalues of a Hermitian matrix match the “definiteness” of the matrix:
- A Hermitian matrix is positive definite if and only if all its eigenvalues are positive.
- A Hermitian matrix is negative definite if and only if all its eigenvalues are negative.
- A Hermitian matrix is positive semi-definite if and only if all its eigenvalues are non-negative.
- A Hermitian matrix is negative semi-definite if and only if all its eigenvalues are non-positive.
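As an illustration (not from the source), the positive-definiteness of a real symmetric matrix can be checked through its eigenvalues, or directly through the inner product $\langle Ax, x \rangle$; the matrix and the random test vector below are hypothetical examples:

```python
# Sketch: definiteness of a symmetric (hence Hermitian) matrix via eigenvalue signs.
import numpy as np

A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])    # symmetric; eigenvalues are 1 and 3

eigvals = np.linalg.eigvalsh(A)  # real eigenvalues of a Hermitian matrix
print(np.all(eigvals > 0))       # True: all eigenvalues positive, so A is positive definite

# Equivalently, <Ax, x> > 0 for a (random) nonzero x:
rng = np.random.default_rng(0)
x = rng.standard_normal(2)
print(x @ A @ x > 0)             # True
```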
Proofs and justifications for the facts related to Hermitian matrices can be found here.