A reference of basic facts of matrix algebra


Product of matrices

  • The product AB of two matrices A and B is defined only if the number of columns of A equals the number of rows of B. That is, if A is an m\times n matrix, then for AB to be defined, B must be an n\times l matrix for some l.

  • If A = \begin{bmatrix} a_{11} & a_{12} & a_{13} & \cdots & a_{1n} \\ a_{21} & a_{22} & a_{23} & \cdots & a_{2n} \\ a_{31} & a_{32} & a_{33} & \cdots & a_{3n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & a_{m3} & \cdots & a_{mn} \end{bmatrix} and B = \begin{bmatrix} b_{11} & b_{12} & b_{13} & \cdots & b_{1l} \\ b_{21} & b_{22} & b_{23} & \cdots & b_{2l} \\ b_{31} & b_{32} & b_{33} & \cdots & b_{3l} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ b_{n1} & b_{n2} & b_{n3} & \cdots & b_{nl} \end{bmatrix}, then


    AB = \begin{bmatrix} a_{11} & a_{12} & a_{13} & \cdots & a_{1n} \\ a_{21} & a_{22} & a_{23} & \cdots & a_{2n} \\ a_{31} & a_{32} & a_{33} & \cdots & a_{3n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & a_{m3} & \cdots & a_{mn} \end{bmatrix}\begin{bmatrix} b_{11} & b_{12} & b_{13} & \cdots & b_{1l} \\ b_{21} & b_{22} & b_{23} & \cdots & b_{2l} \\ b_{31} & b_{32} & b_{33} & \cdots & b_{3l} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ b_{n1} & b_{n2} & b_{n3} & \cdots & b_{nl} \end{bmatrix} = \begin{bmatrix} \sum\limits_{k=1}^{n}a_{1k}b_{k1} & \sum\limits_{k=1}^{n}a_{1k}b_{k2} & \sum\limits_{k=1}^{n}a_{1k}b_{k3} & \cdots & \sum\limits_{k=1}^{n}a_{1k}b_{kl} \\ \sum\limits_{k=1}^{n}a_{2k}b_{k1} & \sum\limits_{k=1}^{n}a_{2k}b_{k2} & \sum\limits_{k=1}^{n}a_{2k}b_{k3} & \cdots & \sum\limits_{k=1}^{n}a_{2k}b_{kl} \\ \sum\limits_{k=1}^{n}a_{3k}b_{k1} & \sum\limits_{k=1}^{n}a_{3k}b_{k2} & \sum\limits_{k=1}^{n}a_{3k}b_{k3} & \cdots & \sum\limits_{k=1}^{n}a_{3k}b_{kl} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \sum\limits_{k=1}^{n}a_{mk}b_{k1} & \sum\limits_{k=1}^{n}a_{mk}b_{k2} & \sum\limits_{k=1}^{n}a_{mk}b_{k3} & \cdots & \sum\limits_{k=1}^{n}a_{mk}b_{kl} \end{bmatrix}

  • (AB)_{ij} = \sum\limits_{k=1}^{n}(A)_{ik}(B)_{kj}. Here, we use the notation (A)_{ij} to denote the element in row i and column j of the matrix A.
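
    The entrywise formula can be checked numerically. Below is a quick NumPy sketch; the matrix sizes and random entries are arbitrary, chosen just for illustration.

```python
import numpy as np

# Check the entrywise formula (AB)_{ij} = sum_k (A)_{ik} (B)_{kj}
# on small random integer matrices (sizes chosen arbitrarily).
m, n, l = 3, 4, 2
rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(m, n))
B = rng.integers(-5, 5, size=(n, l))

AB = np.empty((m, l), dtype=A.dtype)
for i in range(m):
    for j in range(l):
        AB[i, j] = sum(A[i, k] * B[k, j] for k in range(n))

assert np.array_equal(AB, A @ B)  # agrees with NumPy's built-in product
```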

  • A matrix product is equivalent to multiplying each of the rows of the left matrix separately by the right matrix. That is,


    if A = \begin{bmatrix} \mathbf{a_1}\\\mathbf{a_2}\\\mathbf{a_3}\\\vdots\\\mathbf{a_m}\end{bmatrix}, then AB\,\,=\,\,\begin{bmatrix} \mathbf{a_1}\\\mathbf{a_2}\\\mathbf{a_3}\\\vdots\\\mathbf{a_m}\end{bmatrix}B \,\,=\,\, \begin{bmatrix} \mathbf{a_1}B\\\mathbf{a_2}B\\\mathbf{a_3}B\\\vdots\\\mathbf{a_m}B\end{bmatrix}.


    Note that \mathbf{a_1},\mathbf{a_2},\mathbf{a_3},\ldots,\mathbf{a_m} are row vectors here.
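
    This row-wise view is easy to verify with NumPy (an illustrative sketch with arbitrary sizes):

```python
import numpy as np

# Row i of AB equals (row i of A) multiplied by B.
rng = np.random.default_rng(1)
A = rng.integers(-5, 5, size=(3, 4))
B = rng.integers(-5, 5, size=(4, 2))
rows = np.vstack([A[i, :] @ B for i in range(A.shape[0])])
assert np.array_equal(rows, A @ B)
```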

  • A matrix product is equivalent to multiplying each of the columns of the right matrix separately by the left matrix. That is,


    If B = \begin{bmatrix} \mathbf{b_1}\,\,\mathbf{b_2}\,\,\mathbf{b_3}\,\,\cdots\,\,\mathbf{b_l}\end{bmatrix}, then AB\,\,=\,\,A\begin{bmatrix} \mathbf{b_1}\,\,\mathbf{b_2}\,\,\mathbf{b_3}\,\,\cdots\,\,\mathbf{b_l}\end{bmatrix}\,\,=\,\,\begin{bmatrix} A\mathbf{b_1}\,\,A\mathbf{b_2}\,\,A\mathbf{b_3}\,\,\cdots\,\,A\mathbf{b_l}\end{bmatrix}


    Note that \mathbf{b_1},\mathbf{b_2},\mathbf{b_3},\ldots,\mathbf{b_l} are column vectors here.
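
    The column-wise view can be checked the same way (another small NumPy sketch):

```python
import numpy as np

# Column j of AB equals A multiplied by (column j of B).
rng = np.random.default_rng(2)
A = rng.integers(-5, 5, size=(3, 4))
B = rng.integers(-5, 5, size=(4, 2))
cols = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])
assert np.array_equal(cols, A @ B)
```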

  • If A = \begin{bmatrix} \mathbf{a_1}\\\mathbf{a_2}\\\mathbf{a_3}\\\vdots\\\mathbf{a_m}\end{bmatrix} and B = \begin{bmatrix} \mathbf{b_1}\,\,\mathbf{b_2}\,\,\mathbf{b_3}\,\,\cdots\,\,\mathbf{b_l}\end{bmatrix}, then AB\,\,=\,\,\begin{bmatrix} \mathbf{a_1}\\\mathbf{a_2}\\\mathbf{a_3}\\\vdots\\\mathbf{a_m}\end{bmatrix}\begin{bmatrix} \mathbf{b_1}\,\,\mathbf{b_2}\,\,\mathbf{b_3}\,\,\cdots\,\,\mathbf{b_l} \end{bmatrix}\,\,=\,\, \begin{bmatrix} \mathbf{a_1}\mathbf{b_1} & \mathbf{a_1}\mathbf{b_2} & \mathbf{a_1}\mathbf{b_3} & \cdots & \mathbf{a_1}\mathbf{b_l} \\ \mathbf{a_2}\mathbf{b_1} & \mathbf{a_2}\mathbf{b_2} & \mathbf{a_2}\mathbf{b_3} & \cdots & \mathbf{a_2}\mathbf{b_l} \\ \mathbf{a_3}\mathbf{b_1} & \mathbf{a_3}\mathbf{b_2} & \mathbf{a_3}\mathbf{b_3} & \cdots & \mathbf{a_3}\mathbf{b_l} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \mathbf{a_m}\mathbf{b_1} & \mathbf{a_m}\mathbf{b_2} & \mathbf{a_m}\mathbf{b_3} & \cdots & \mathbf{a_m}\mathbf{b_l} \end{bmatrix}


    Here, \mathbf{a_1},\mathbf{a_2},\mathbf{a_3},\ldots,\mathbf{a_m} are row vectors/matrices, \mathbf{b_1},\mathbf{b_2},\mathbf{b_3},\ldots,\mathbf{b_l} are column vectors/matrices, and \mathbf{a_i}\mathbf{b_j} is just the matrix product of \mathbf{a_i} and \mathbf{b_j}. (Note that multiplying a row matrix by a column matrix is the same as taking their dot product, treating them simply as vectors.)

  • If A = \begin{bmatrix} \mathbf{a_1}\,\,\mathbf{a_2}\,\,\mathbf{a_3}\,\,\cdots\,\,\mathbf{a_m}\end{bmatrix} and B = \begin{bmatrix} \mathbf{b_1}\\\mathbf{b_2}\\\mathbf{b_3}\\\vdots\\\mathbf{b_m}\end{bmatrix}, then AB\,\,=\,\,\begin{bmatrix} \mathbf{a_1}\,\,\mathbf{a_2}\,\,\mathbf{a_3}\,\,\cdots\,\,\mathbf{a_m}\end{bmatrix}\begin{bmatrix} \mathbf{b_1}\\\mathbf{b_2}\\\mathbf{b_3}\\\vdots\\\mathbf{b_m}\end{bmatrix}\,\,=\,\,\sum\limits_{k=1}^{m}\mathbf{a_k}\mathbf{b_k}


    Here, \mathbf{a_1},\mathbf{a_2},\mathbf{a_3},\ldots,\mathbf{a_m} are n\times 1 column vectors/matrices, \mathbf{b_1},\mathbf{b_2},\mathbf{b_3},\ldots,\mathbf{b_m} are 1\times l row vectors/matrices, and \mathbf{a_k}\mathbf{b_k} is the n\times l (outer) matrix product of \mathbf{a_k} and \mathbf{b_k}.
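
    This sum-of-outer-products view can also be verified numerically (NumPy sketch, arbitrary sizes):

```python
import numpy as np

# AB as a sum of outer products: AB = sum_k outer(column k of A, row k of B).
rng = np.random.default_rng(3)
A = rng.integers(-5, 5, size=(3, 4))
B = rng.integers(-5, 5, size=(4, 2))
S = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
assert np.array_equal(S, A @ B)
```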

  • Left multiplication of a square matrix A by a diagonal matrix causes the rows of A to be multiplied by the corresponding diagonal entries. That is, if A = \begin{bmatrix} \mathbf{a_1}\\\mathbf{a_2}\\\mathbf{a_3}\\\vdots\\\mathbf{a_m}\end{bmatrix} and \Lambda = \begin{bmatrix}\lambda_1 & 0 & 0 & \cdots & 0 \\ 0 & \lambda_2 & 0 & \cdots & 0 \\ 0 & 0 & \lambda_3 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & \lambda_m\end{bmatrix}, then


    \Lambda A\,\,=\,\,\begin{bmatrix}\lambda_1 & 0 & 0 & \cdots & 0 \\ 0 & \lambda_2 & 0 & \cdots & 0 \\ 0 & 0 & \lambda_3 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & \lambda_m\end{bmatrix}\begin{bmatrix} \mathbf{a_1}\\\mathbf{a_2}\\\mathbf{a_3}\\\vdots\\\mathbf{a_m}\end{bmatrix} \,\,=\,\, \begin{bmatrix} \lambda_1\mathbf{a_1}\\\lambda_2\mathbf{a_2}\\\lambda_3\mathbf{a_3}\\\vdots\\\lambda_m\mathbf{a_m}\end{bmatrix}

  • Right multiplication of a square matrix A by a diagonal matrix causes the columns of A to be multiplied by the corresponding diagonal entries. That is, if A = \begin{bmatrix} \mathbf{a_1}\,\,\mathbf{a_2}\,\,\mathbf{a_3}\,\,\cdots\,\,\mathbf{a_m}\end{bmatrix} and \Lambda = \begin{bmatrix}\lambda_1 & 0 & 0 & \cdots & 0 \\ 0 & \lambda_2 & 0 & \cdots & 0 \\ 0 & 0 & \lambda_3 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & \lambda_m\end{bmatrix}, then


    A\Lambda\,\,=\,\,\begin{bmatrix}\mathbf{a_1}\,\,\mathbf{a_2}\,\,\mathbf{a_3}\,\,\cdots\,\,\mathbf{a_m}\end{bmatrix}\begin{bmatrix}\lambda_1 & 0 & 0 & \cdots & 0 \\ 0 & \lambda_2 & 0 & \cdots & 0 \\ 0 & 0 & \lambda_3 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & \lambda_m\end{bmatrix} \,\,=\,\, \begin{bmatrix} \lambda_1\mathbf{a_1}\,\,\lambda_2\mathbf{a_2}\,\,\lambda_3\mathbf{a_3}\,\,\cdots\,\,\lambda_m\mathbf{a_m}\end{bmatrix}
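
    Both diagonal-multiplication facts reduce to elementwise scaling, which NumPy makes explicit via broadcasting (an illustrative sketch):

```python
import numpy as np

# Left-multiplying by a diagonal matrix scales rows;
# right-multiplying scales columns.
rng = np.random.default_rng(4)
A = rng.integers(-5, 5, size=(4, 4))
lam = np.array([2, -1, 3, 5])
D = np.diag(lam)

assert np.array_equal(D @ A, lam[:, None] * A)  # row i scaled by lam[i]
assert np.array_equal(A @ D, A * lam)           # column j scaled by lam[j]
```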


Matrix Transpose and Conjugation

  • Given an m\times n matrix A, its transpose A^T is an n\times m matrix such that \left(A^T\right)_{ij} = A_{ji}.
    • Another way to generate the transpose of a matrix is to make the rows of the original matrix into columns and the columns of the original matrix into rows.
    • For a square matrix, another way to generate the transpose is to “reflect” the elements about the main diagonal.
    • Some properties of matrix transpose are:
      • (cA)^T = cA^T for any scalar c
      • \left(A^T\right)^T = A
      • \left(A+B\right)^T = A^T+B^T. More generally, \left(A_1+A_2+\cdots +A_n\right)^T = A_1^T+A_2^T+\cdots+A_n^T.
      • \left(AB\right)^T = B^TA^T. More generally, \left(A_1A_2\cdots A_n\right)^T = A_n^TA_{n-1}^T\cdots A_1^T.
    • A square matrix A is symmetric if A^T = A.
      • A matrix is symmetric if its elements “above” the diagonal coincide with their reflections “below” the diagonal.
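
    The transpose properties above, and the symmetry test, can be checked numerically (a small NumPy sketch; sizes are arbitrary):

```python
import numpy as np

# Check the reversal rule (AB)^T = B^T A^T, and note that A A^T is
# always symmetric, so it passes the test S^T = S.
rng = np.random.default_rng(5)
A = rng.integers(-5, 5, size=(3, 4))
B = rng.integers(-5, 5, size=(4, 2))
assert np.array_equal((A @ B).T, B.T @ A.T)

S = A @ A.T
assert np.array_equal(S, S.T)  # S is symmetric
```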
  • Given an m\times n complex matrix A, its conjugate \overline{A} is an m\times n matrix such that \left(\overline{A}\right)_{ij} = \overline{A_{ij}}.
    • Some properties of matrix conjugate are:
      • \overline{cA} = \overline{c}\,\overline{A} for any scalar c (so \overline{cA} = c\overline{A} when c is real)
      • \overline{\overline{A}} = A
      • \overline{A+B} = \overline{A}+\overline{B}. More generally, \overline{A_1+A_2+\cdots +A_n} = \overline{A_1}+\overline{A_2}+\cdots+\overline{A_n}.
      • \overline{AB} = \overline{A}\overline{B}. More generally, \overline{A_1A_2\cdots A_n} = \overline{A_1}\overline{A_2}\cdots\overline{A_n}
    • \overline{A} = A if and only if A is real.
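
    The conjugation properties are straightforward to confirm with NumPy (illustrative values):

```python
import numpy as np

# Conjugation distributes over sums and products:
# conj(A + B) = conj(A) + conj(B) and conj(AB) = conj(A) conj(B).
A = np.array([[1 + 2j, 3], [0, 1 - 1j]])
B = np.array([[2j, 1], [1, -1j]])
assert np.array_equal(np.conj(A + B), np.conj(A) + np.conj(B))
assert np.array_equal(np.conj(A @ B), np.conj(A) @ np.conj(B))
```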
  • Given an m\times n complex matrix A, its conjugate transpose or Hermitian transpose A^H (alternatively denoted A^*) is an n\times m matrix such that \left(A^H\right)_{ij} = \overline{A_{ji}}.
    • A^H = \overline{\left( A^T\right)} = \left(\overline{A}\right)^T.
    • A^H = A^T if and only if A is real.
    • Some properties of the conjugate transpose are:
      • (cA)^H = \overline{c}A^H for any scalar c
      • \left(A^H\right)^H = A
      • \left(A+B\right)^H = A^H+B^H. More generally, \left(A_1+A_2+\cdots +A_n\right)^H = A_1^H+A_2^H+\cdots+A_n^H.
      • \left(AB\right)^H = B^HA^H. More generally, \left(A_1A_2\cdots A_n\right)^H = A_n^HA_{n-1}^H\cdots A_1^H.
    • A square matrix A is called Hermitian if A^H = A.
      • If a matrix is Hermitian, its diagonal entries are real.
      • If a matrix is Hermitian, its elements above the diagonal coincide with the conjugates of the elements below the diagonal.
      • A symmetric matrix need not be Hermitian; a symmetric matrix is Hermitian if and only if it is real.
    • A square matrix A is unitary if A^{-1} = A^H; that is, if A^HA = AA^H = I, where I is the identity matrix.
      • The inverse of a unitary matrix can be computed simply by transposing and conjugating the matrix.
      • Note that for a Hermitian matrix A^H = A, while for a unitary matrix A^H = A^{-1}.
      • A matrix is unitary if and only if its columns form an orthonormal set (pairwise orthogonal unit vectors).
      • A matrix is unitary if and only if its rows form an orthonormal set.
    • As an example, the N-point discrete Fourier transform (DFT) matrix F is an N\times N matrix with F_{ij} = \frac{1}{\sqrt{N}}\omega_N^{ij}, where \omega_N = e^{-i\frac{2\pi}{N}} and the indices i,j run from 0 to N-1. This matrix is symmetric and unitary.
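
    The DFT example can be built and checked in a few lines of NumPy (N = 8 is an arbitrary choice):

```python
import numpy as np

# Build the N-point DFT matrix F[i, j] = omega^(i*j) / sqrt(N),
# omega = exp(-2*pi*1j/N), indices 0..N-1, and verify its properties.
N = 8
idx = np.arange(N)
F = np.exp(-2j * np.pi * np.outer(idx, idx) / N) / np.sqrt(N)

assert np.allclose(F, F.T)                     # symmetric
assert np.allclose(F.conj().T @ F, np.eye(N))  # unitary: F^H F = I
```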
