
Previously we saw how to get the echelon form and the pivot/free columns of a matrix, say \(A\).
Now let's find the vectors \(\vec{x}\) that solve \(A\vec{x}=\vec{0}\).

Solve \(A\vec{x}=\vec{0}\)

Say
\(A=\begin{bmatrix} \fbox{1} & 2 & 2 & 2\\ 0 & 0 & \fbox{2} & 4\\ 0 & 0 & 0 & 0\\ \end{bmatrix}\),
already in echelon form with the pivots boxed.
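As a quick sanity check of which columns are pivots, here is a minimal sketch using SymPy (the page itself uses no code; SymPy is just an assumed convenience):

```python
# Confirm the pivot columns of A via its reduced row echelon form (sketch, assuming SymPy).
from sympy import Matrix

A = Matrix([
    [1, 2, 2, 2],
    [0, 0, 2, 4],
    [0, 0, 0, 0],
])

rref_A, pivot_cols = A.rref()   # RREF plus the indices of the pivot columns
print(pivot_cols)               # (0, 2) -> columns 1 and 3 are pivot columns; 2 and 4 are free
```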

Note that each free column is a linear combination of the pivot columns, so we are free to scale the free columns.
So consider
\(A'=\begin{bmatrix} \fbox{1} & \lambda_1\cdot 2 & 2 & \lambda_2\cdot 2\\ 0 & \lambda_1\cdot 0 & \fbox{2} & \lambda_2\cdot 4\\ 0 & \lambda_1\cdot 0 & 0 & \lambda_2\cdot 0\\ \end{bmatrix}\)
with \(\lambda_i\in\mathbb{R}\) for \(i\in\{1,2\}\).

Because the free columns are linear combinations of the pivot columns, the null space we get from \(A\vec{x}=\vec{0}\) is the same null space we get from \(A'\vec{x}=\vec{0}\).
So now we will find the solutions of \(A'\vec{x}=\vec{0}\).

\(A'\vec{x}= \begin{bmatrix} \fbox{1} & \lambda_1\cdot 2 & 2 & \lambda_2\cdot 2\\ 0 & \lambda_1\cdot 0 & \fbox{2} & \lambda_2\cdot 4\\ 0 & \lambda_1\cdot 0 & 0 & \lambda_2\cdot 0\\ \end{bmatrix} \begin{bmatrix} x_1\\x_2\\x_3\\x_4\\ \end{bmatrix} = \begin{bmatrix} 0\\0\\0\\ \end{bmatrix}\)

Let's see two approaches to achieve this.

    \(\text{Approach } 1\)

So we have the system of equations:
    \(\begin{matrix} x_1 & + & 2\lambda_1x_2 & + & 2x_3 & + & 2\lambda_2x_4 & = & 0 \\ 0 & + & 0 & + & 2x_3 & + & 4\lambda_2x_4 & = & 0 \\ 0 & + & 0 & + & 0 & + & 0 & = & 0 \\ \end{matrix} \)

We are free to choose \(\lambda_1\) and \(\lambda_2\), so let \(\lambda_1=\frac{1}{x_2}\) and \(\lambda_2=0\); that is, choose \(\lambda_1\) so that \(\lambda_1 x_2=1\) and drop the contribution of \(x_4\), which in the original system corresponds to the free-variable choice \(x_2=1\), \(x_4=0\).

Then our system of equations becomes:
    \(\begin{matrix} x_1 & + & 2 & + & 2x_3 & + & 0 & = & 0 \\ 0 & + & 0 & + & 2x_3 & + & 0 & = & 0 \\ 0 & + & 0 & + & 0 & + & 0 & = & 0 \\ \end{matrix} \)

    \(\Rightarrow x= \begin{bmatrix} -2\\1\\0\\0\\ \end{bmatrix}\)

Now let's give \(\lambda_1,\lambda_2\) different values to get the special solution associated with the other free variable, \(x_4\):
let \(\lambda_1=0\) and \(\lambda_2=\frac{1}{x_4}\), corresponding to the free-variable choice \(x_2=0\), \(x_4=1\).

So we have the system of equations:
    \(\begin{matrix} x_1 & + & 0 & + & 2x_3 & + & 2 & = & 0 \\ 0 & + & 0 & + & 2x_3 & + & 4 & = & 0 \\ 0 & + & 0 & + & 0 & + & 0 & = & 0 \\ \end{matrix} \)

    \(\Rightarrow x= \begin{bmatrix} 2\\0\\-2\\1\\ \end{bmatrix}\)
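Both special solutions can be checked directly; here is a small sketch assuming NumPy (not used on this page):

```python
# Verify that the two special solutions really satisfy A x = 0.
import numpy as np

A = np.array([
    [1, 2, 2, 2],
    [0, 0, 2, 4],
    [0, 0, 0, 0],
])

s1 = np.array([-2, 1, 0, 0])    # special solution with x2 = 1, x4 = 0
s2 = np.array([ 2, 0, -2, 1])   # special solution with x2 = 0, x4 = 1

print(A @ s1)   # [0 0 0]
print(A @ s2)   # [0 0 0]
```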


    \(\text{Approach } 2\)

\(A'\vec{x}= \begin{bmatrix} \fbox{1} & \lambda_1\cdot 2 & 2 & \lambda_2\cdot 2\\ 0 & \lambda_1\cdot 0 & \fbox{2} & \lambda_2\cdot 4\\ 0 & \lambda_1\cdot 0 & 0 & \lambda_2\cdot 0\\ \end{bmatrix} \begin{bmatrix} x_1\\x_2\\x_3\\x_4\\ \end{bmatrix} = \begin{bmatrix} 0\\0\\0\\ \end{bmatrix} \)

We can write this as
\(A'\vec{x}= \begin{bmatrix} \fbox{1} & 2 & 2 & 2\\ 0 & 0 & \fbox{2} & 4\\ 0 & 0 & 0 & 0\\ \end{bmatrix} \begin{bmatrix} x_1\\ \lambda_1 x_2\\ x_3\\ \lambda_2 x_4\\ \end{bmatrix} = \begin{bmatrix} 0\\0\\0\\ \end{bmatrix} \)
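This rewrite is just the observation that scaling columns of \(A\) is the same as multiplying \(A\) on the right by a diagonal matrix. A small sketch (assuming NumPy; the \(\lambda\) values and the vector \(x\) below are arbitrary picks for illustration):

```python
# Column scaling as right-multiplication by a diagonal matrix: A' x = A (D x).
import numpy as np

A = np.array([
    [1, 2, 2, 2],
    [0, 0, 2, 4],
    [0, 0, 0, 0],
], dtype=float)

lam1, lam2 = 3.0, 5.0                  # arbitrary scalings of the two free columns
D = np.diag([1.0, lam1, 1.0, lam2])    # scales column 2 by lam1 and column 4 by lam2

A_prime = A @ D                        # A' with scaled free columns
x = np.array([1.0, 2.0, 3.0, 4.0])     # any vector x

# A' x equals A applied to the rescaled vector (x1, lam1*x2, x3, lam2*x4)
print(np.allclose(A_prime @ x, A @ (D @ x)))   # True
```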

We are free to choose \(\lambda_1\) and \(\lambda_2\), so again let \(\lambda_1=\frac{1}{x_2}\) and \(\lambda_2=0\).

Solving this, we get
    \(\Rightarrow x= \begin{bmatrix} -2\\1\\0\\0\\ \end{bmatrix}\)

By choosing \(\lambda_1=0\) and \(\lambda_2=\frac{1}{x_4}\), we get
    \(\Rightarrow x= \begin{bmatrix} 2\\0\\-2\\1\\ \end{bmatrix}\)


So we have two special solutions.
They are special because we crafted them in such a way that one solution uses the free column \(c_2\) but not \(c_4\) (free variables \(x_2=1\), \(x_4=0\)), while the other uses \(c_4\) but not \(c_2\) (free variables \(x_2=0\), \(x_4=1\)).
We can get every \(\vec{x}\) that solves \(A'\vec{x}=\vec{0}\) by taking linear combinations of these special solutions.
So
    \(x=\alpha \begin{bmatrix} -2\\1\\0\\0\\ \end{bmatrix} + \beta \begin{bmatrix} 2\\0\\-2\\1\\ \end{bmatrix};\quad\)
    \(\alpha,\beta\in\mathbb{R}\)
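For comparison, a sketch assuming SymPy: the null-space basis it computes should match the two special solutions found above (up to scaling), which is another way to see that every solution is a combination of them.

```python
# SymPy's null-space basis for A, expected to match the special solutions.
from sympy import Matrix

A = Matrix([
    [1, 2, 2, 2],
    [0, 0, 2, 4],
    [0, 0, 0, 0],
])

for v in A.nullspace():
    print(v.T)   # expected (up to scaling): [-2, 1, 0, 0] and [2, 0, -2, 1]
```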

Now we have the Null Space of the matrix \(A\): it is the plane spanned by
\(\begin{bmatrix} -2\\1\\0\\0\\ \end{bmatrix}\)
and
\(\begin{bmatrix} 2\\0\\-2\\1\\ \end{bmatrix}\).


The Null Space is the set of all linear combinations of the special solutions.

    But how many special solutions are there?

Say there are \(k\) dependent (free) column vectors in our matrix \(A\). As we discussed, we craft the special solutions by considering one dependent column at a time.
So # special solutions = # dependent column vectors \(=k\).
Remember that \(\text{Rank}(A)\) = # pivot columns.
Say the shape of our matrix is \(m\times n\), so we have \(n\) column vectors in \(\mathbb{R}^m\), and say the rank of the matrix is \(r\). Then
# special solutions = # dependent column vectors \(=n-r\).
The Null space of \(A\) is then the set of linear combinations of these \(n-r\) special solutions.
For a matrix \(A\in\mathbb{R}^{m\times n}\) with \(\text{Rank}(A)=r\), the Null Space is an \((n-r)\)-dimensional subspace of \(\mathbb{R}^{n}\).
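A quick numeric check of this count, as a sketch assuming SymPy and reusing the example matrix: the number of special solutions equals \(n-r\).

```python
# Count free columns (n - r) and compare with the size of the null-space basis.
from sympy import Matrix

A = Matrix([
    [1, 2, 2, 2],
    [0, 0, 2, 4],
    [0, 0, 0, 0],
])

n = A.cols                     # number of columns: 4
r = A.rank()                   # rank = number of pivot columns: 2
print(n - r)                   # 2 free columns
print(len(A.nullspace()))      # 2 special solutions -> matches n - r
```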