Eigenvalues and Eigenvectors
Say we have an \(n\times n\) matrix \(A\) and a vector \(\vec{v}\in\mathbb{R}^n\). When we perform \(A\vec{v}\), think of it as a transformation of \(\vec{v}\) into a new coordinate system whose basis vectors are defined by the matrix \(A\). After the transformation most vectors change their orientation, but some vectors do not change their direction (though their length may change); these vectors are called eigenvectors, and the factor by which their length is scaled (say \(\lambda\)) is called the eigenvalue. So we can say that after applying the transformation \(A\), eigenvectors do not change their direction.
\[A\vec{x}=\lambda\vec{x};\quad \left\{\begin{matrix} \vec{x}\text{ is an eigenvector}\\ \lambda\text{ is an eigenvalue} \end{matrix}\right.\]
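To make this concrete, here is a minimal numpy sketch (the matrix \(A\) and the vectors below are made-up for illustration, not taken from anything above):

```python
import numpy as np

# A made-up symmetric matrix for illustration
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

v = np.array([1.0, 1.0])   # an eigenvector of A (eigenvalue 3)
w = np.array([1.0, 0.0])   # a generic vector, not an eigenvector

print(A @ v)   # [3. 3.] -> same direction as v, scaled by lambda = 3
print(A @ w)   # [2. 1.] -> direction changed, w is not an eigenvector
```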
Example \(1\)
Let's say \(P\) is a projection matrix. What are the eigenvalues and eigenvectors of \(P\)?
Let's say \(\vec{b}\in\mathbb{R}^n\) and the column space of \(P\) is an \((n-1)\)-dimensional hyperplane.
When we perform \(P\vec{b}\), it projects \(\vec{b}\) onto the column space of \(P\) (if \(\vec{b}\) is not already in that hyperplane, its direction changes, so \(\vec{b}\) isn't an eigenvector). \(P\) projects every vector \(\vec{x}\) onto the column space of \(P\), but vectors that are already in the column space of \(P\) are unaffected.
So vectors that are already in the column space of \(P\) are eigenvectors, and since they are unaffected by \(P\), their eigenvalue is \(1\).
There is one more eigenvector: a vector perpendicular to the column space of \(P\) is also an eigenvector. The projection of that vector is \(\vec{0}\), so its eigenvalue is \(0\).
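As a quick numerical check, here is a small numpy sketch using a made-up projection in \(\mathbb{R}^3\) (projection onto the \(xy\)-plane), which shows the eigenvalues \(1\) and \(0\) described above:

```python
import numpy as np

# Projection onto the xy-plane in R^3 (a made-up concrete example)
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])

print(np.linalg.eigvals(P))   # [1. 1. 0.]

x = np.array([2.0, 3.0, 0.0]) # already in the column space of P
print(P @ x)                  # [2. 3. 0.] -> unchanged, eigenvalue 1

z = np.array([0.0, 0.0, 5.0]) # perpendicular to the column space of P
print(P @ z)                  # [0. 0. 0.] -> projected to 0, eigenvalue 0
```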
Facts
An \(n\times n\) matrix will have \(n\) eigenvalues (counted with multiplicity). Say we have an \(n\times n\) matrix \(A\):
\[ A=\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \\ \end{bmatrix} \]
Then:
Sum of eigenvalues \(=\) sum of the diagonal elements of \(A\) (the trace) \(= a_{11} + a_{22} + \cdots + a_{nn}\).
Product of eigenvalues \(=\) determinant of \(A\).
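Both facts are easy to verify numerically; a minimal sketch with a made-up \(2\times 2\) matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])               # made-up example; eigenvalues 5 and 2

eigenvalues = np.linalg.eigvals(A)

print(np.sum(eigenvalues), np.trace(A))         # 7.0 7.0   (sum = trace)
print(np.prod(eigenvalues), np.linalg.det(A))   # 10.0 10.0 (product = det)
```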
Solving \(A\vec{x}=\lambda\vec{x}\)
We want to find the eigenvalues and eigenvectors of a matrix \(A\). Recall that eigenvectors do not change their direction after applying the transformation \(A\):
\[A\vec{x}=\lambda\vec{x};\quad \vec{x}\text{ is an eigenvector}\]
\(\Rightarrow A\vec{x} - \lambda\vec{x} = \vec{0}\)
\(\Rightarrow (A - \lambda\mathcal{I})\vec{x} = \vec{0}\)
So we have to find \(\lambda\) such that there is some \(\vec{x}\neq\vec{0}\) in the null space of \(A - \lambda\mathcal{I}\). So we want some free variables in \(A - \lambda\mathcal{I}\) (or, put differently, we want \(\text{Rank}(A - \lambda\mathcal{I})\lt n\)). This means we want \(A - \lambda\mathcal{I}\) to be a singular matrix, which means
\[\text{det}(A - \lambda\mathcal{I})=0\]
Solving this equation gives us \(\lambda\), and substituting it back gives us the matrix \(A - \lambda\mathcal{I}\). Now we have to find its null space, which we discussed HERE.
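This procedure can also be sketched numerically: numpy's np.poly returns the coefficients of the characteristic polynomial \(\text{det}(A-\lambda\mathcal{I})\) of a square matrix, np.roots solves it for \(\lambda\), and the null space of \(A-\lambda\mathcal{I}\) can be read off an SVD. The matrix here is made up for illustration:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])            # made-up example matrix

# det(A - lambda*I) as polynomial coefficients, highest degree first
char_poly = np.poly(A)                # [ 1. -5.  6.] -> lambda^2 - 5*lambda + 6
eigenvalues = np.roots(char_poly)     # [3. 2.]

for lam in eigenvalues:
    M = A - lam * np.eye(2)           # singular by construction
    _, _, Vt = np.linalg.svd(M)
    x = Vt[-1]                        # right singular vector for the ~0 singular value
    print(lam, x)                     # eigenvalue and an eigenvector direction
```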
Example
\[A=\begin{bmatrix} 0 & 1 \\ 1 & 0 \\ \end{bmatrix}\]
We want to find the eigenvalues and eigenvectors of the matrix \(A\).
\[A\vec{x}=\lambda\vec{x};\quad \vec{x}\text{ is an eigenvector}\]
\(\text{det}(A - \lambda\mathcal{I})=\begin{vmatrix} -\lambda & 1 \\ 1 & -\lambda \\ \end{vmatrix}\)
\(\Rightarrow \text{det}(A - \lambda\mathcal{I})= \lambda^2-1\)
We know that \(\text{det}(A - \lambda\mathcal{I})=0\).
\(\Rightarrow \lambda^2-1=0\)
\(\Rightarrow \lambda= \pm 1\)
So the eigenvalues are \(1,-1\).
For \(\lambda=1\):
\((A-\lambda\mathcal{I})\vec{x}=\vec{0}\)
\(\Rightarrow (A-\mathcal{I})\vec{x}=\vec{0}\)
\((A-\mathcal{I})\vec{x}=\begin{bmatrix} -1 & 1 \\ 1 & -1 \\ \end{bmatrix} \begin{bmatrix} x_1\\x_2\\ \end{bmatrix}= \begin{bmatrix} 0\\0\\ \end{bmatrix}\)
\(\Rightarrow \vec{x} = \begin{bmatrix} 1\\1\\ \end{bmatrix}\)
For \(\lambda=-1\):
\((A-\lambda\mathcal{I})\vec{x}=\vec{0}\)
\(\Rightarrow (A+\mathcal{I})\vec{x}=\vec{0}\)
\((A+\mathcal{I})\vec{x}=\begin{bmatrix} 1 & 1 \\ 1 & 1 \\ \end{bmatrix} \begin{bmatrix} x_1\\x_2\\ \end{bmatrix}= \begin{bmatrix} 0\\0\\ \end{bmatrix}\)
\(\Rightarrow \vec{x} = \begin{bmatrix} 1\\-1\\ \end{bmatrix}\)
So the eigenvectors of \(A\) are \(\begin{bmatrix} 1\\1\\ \end{bmatrix}\) and \(\begin{bmatrix} 1\\-1\\ \end{bmatrix}\).
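The same answer comes out of numpy's built-in solver (numpy normalizes eigenvectors to unit length, so the columns below are scaled versions of \([1,1]^T\) and \([1,-1]^T\)):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # [ 1. -1.]
print(eigenvectors)   # columns are unit-length multiples of [1, 1] and [1, -1]
```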
What if we add a multiple of the identity, \(\alpha\mathcal{I}\) with \(\alpha\in\mathbb{R}\), to \(A\)? We know that \(A\vec{x}=\lambda\vec{x};\quad \vec{x}\text{ is an eigenvector}\).
\(\Rightarrow A\vec{x} + \alpha\vec{x}=\lambda\vec{x} + \alpha\vec{x}\)
\(\Rightarrow (A + \alpha\mathcal{I})\vec{x}=(\lambda + \alpha)\vec{x}\)
So, if a matrix \(A\) has eigenvalue \(\lambda\), then \(A + \alpha\mathcal{I}\) has eigenvalue \(\lambda+\alpha\), with the same eigenvector \(\vec{x}\).
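A quick numerical check of this shift property, reusing the matrix from the example above with a made-up \(\alpha = 3\):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])              # eigenvalues 1 and -1
alpha = 3.0

print(np.linalg.eigvals(A))                      # [ 1. -1.]
print(np.linalg.eigvals(A + alpha * np.eye(2)))  # [ 4.  2.] -> lambda + alpha
```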
Example (\(90^\circ\) Rotation)
\[A=\begin{bmatrix} 0 & -1 \\ 1 & 0 \\ \end{bmatrix}\]
We want to find the eigenvalues and eigenvectors of the matrix \(A\).
\[A\vec{x}=\lambda\vec{x};\quad \vec{x}\text{ is an eigenvector}\]
\(\text{det}(A - \lambda\mathcal{I})=\begin{vmatrix} -\lambda & -1 \\ 1 & -\lambda \\ \end{vmatrix}\)
\(\Rightarrow \text{det}(A - \lambda\mathcal{I})= \lambda^2+1\)
We know that \(\text{det}(A - \lambda\mathcal{I})=0\).
\(\Rightarrow \lambda^2+1=0\)
\(\Rightarrow \lambda= \pm i\)
So the eigenvalues are \(i,-i\).
Eigenvalues do not necessarily need to be real numbers; here they are complex. Geometrically this makes sense: a \(90^\circ\) rotation changes the direction of every nonzero real vector, so there is no real eigenvector.
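numpy confirms the complex eigenvalues of the rotation matrix:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])    # 90-degree rotation

print(np.linalg.eigvals(A))    # [0.+1.j 0.-1.j] -> i and -i
```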