
Section 3.5 Repeated Eigenvalues


Consider the following system

\begin{equation} \begin{pmatrix} dx/dt \\ dy/dt \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ -1 & 4 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}.\label{equation-linear05-repeated-eigenvalues}\tag{3.5.1} \end{equation}

The characteristic polynomial of the system (3.5.1) is \(\lambda^2 - 6\lambda + 9 = (\lambda - 3)^2\text{.}\) This polynomial has a single root \(\lambda = 3\) with eigenvector \(\mathbf v = (1, 1)\text{.}\) Thus, there is only a single straight-line solution for this system (Figure 3.5.1). The strategy that we used to find the general solution to a system with distinct real eigenvalues will clearly have to be modified if we are to find a general solution to a system with a single eigenvalue.

Figure 3.5.1 A system with one straight-line solution
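
As a quick numerical check, the eigenvalues and eigenvectors of the coefficient matrix can be computed directly. The following is a minimal sketch assuming NumPy is available; for a defective matrix like this one, the two computed eigenvector columns come out numerically parallel.

import numpy as np

# Coefficient matrix of system (3.5.1)
A = np.array([[2.0, 1.0],
              [-1.0, 4.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # both approximately 3
print(eigenvectors)   # both columns nearly parallel to (1, 1)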

Subsection 3.5.1 Repeated Eigenvalues

The remaining case that we must consider is when the characteristic equation of a matrix \(A\) has repeated roots. The simplest such case is

\begin{equation*} \begin{pmatrix} dx/dt \\ dy/dt \end{pmatrix} = \begin{pmatrix} \lambda & 0 \\ 0 & \lambda \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = A \begin{pmatrix} x \\ y \end{pmatrix}. \end{equation*}

The eigenvalues of \(A\) are both \(\lambda\text{.}\) Since \(A = \lambda I\text{,}\) we have \(A{\mathbf v} = \lambda {\mathbf v}\) for every vector \({\mathbf v}\text{,}\) so any nonzero vector in \({\mathbb R}^2\) is an eigenvector for \(\lambda\text{.}\) Thus, solutions to this system are of the form

\begin{equation*} {\mathbf x}(t) = \alpha e^{\lambda t} {\mathbf v}. \end{equation*}

Each solution to our system lies on a straight line through the origin and either tends toward the origin if \(\lambda < 0\) or away from the origin if \(\lambda > 0\text{.}\)

A more interesting case occurs if

\begin{equation*} A = \begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix}. \end{equation*}

Again, both eigenvalues are \(\lambda\text{;}\) however, there is only one linearly independent eigenvector, which we can take to be \((1, 0)\text{.}\) Therefore, we have a single straight-line solution

\begin{equation*} {\mathbf x}_1(t) = \alpha e^{\lambda t}\begin{pmatrix} 1 \\ 0 \end{pmatrix}. \end{equation*}

To find other solutions, we will rewrite the system as

\begin{align*} x' & = \lambda x + y\\ y' & = \lambda y. \end{align*}

This is a partially coupled system. If \(y \neq 0\text{,}\) the solution of the second equation is

\begin{equation*} y(t) = \beta e^{\lambda t}. \end{equation*}

Therefore, the first equation becomes

\begin{equation*} x' = \lambda x + \beta e^{\lambda t}, \end{equation*}

which is a first-order linear differential equation with solution

\begin{equation*} x(t) = \alpha e^{\lambda t} + \beta t e^{\lambda t}. \end{equation*}
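
To verify this, multiply both sides of the equation by the integrating factor \(e^{-\lambda t}\) to obtain

\begin{equation*} \frac{d}{dt} \left( e^{-\lambda t} x \right) = e^{-\lambda t} \left( x' - \lambda x \right) = \beta, \end{equation*}

and integrate: \(e^{-\lambda t} x = \alpha + \beta t\) for some constant \(\alpha\text{.}\)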

Consequently, the general solution to our system is

\begin{equation*} {\mathbf x}(t) = \alpha e^{\lambda t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + \beta e^{\lambda t} \begin{pmatrix} t \\ 1 \end{pmatrix}. \end{equation*}
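
We can confirm symbolically that this pair solves the partially coupled system. Here is a minimal sketch assuming SymPy, with symbol names of our own choosing:

import sympy as sp

t, lam, alpha, beta = sp.symbols('t lambda alpha beta')
x = alpha * sp.exp(lam * t) + beta * t * sp.exp(lam * t)
y = beta * sp.exp(lam * t)

# Both residuals simplify to zero, so x and y satisfy the system
print(sp.simplify(sp.diff(x, t) - (lam * x + y)))   # 0
print(sp.simplify(sp.diff(y, t) - lam * y))         # 0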

Example 3.5.2

Consider the linear system

\begin{align*} x' & = -x + y\\ y' & = -y\\ x(0) & = 1\\ y(0) & = 3. \end{align*}

The matrix corresponding to this system,

\begin{equation*} A = \begin{pmatrix} -1 & 1 \\ 0 & -1 \end{pmatrix} \end{equation*}

has a single eigenvalue, \(\lambda = -1\text{.}\) An eigenvector for \(\lambda\) is \(\mathbf v = (1, 0)\text{.}\) Thus, the general solution to our system is

\begin{align*} x(t) & = c_1 e^{-t} + c_2 t e^{-t}\\ y(t) & = c_2 e^{-t}. \end{align*}

Applying the initial conditions \(x(0) = 1\) and \(y(0) = 3\text{,}\) the solution to our initial value problem is

\begin{align*} x(t) & = e^{-t} + 3te^{-t}\\ y(t) & = 3e^{-t}. \end{align*}

Notice that we have only one straight-line solution (Figure 3.5.3).

Figure 3.5.3 Phase portrait for repeated eigenvalues
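
As a sanity check on this initial value problem, we can integrate the system numerically and compare against the closed-form solution. This is a sketch assuming SciPy; the function and variable names are ours.

import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, u):
    x, y = u
    return [-x + y, -y]

sol = solve_ivp(rhs, (0.0, 2.0), [1.0, 3.0], rtol=1e-10, atol=1e-12)

t_end = sol.t[-1]
exact = [np.exp(-t_end) + 3 * t_end * np.exp(-t_end), 3 * np.exp(-t_end)]
print(sol.y[:, -1])   # numerical solution at t = 2
print(exact)          # closed-form solution; the two should agree closely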

Subsection 3.5.2 Solving Systems with Repeated Eigenvalues

If the characteristic equation has only a single repeated root, there is a single eigenvalue. In this situation, there are two separate cases to examine, depending on whether or not we can find two linearly independent eigenvectors.

Example 3.5.4

Suppose we have the system \(\mathbf x' = A \mathbf x\text{,}\) where

\begin{equation*} A = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}. \end{equation*}

The single eigenvalue is \(\lambda = 2\text{,}\) but there are two linearly independent eigenvectors, \(\mathbf v_1 = (1,0)\) and \(\mathbf v_2 = (0,1)\text{.}\) In this case our solution is

\begin{equation*} \mathbf x(t) = c_1 e^{2t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + c_2 e^{2t} \begin{pmatrix} 0 \\ 1 \end{pmatrix}. \end{equation*}

This is not too surprising since the system

\begin{align*} x' & = 2x\\ y' & = 2y \end{align*}

is uncoupled and each equation can be solved separately.
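
Numerically, this case presents no difficulty. A short sketch assuming NumPy shows that eig returns two linearly independent eigenvectors for the diagonal matrix:

import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # [2. 2.]
print(eigenvectors)   # identity matrix: the standard basis vectors (1,0) and (0,1)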

Example 3.5.5

Now let us consider the example \(\mathbf x' = A \mathbf x\text{,}\) where

\begin{equation*} A = \begin{pmatrix} 5 & 1 \\ -4 & 1 \end{pmatrix}. \end{equation*}

Since the characteristic polynomial of \(A\) is \(\lambda^2 - 6 \lambda + 9 = (\lambda - 3)^2\text{,}\) we have only a single eigenvalue \(\lambda = 3\) with eigenvector \(\mathbf v_1 = (1, -2)\text{.}\) This gives us one solution to our system, \(\mathbf x_1(t) = e^{3t}\mathbf v_1\text{;}\) however, we still need a second solution.

Since all other eigenvectors of \(A\) are a multiple of \({\mathbf v}_1\text{,}\) we cannot find a second linearly independent eigenvector, and we need to obtain the second solution in a different manner. We must find a vector \({\mathbf v}_2\) such that \((A - \lambda I){\mathbf v}_2 = {\mathbf v}_1\text{.}\) To do this we can start with any nonzero vector \({\mathbf w}\) that is not a multiple of \({\mathbf v}_1\text{,}\) say \({\mathbf w} = (1, 0)\text{.}\) We then compute

\begin{equation*} (A - \lambda I) {\mathbf w} = (A - 3I) {\mathbf w} = \begin{pmatrix} 2 & 1 \\ -4 & -2 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 2 \\ -4 \end{pmatrix} = 2 {\mathbf v}_1. \end{equation*}

Thus, we can take \({\mathbf v}_2 = (1/2)\mathbf w = (1/2, 0)\text{,}\) and our second solution is

\begin{equation*} {\mathbf x}_2 = e^{\lambda t} ({\mathbf v}_2 + t {\mathbf v}_1) = e^{3t} \begin{pmatrix} 1/2 + t \\ -2t \end{pmatrix}. \end{equation*}

The general solution is therefore

\begin{equation*} {\mathbf x} = c_1 {\mathbf x}_1 + c_2 {\mathbf x}_2 = c_1 e^{3t} \begin{pmatrix} 1 \\ -2 \end{pmatrix} + c_2 e^{3t} \begin{pmatrix} 1/2 + t \\ -2t \end{pmatrix}. \end{equation*}
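
The computation of \({\mathbf v}_2\) can also be done numerically. The following is a minimal sketch assuming NumPy; since \(A - \lambda I\) is singular, we use a least-squares solve to find one particular \({\mathbf v}_2\) with \((A - \lambda I){\mathbf v}_2 = {\mathbf v}_1\text{.}\) Any vector of the form \({\mathbf v}_2 + c\, {\mathbf v}_1\) works equally well.

import numpy as np

A = np.array([[5.0, 1.0],
              [-4.0, 1.0]])
lam = 3.0
v1 = np.array([1.0, -2.0])

# A - lam*I is singular, so solve (A - lam*I) v2 = v1 by least squares
v2, *_ = np.linalg.lstsq(A - lam * np.eye(2), v1, rcond=None)
print(v2)                                  # one valid generalized eigenvector
print((A - lam * np.eye(2)) @ v2 - v1)     # residual is (numerically) zero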

If the eigenvalue is positive, we will have a nodal source. If it is negative, we will have a nodal sink. Notice that we have only given a recipe for finding a solution to \(\mathbf x' = A \mathbf x\text{,}\) where \(A\) has a repeated eigenvalue and any two eigenvectors are linearly dependent. We will justify our procedure in the next section (Section 3.6).

Subsection 3.5.3 Important Lessons

  • If

    \begin{equation*} A = \begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix}, \end{equation*}

    then \(A\) has one repeated real eigenvalue. The general solution to the system \({\mathbf x}' = A {\mathbf x}\) is

    \begin{equation*} {\mathbf x}(t) = \alpha e^{\lambda t} \begin{pmatrix} 1 \\ 0 \end{pmatrix} + \beta e^{\lambda t} \begin{pmatrix} t \\ 1 \end{pmatrix}. \end{equation*}

    If \(\lambda < 0\text{,}\) then the solutions tend towards the origin as \(t \to \infty\text{.}\) For \(\lambda > 0\text{,}\) the solutions tend away from the origin.
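
The general solution above can also be read off from the matrix exponential of \(A\text{.}\) Here is a symbolic sketch assuming SymPy:

import sympy as sp

t, lam = sp.symbols('t lambda')
A = sp.Matrix([[lam, 1],
               [0, lam]])

# exp(A t) = e^(lambda t) * [[1, t], [0, 1]], which contains the
# e^(lambda t) and t e^(lambda t) terms of the general solution
print(sp.simplify((A * t).exp()))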


Subsection 3.5.4 Project

Find the Eigenvalues of the Given General Solution

Source: http://faculty.sfasu.edu/judsontw/ode/html-20180819/linear05.html