# Higher-Order Linear ODEs

Mathematics Methods 2

## Wronskian

We define the Wronskian of two functions $$f$$ and $$g$$ as the determinant

$\begin{split} W[f,g](x) = \left| \begin{array}{cc} f(x) & g(x) \\ f'(x) & g'(x) \end{array} \right| \end{split}$

Now consider the Wronskian of two solutions of a homogeneous ODE $$y'' + p(x)y' + q(x)y = 0$$:

$W[y_1, y_2] = y_1 y'_2 - y_2 y'_1$

Differentiating it

\begin{split} \begin{aligned} \frac{dW}{dx} &= y'_1 y'_2 + y_1 y''_2 - y'_2 y'_1 - y_2 y''_1 \\ &= y_1y''_2 - y_2 y''_1 \\ &= y_1(-p y'_2 - q y_2) - y_2 (-py'_1 - qy_1) \\ &= -pW \end{aligned} \end{split}

In other words, the Wronskian satisfies the first-order linear ODE

$\frac{dW}{dx} + pW = 0$

whose solution is Abel's identity

$W(x) = W(x_0) e^{- \int_{x_0}^x p(s) ds}$
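Abel's identity is easy to verify on a concrete equation. A minimal SymPy sketch, assuming the example $$y'' - 3y' + 2y = 0$$ (so $$p(x) = -3$$) with solutions $$e^x$$ and $$e^{2x}$$:

```python
import sympy as sp

x = sp.symbols('x')

# y'' - 3y' + 2y = 0 has solutions e^x and e^{2x}; here p(x) = -3
y1, y2 = sp.exp(x), sp.exp(2*x)

# Wronskian computed directly from the definition
W = sp.simplify(y1*sp.diff(y2, x) - y2*sp.diff(y1, x))

# Wronskian predicted by Abel's identity with x0 = 0
W_abel = W.subs(x, 0) * sp.exp(-sp.integrate(-3, (x, 0, x)))

print(W)  # exp(3*x)
assert sp.simplify(W - W_abel) == 0
```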

In general, the Wronskian of a set of functions $$\{f_1(x), f_2(x), \dots, f_n(x) \}$$ is defined by

$\begin{split} W[f_1, \dots, f_n] (x) := \left| \begin{array}{cccc} f_1 & f_2 & \cdots & f_n \\ f'_1 & f'_2 & \cdots & f'_n \\ \vdots & \vdots & & \vdots \\ f^{(n-1)}_1 & f^{(n-1)}_2 & \cdots & f^{(n-1)}_n \end{array} \right| \end{split}$
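SymPy exposes this determinant directly as `wronskian`. A short sketch, using the linearly independent set $$\{1, x, x^2\}$$ and a dependent pair for contrast:

```python
import sympy as sp
from sympy import wronskian

x = sp.symbols('x')

# {1, x, x^2} are linearly independent, so their Wronskian is nonzero
print(wronskian([1, x, x**2], x))   # 2

# a linearly dependent set gives an identically zero Wronskian
print(wronskian([x, 2*x], x))       # 0
```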

## Second-order linear ODE

(119) $y'' + p(x)y' + q(x)y = r(x)$

### Reduction of Order - homogeneous

Often we can find one solution $$y_1$$ of an ODE by inspection, which we can then use to find a second, linearly independent solution $$y_2$$. Note that

$\frac{d}{dx} \left( \frac{y_2}{y_1} \right) = \frac{y_1 y'_2 - y_2 y'_1}{y_1^2} = \frac{W[y_1, y_2]}{y_1^2}$

Integrating gives

$y_2(x) = y_1 (x) \left( C + \int_{x_0}^{x} \frac{W(s)}{y_1^2 (s)} ds \right)$

We can set $$C = 0$$ since we write the general solution as a linear combination of $$y_1$$ and $$y_2$$ anyway.
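A short SymPy sketch of reduction of order, assuming the example $$y'' - 2y' + y = 0$$ with the inspected solution $$y_1 = e^x$$ (and taking $$W(x_0) = 1$$ in Abel's identity):

```python
import sympy as sp

x, s = sp.symbols('x s')

# y'' - 2y' + y = 0: y1 = e^x is found by inspection; here p(x) = -2
y1 = sp.exp(x)

# Abel's identity with W(x0) = 1 gives W(x) = e^{2x}
W = sp.exp(-sp.integrate(-2, x))

# reduction-of-order integral, taking C = 0 and x0 = 0
y2 = y1 * sp.integrate((W / y1**2).subs(x, s), (s, 0, x))
print(sp.simplify(y2))  # x*exp(x)

# check that y2 really solves the ODE
assert sp.simplify(sp.diff(y2, x, 2) - 2*sp.diff(y2, x) + y2) == 0
```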

### Variation of parameters - inhomogeneous

Let the general solution of the corresponding homogeneous equation be known

(120) $y_h(x) = A y_1(x) + B y_2(x)$

As we have done before, we use variation of parameters to set the ansatz for the particular solution:

(121) $y_p(x) = C_1(x)y_1(x) + C_2(x)y_2(x)$

where we have “promoted” the constants $$A, B$$ to functions $$C_1(x), C_2(x)$$. Differentiating it once:

$y' = C'_1y_1 + C_1y'_1 + C'_2y_2 + C_2y'_2$

We also require that

(122) $C'_1 y_1 + C'_2 y_2 = 0$

so the first derivative becomes

(123) $y' = C_1 y'_1 + C_2 y'_2.$

Differentiating again

(124) $y'' = C'_1y'_1 + C_1y''_1 + C'_2y'_2 + C_2y''_2$

Now we substitute eqs. (123) and (124) into eq. (119):

$\begin{split} C'_1y'_1 + C_1y''_1 + C'_2y'_2 + C_2y''_2 + p(C_1 y'_1 + C_2 y'_2) + q(C_1y_1 + C_2y_2) = r \\ (C'_1y'_1 + C'_2y'_2) + \underbrace{C_1(y''_1 + py'_1 + qy_1) + C_2(y''_2 + py'_2 + qy_2)}_{= \ 0 \ \text{(homogeneous ODE)}} = r \end{split}$

We get

(125) $C'_1 y'_1 + C'_2 y'_2 = r$

The solution of the system of equations (122) and (125) is given by

(126) $C'_1 = - \frac{y_2}{W[y_1, y_2]} r, \quad C'_2 = \frac{y_1}{W[y_1, y_2]} r$

and after integrating we get

(127) $C_1(x) = - \int^x \frac{y_2(s)}{W(s)} r(s) \ ds , \quad C_2(x) = \int^x \frac{y_1(s)}{W(s)} r(s) \ ds$

where the choice of the lower bounds of integration is irrelevant since it only changes the coefficients $$A$$ and $$B$$ (see below). Therefore, the solution of the inhomogeneous second-order ODE (119) is

\begin{split} \begin{aligned} y(x) &= y_h(x) + y_p(x) \\ &= A y_1(x) + B y_2(x) + C_1(x)y_1(x) + C_2(x)y_2(x) \\ &= [A + C_1(x)]y_1(x) + [B + C_2(x)]y_2(x) \end{aligned} \end{split}
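The whole procedure can be sketched in SymPy for the classic example $$y'' + y = \sec x$$ (an assumed illustration, not from the text above), whose homogeneous solutions are $$\cos x$$ and $$\sin x$$:

```python
import sympy as sp

x, s = sp.symbols('x s')

# y'' + y = sec(x): homogeneous solutions y1 = cos(x), y2 = sin(x)
y1, y2, r = sp.cos(x), sp.sin(x), sp.sec(x)
W = sp.simplify(y1*sp.diff(y2, x) - y2*sp.diff(y1, x))   # = 1

# eq. (127) with lower bound of integration 0
C1 = -sp.integrate((y2*r/W).subs(x, s), (s, 0, x))
C2 =  sp.integrate((y1*r/W).subs(x, s), (s, 0, x))
yp = C1*y1 + C2*y2
print(sp.simplify(yp))

# numerical check of the residual y_p'' + y_p - sec(x)
residual = sp.diff(yp, x, 2) + yp - r
assert abs(float(residual.subs(x, sp.Rational(1, 2)).evalf())) < 1e-9
```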

## General $$n$$-th order linear ODE

We can directly extend the method of variation of parameters used for second-order linear ODEs to $$n$$-th order ODEs, which are of the form

(128) $\mathcal{L}_x[y] = y^{(n)} + p_{n-1}(x) y^{(n-1)} + \cdots + p_1(x)y' + p_0(x)y = r(x)$

Let $$\{y_1, y_2, \dots, y_n\}$$ be linearly independent solutions of the homogeneous ODE, i.e. $$\mathcal{L}_x[y_i] = 0$$. We seek a solution of the inhomogeneous ODE in the form of the ansatz

(129) $y_p(x) = \sum_{i=1}^n C_i(x)y_i(x)$

where $$C_i(x)$$ are functions that we need to find. As before, we differentiate and define conditions:

\begin{split} \begin{aligned} y' = \sum_{i=1}^n C'_iy_i + C_iy'_i, \quad & \text{assume:} \sum_{i=1}^n C'_iy_i = 0 \\ y'' = \sum_{i=1}^n C'_iy'_i + C_iy''_i, \quad & \text{assume:} \sum_{i=1}^n C'_iy'_i = 0 \\ \vdots \\ y^{(n-1)} = \sum_{i=1}^n C'_iy_i^{(n-2)} + C_iy_i^{(n-1)}, \quad & \text{assume:} \sum_{i=1}^n C'_iy_i^{(n-2)} = 0 \end{aligned} \end{split}
$y^{(n)} = \sum_{i=1}^n C'_i y_i^{(n-1)} + C_i y_i^{(n)}$

The $$n-1$$ assumed conditions, together with the ODE itself, define a system of $$n$$ linear equations for the unknowns $$C'_1, \dots, C'_n$$:

(130) $\begin{split} \begin{bmatrix} y_1 & y_2 & \cdots & y_n \\ y'_1 & y'_2 & \cdots & y'_n \\ \vdots & \vdots & & \vdots \\ y_1^{(n-1)} & y_2^{(n-1)} & \cdots & y_n^{(n-1)} \end{bmatrix} \begin{bmatrix} C'_1 \\ C'_2 \\ \vdots \\ C'_n \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ r \end{bmatrix} \end{split}$

A sufficient condition for the existence of a solution of the system of equations (130) is that the determinant of the coefficient matrix is nonzero for every value of $$x$$. This determinant is precisely the Wronskian $$W[y_1, \dots, y_n]$$, and we know it is nonzero for all $$x$$ because the set $$\{y_1, \dots, y_n \}$$ is linearly independent. We determine $$C'_1, \dots, C'_n$$ using Cramer’s rule:

$C'_i(x) = \frac{W_i (x)}{W(x)}r(x)$

where $$W_i$$ is the determinant obtained from $$W$$ by replacing the $$i$$-th column with the RHS column $$(0, 0, \dots, r)^T$$. We integrate this and substitute the result into eq. (129) to get the particular solution of (128)

$y_p = \sum_{i=1}^n y_i(x) \int_{x_0}^x \frac{W_i (s)}{W(s)}r(s) \ ds$

where $$x_0$$ is arbitrary.
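The recipe can be sketched in SymPy for an assumed third-order example, $$y''' - y' = 2$$, whose homogeneous solutions are $$\{1, e^x, e^{-x}\}$$:

```python
import sympy as sp

x, s = sp.symbols('x s')

# y''' - y' = 2: homogeneous solutions {1, e^x, e^{-x}}
ys = [sp.Integer(1), sp.exp(x), sp.exp(-x)]
r = sp.Integer(2)
n = len(ys)

# Wronskian matrix of eq. (130) and the RHS column (0, ..., 0, r)^T
M = sp.Matrix(n, n, lambda i, j: sp.diff(ys[j], x, i))
Cprime = M.LUsolve(sp.Matrix([0, 0, r]))

# integrate each C'_i from x0 = 0 and assemble the particular solution
yp = sum(ys[i]*sp.integrate(Cprime[i].subs(x, s), (s, 0, x)) for i in range(n))
print(sp.simplify(yp))

# check: y''' - y' = 2
assert sp.simplify(sp.diff(yp, x, 3) - sp.diff(yp, x) - r) == 0
```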

## Linear ODEs with Constant Coefficients

In this section we focus on linear ODEs in which the coefficient functions are constants. The homogeneous $$n$$-th order equation then reads

(131) $y^{(n)} + a_{n-1}y^{(n-1)} + \cdots + a_1y' + a_0y = 0$

### Homogeneous case - method of characteristic polynomial

We use the method of the characteristic polynomial, which applies to such an ODE of any order. A homogeneous linear ODE of first order with constant coefficient $$a_0$$

$y' + a_0 y = 0$

is separable and its solution has the form $$y(x) = Ce^{\lambda x}$$, where $$\lambda$$ is an unknown constant. We find it by plugging $$y$$ back into the ODE:

$\begin{split} \lambda Ce^{\lambda x} + a_0 Ce^{\lambda x} = 0 \\ \lambda + a_0 = 0\end{split}$

This is the characteristic equation of the ODE. We find $$\lambda = -a_0$$.

Consider now a second-order ODE with constant coefficients $$a_1, a_0$$

$y'' + a_1 y' + a_0 y = 0$

Inspired by the solution of the first-order equation, we assume the solution of the form $$y(x) = Ce^{\lambda x}$$. We plug it back into the ODE and get the characteristic equation

$\lambda^2 + a_1 \lambda + a_0 = 0$

The roots of this quadratic equation are

$\lambda_{1,2} = \frac{-a_1 \pm \sqrt{D}}{2}, \qquad D = a_1^2 - 4a_0$

so we have three cases depending on whether $$D > 0$$, $$D = 0$$ or $$D < 0$$, keeping in mind that we need linearly independent solutions to form the basis:

1. $$\lambda_{1,2} \in \mathbb{R}, \lambda_1 \neq \lambda_2: \quad y(x) = C_1 e^{\lambda_1 x} + C_2e^{\lambda_2 x}$$

2. $$\lambda_{1,2} \in \mathbb{R}, \lambda_1 = \lambda_2: \quad y(x) = (C_1 + C_2 x)e^{\lambda_1 x}$$

3. $$\lambda_{1,2} \in \mathbb{C}, \lambda_2 = \lambda_1^*: \quad y(x) = C_1 e^{\lambda_1 x} + C_2e^{\lambda_1^* x}$$, equivalently $$y(x) = e^{\gamma x}(A \cos \omega x + B \sin \omega x)$$ for $$\lambda_{1,2} = \gamma \pm i\omega$$
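The three cases can be illustrated numerically. A small NumPy sketch, assuming one example equation for each case, that computes the roots and the discriminant:

```python
import numpy as np

# one example of each case for lambda^2 + a1*lambda + a0 = 0
for a1, a0 in [(-3.0, 2.0),    # D > 0: two distinct real roots
               (-2.0, 1.0),    # D = 0: repeated real root
               ( 0.0, 4.0)]:   # D < 0: complex conjugate pair
    lam = np.roots([1.0, a1, a0])   # roots of the characteristic polynomial
    D = a1**2 - 4*a0
    print(f"a1={a1}, a0={a0}, D={D}, roots={lam}")
```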

Hint: Diagonalisation

The method of characteristic polynomial might remind us of the process of finding eigenvalues. Indeed, we can think of finding solutions in the form of an exponential function as a type of diagonalisation.

$\frac{d}{dx} e^{\lambda x} = \lambda e^{\lambda x}$

The exponential function can be thought of as an eigenvector of the derivative operator and $$\lambda$$ as the eigenvalue. If $$\lambda_1, \lambda_2$$ are roots of the characteristic equation, we can write

$\left( \frac{d}{dx} - \lambda_1 \right) \left( \frac{d}{dx} - \lambda_2 \right) y(x) = 0$

Let us now generalise this to an $$n$$-th order ODE. The characteristic equation of eq. (131) is again obtained by substituting $$y = e^{\lambda x}$$:

$\lambda^n + a_{n-1}\lambda^{n-1} + \cdots + a_1 \lambda + a_0 = 0$

1. For any simple real root $$\lambda$$, one solution is $$y = e^{\lambda x}$$

2. For any complex root $$\lambda = \gamma + i \omega$$, its conjugate $$\lambda^* = \gamma - i \omega$$ is also a root:

$y_1 = e^{\gamma x} \cos \omega x, \quad y_2 = e^{\gamma x} \sin \omega x$

3. Multiple real roots: if a real root is repeated $$m$$ times, the $$m$$ corresponding linearly independent solutions are

$e^{\lambda x}, \ xe^{\lambda x}, \ \dots \ , x^{m-1}e^{\lambda x}$

4. Multiple complex roots: if a complex root $$\lambda = \gamma + i \omega$$ is repeated, the corresponding linearly independent solutions are

$e^{\gamma x} \cos \omega x, \quad e^{\gamma x} \sin \omega x, \quad x e^{\gamma x} \cos \omega x, \quad xe^{\gamma x} \sin \omega x, \ \dots$
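The four rules above can be combined into a small SymPy sketch that builds a real solution basis from the roots of a characteristic polynomial. The example polynomial $$(\lambda - 2)^2(\lambda^2 + 1)$$ is an assumption chosen to exhibit a double real root and a complex pair:

```python
import sympy as sp

x, lam = sp.symbols('x lam')

# hypothetical example: (lam - 2)^2 * (lam^2 + 1) = 0
# -> double real root 2 and the complex pair +-i
p = sp.expand((lam - 2)**2 * (lam**2 + 1))

basis = []
for root, m in sp.roots(p, lam).items():
    re, im = sp.re(root), sp.im(root)
    for k in range(m):               # powers x^k for multiplicity m
        if im == 0:                  # real root
            basis.append(x**k * sp.exp(root*x))
        elif im > 0:                 # one representative per conjugate pair
            basis.append(x**k * sp.exp(re*x) * sp.cos(im*x))
            basis.append(x**k * sp.exp(re*x) * sp.sin(im*x))

print(basis)   # e^{2x}, x e^{2x}, cos x, sin x (up to ordering)
```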