# The Relaxation Method

This chapter covers solving boundary value problems (BVPs) using linear algebra. You'll find it easier if you have at least a working knowledge of matrix multiplication and inversion. In practice, though, it's advisable to let a software library do the numerical work for you; if you prefer to program in Python, for example, NumPy or SciPy will do.

If you have time on your hands, Professor Gilbert Strang's linear algebra lecture series is available on MIT OpenCourseWare.

## Boundary Value Problem

Take a second-order linear ODE of the form:

$$\frac{d^2 f(x)}{dx^2} = p(x)f(x) + q(x)\frac{df(x)}{dx}+ r(x) \label{bvp_ode}$$

Given the boundary values $f(a)$ and $f(b)$, solve for $f(x)$ on the interval $a < x < b$.

First, discretise the interval by choosing $\Delta x = \frac{b - a}{N}$ for some integer $N > 0$, and let $x_i = a + i\,\Delta x$ for $0 \leq i \leq N$.
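As a quick sketch of the discretisation in NumPy (the interval $[0, 1]$ and $N = 10$ are arbitrary choices for illustration):

```python
import numpy as np

# Hypothetical interval [a, b] and number of subintervals N
a, b, N = 0.0, 1.0, 10
dx = (b - a) / N
x = a + dx * np.arange(N + 1)  # grid points x_0 = a, ..., x_N = b
```

Note the grid has $N + 1$ points: two boundary points and $N - 1$ interior points.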

Recall the central finite differences for approximating the first and second derivatives of $f$.

\eqalign{ f'(x_i) &= \frac{f(x_{i+1}) - f(x_{i-1})}{2\Delta x} + O(\Delta x^2)\cr f''(x_i) &= \frac{f(x_{i+1}) + f(x_{i-1}) - 2f(x_i)}{\Delta x^2} + O(\Delta x^2) } \label{bvp_cfd}
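These two formulas are easy to sanity-check numerically. A minimal sketch (the helper names are mine; $\sin$ is used only because its derivatives are known exactly):

```python
import numpy as np

def central_first(f, x, dx):
    """Central-difference approximation to f'(x), accurate to O(dx^2)."""
    return (f(x + dx) - f(x - dx)) / (2 * dx)

def central_second(f, x, dx):
    """Central-difference approximation to f''(x), accurate to O(dx^2)."""
    return (f(x + dx) + f(x - dx) - 2 * f(x)) / dx**2

# Compare against the exact derivatives of sin: sin' = cos, sin'' = -sin
x0, dx = 1.0, 1e-3
err1 = abs(central_first(np.sin, x0, dx) - np.cos(x0))
err2 = abs(central_second(np.sin, x0, dx) + np.sin(x0))
```

Both errors shrink quadratically as `dx` decreases, consistent with the $O(\Delta x^2)$ terms above (until floating-point round-off takes over for very small `dx`).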

Substitute \eqref{bvp_cfd} into \eqref{bvp_ode}, abbreviating $f_i = f(x_i)$, $p_i = p(x_i)$, and so on, to get the following system of equations:

$$\frac{f_{i+1} + f_{i-1} - 2f_i}{\Delta x^2} = p_i f_i + q_i \frac{f_{i+1} - f_{i-1}}{2 \Delta x} + r_i \label{bvp_algebra}$$

Equation \eqref{bvp_algebra} holds for $1 \leq i \leq N-1$, which gives $N - 1$ equations. Both $f_0 = f(a)$ and $f_N = f(b)$ are given, leaving $N-1$ unknowns for $N-1$ equations.

## Solving the Equations

There are many ways to solve a system of equations like the one in equation \eqref{bvp_algebra}. Here is one.

Let:

\eqalign{ \alpha_i &= 1 + \frac{\Delta x}{2} q_i \cr \beta_i &= -2 - \Delta x^2 p_i \cr \gamma_i &= 1 - \frac{\Delta x}{2} q_i }

Rewrite equation \eqref{bvp_algebra} for $1 \leq i \leq N-1$:

$$\alpha_i f_{i-1} + \beta_i f_i + \gamma_i f_{i+1} = \Delta x^2 r_i \label{bvp_rewrite}$$

Remember the boundary conditions $f_0$ and $f_N$ are given. Simplify for $i = 1$:

$$\beta_1 f_1 + \gamma_1 f_2 = \Delta x^2 r_1 - \alpha_1 f_0 \label{i_one}$$

and for $i = N - 1$:

$$\alpha_{N-1} f_{N-2} + \beta_{N-1} f_{N-1} = \Delta x^2 r_{N-1} - \gamma_{N - 1} f_N \label{i_Ntakeone}$$

Now construct the following matrices from the system of equations in \eqref{bvp_rewrite}. Let $M$ be the tridiagonal matrix of left-hand-side coefficients, let $c$ be a column vector containing the right-hand sides, and let $f$ be a column vector containing $f_i$ for $0 < i < N$.

$$M = \begin{pmatrix} \beta_1 & \gamma_1 & 0 & 0 & \cdots &0 \cr \alpha_2 & \beta_2 & \gamma_2 & 0 & \cdots & 0 \cr 0 & \alpha_3 & \beta_3 & \gamma_3 & \cdots & 0 \cr \vdots & \vdots & \vdots & \vdots & \ddots & \vdots \cr 0 & 0 & \dots & \alpha_{N-2} & \beta_{N-2} & \gamma_{N-2} \cr 0 & 0 & 0 & \dots & \alpha_{N-1} & \beta_{N-1} \cr \end{pmatrix}, c = \begin{pmatrix} \Delta x^2 r_1 - \alpha_1 f_0 \cr \Delta x^2 r_2 \cr \vdots \cr \Delta x^2 r_{N-2} \cr \Delta x^2 r_{N-1} - \gamma_{N-1}f_N \end{pmatrix}, f = \begin{pmatrix} f_1 \cr f_2 \cr \vdots \cr f_{N-2} \cr f_{N-1} \end{pmatrix}$$

Solving $Mf = c$ for $f$:

$$f = M^{-1}c \label{solve_bvp}$$

In practice, don't form $M^{-1}$ explicitly: $M$ is tridiagonal, so a linear solver (ideally a banded one) is both faster and more numerically stable than explicit inversion.
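The whole procedure can be sketched as one NumPy function. This is an illustrative implementation, not code from the chapter: `solve_bvp` is a name I've chosen, and the test problem $f'' = -f$ with $f(0) = 0$, $f(\pi/2) = 1$ (exact solution $\sin x$) is picked so the result can be checked:

```python
import numpy as np

def solve_bvp(p, q, r, a, b, fa, fb, N):
    """Solve f'' = p(x) f + q(x) f' + r(x) with f(a) = fa, f(b) = fb
    on an N-subinterval grid, via the tridiagonal system M f = c."""
    dx = (b - a) / N
    x = a + dx * np.arange(N + 1)
    xi = x[1:-1]                        # interior points x_1 .. x_{N-1}

    alpha = 1 + (dx / 2) * q(xi)
    beta = -2 - dx**2 * p(xi)
    gamma = 1 - (dx / 2) * q(xi)

    # Tridiagonal M: beta on the diagonal, alpha below, gamma above
    M = np.diag(beta) + np.diag(alpha[1:], -1) + np.diag(gamma[:-1], 1)
    c = dx**2 * r(xi)
    c[0] -= alpha[0] * fa               # fold the boundary values into c
    c[-1] -= gamma[-1] * fb

    f = np.empty(N + 1)
    f[0], f[-1] = fa, fb
    f[1:-1] = np.linalg.solve(M, c)     # solve M f = c; no explicit inverse
    return x, f

# Test problem: f'' = -f, f(0) = 0, f(pi/2) = 1  =>  f(x) = sin(x)
x, f = solve_bvp(lambda x: -np.ones_like(x),    # p(x) = -1
                 lambda x: np.zeros_like(x),    # q(x) = 0
                 lambda x: np.zeros_like(x),    # r(x) = 0
                 0.0, np.pi / 2, 0.0, 1.0, N=100)
err = np.max(np.abs(f - np.sin(x)))
```

With $N = 100$ the maximum error against $\sin x$ is on the order of $\Delta x^2$, as the central-difference truncation terms predict. For large $N$, replacing the dense `np.linalg.solve` with a banded solver such as `scipy.linalg.solve_banded` avoids storing the mostly-zero matrix.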