Ordinary Differential Equations/One-dimensional first-order linear equations

Definition
One-dimensional first-order inhomogeneous linear ODEs are ODEs of the form
 * $$x'(t) + f(t) x(t) = g(t)$$

for suitable (that is, mostly continuous) functions $$f, g: \mathbb R \to \mathbb R$$; note that when $$g \equiv 0$$, we have a homogeneous equation instead.

General solution
First we note that we have the following superposition principle: If we have a solution $$x_h$$ ("$$h$$" standing for "homogeneous") of the problem
 * $$x_h'(t) + f(t) x_h(t) = 0$$

(which is nothing but the homogeneous problem associated to the above ODE) and a solution $$x_p$$ to the actual problem, that is, a function $$x_p$$ such that
 * $$x_p'(t) + f(t) x_p(t) = g(t)$$

("$$p$$" standing for "particular solution", indicating that this is only one of the many possible solutions), then the function
 * $$x(t) := a x_h(t) + x_p(t)$$ ($$a \in \mathbb R$$ arbitrary)

still solves $$x'(t) + f(t) x(t) = g(t)$$, just like the particular solution $$x_p$$ does. This is proved by computing the derivative of $$x$$ directly.
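As a sanity check, the superposition principle can be verified symbolically; here is a minimal sketch using sympy, with the illustrative choice $$f(t) = 1$$, $$g(t) = t$$ (this specific example is ours, not from the text):

```python
import sympy as sp

t, a = sp.symbols('t a')

# Illustrative choice (not from the text): f(t) = 1, g(t) = t.
f = sp.Integer(1)
g = t

x_h = sp.exp(-t)      # solves the homogeneous equation x_h' + x_h = 0
x_p = t - 1           # one particular solution of x_p' + x_p = t
x = a * x_h + x_p     # superposition with arbitrary a

# The residual x' + f*x - g vanishes identically, for every a:
residual = sp.simplify(sp.diff(x, t) + f * x - g)
print(residual)       # 0
```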

In order to obtain the solutions to the ODE under consideration, we first solve the related homogeneous problem; that is, first we look for $$x_h$$ such that
 * $$x_h'(t) + f(t) x_h(t) = 0 \Leftrightarrow x_h' = - f(t) x_h$$.

It may seem surprising, but this actually gives a very quick path to the general solution, as follows. Separation of variables (and using $$\ln^{-1} = \exp$$) gives
 * $$x_h(t) = \exp\left( - \int_{t_0}^t f(s) ds \right)$$,

since the function
 * $$G(t) := - \int_{t_0}^t f(s) ds$$

is an antiderivative of $$t \mapsto -f(t)$$. Thus we have found a solution to the related homogeneous problem.
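This homogeneous solution can be checked symbolically even for a generic $$f$$, keeping the integral unevaluated; a minimal sympy sketch:

```python
import sympy as sp

t, s, t0 = sp.symbols('t s t_0')
f = sp.Function('f')

# x_h(t) = exp(-∫_{t0}^t f(s) ds), kept as an unevaluated integral
x_h = sp.exp(-sp.Integral(f(s), (s, t0, t)))

# Differentiating under the integral sign gives x_h' = -f(t) * x_h,
# so the residual of the homogeneous equation vanishes:
residual = sp.simplify(sp.diff(x_h, t) + f(t) * x_h)
print(residual)   # 0
```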

For the determination of a solution $$x_p$$ to the actual equation, we now use an Ansatz: Namely we assume
 * $$x_p(t) = c(t) x_h(t)$$,

where $$c: \mathbb R \to \mathbb R$$ is a function. This Ansatz is called variation of constants and is due to Leonhard Euler. Let's see what condition on $$c$$ ensures that $$x_p$$ is a solution. We want
 * $$x_p'(t) + f(t) x_p(t) = g(t)$$, that is (by the product rule and inserting $$x_h$$):
 * $$x_p'(t) + f(t) x_p(t) = c'(t) \exp\left( - \int_{t_0}^t f(s) ds \right) - c(t) f(t) \exp\left( - \int_{t_0}^t f(s) ds \right) + f(t) c(t) \exp\left( - \int_{t_0}^t f(s) ds \right) = c'(t) \exp\left( - \int_{t_0}^t f(s) ds \right) = g(t)$$,

since the last two terms of the expansion cancel.

Moving the exponential to the other side, we obtain
 * $$c'(t) = g(t) \exp\left( \int_{t_0}^t f(s) ds \right)$$

or
 * $$c(t) = \int_{t_0}^t g(r) \exp\left( \int_{t_0}^r f(s) ds \right) dr + C_1$$.
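To make the formula for $$c$$ concrete, here is a sympy sketch with the illustrative choice $$f(t) = 1$$, $$g(t) = t$$, $$t_0 = 0$$, $$C_1 = 0$$ (these specific values are ours, not from the text):

```python
import sympy as sp

t, r = sp.symbols('t r')

# Illustrative choice (not from the text): f(t) = 1, g(t) = t, t_0 = 0.
f = sp.Integer(1)
g = t

# c(t) = ∫_0^t g(r) * exp(∫_0^r f(s) ds) dr with C_1 = 0; here ∫_0^r f ds = r.
c = sp.integrate(g.subs(t, r) * sp.exp(r), (r, 0, t))
x_h = sp.exp(-t)
x_p = sp.expand(c * x_h)   # simplifies to t - 1 + exp(-t)

# Verify that x_p solves the inhomogeneous equation:
residual = sp.simplify(sp.diff(x_p, t) + f * x_p - g)
print(residual)   # 0
```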

Since all the manipulations we did are reversible, all functions of the form
 * $$C_2 \exp\left( - \int_{t_0}^t f(s) ds \right) + \left( \int_{t_0}^t g(r) \exp\left( \int_{t_0}^r f(s) ds \right) dr + C_1 \right) \exp\left( - \int_{t_0}^t f(s) ds \right)$$ ($$C_1, C_2 \in \mathbb R$$ arbitrary)

are solutions. If we set $$C := C_2 + C_1$$, we get the general solution form
 * $$C \exp\left( - \int_{t_0}^t f(s) ds \right) + \left( \int_{t_0}^t g(r) \exp\left( \int_{t_0}^r f(s) ds \right) dr \right) \exp\left( - \int_{t_0}^t f(s) ds \right)$$.
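The general solution formula can be packaged as a small helper and verified symbolically; in the sympy sketch below, the function name and the test case $$f(t) = 2$$, $$g(t) = 2t$$, $$t_0 = 0$$ are illustrative choices of ours:

```python
import sympy as sp

def general_solution(f, g, t, t0, C):
    """Build C*exp(-∫f) + (∫ g*exp(∫f) dr) * exp(-∫f) per the formula above."""
    r, s = sp.symbols('r s')
    F = sp.integrate(f.subs(t, s), (s, t0, t))      # ∫_{t0}^t f(s) ds
    Fr = sp.integrate(f.subs(t, s), (s, t0, r))     # ∫_{t0}^r f(s) ds
    inner = sp.integrate(g.subs(t, r) * sp.exp(Fr), (r, t0, t))
    return C * sp.exp(-F) + inner * sp.exp(-F)

t, C = sp.symbols('t C')

# Illustrative check (not from the text): x' + 2x = 2t, t_0 = 0.
x = general_solution(sp.Integer(2), 2 * t, t, 0, C)
residual = sp.simplify(sp.diff(x, t) + 2 * x - 2 * t)
print(residual)   # 0, for every value of C
```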

We now want to prove that these are all the solutions to the equation under consideration. Thus, set
 * $$x_C(t) := C \exp\left( - \int_{t_0}^t f(s) ds \right) + \left( \int_{t_0}^t g(r) \exp\left( \int_{t_0}^r f(s) ds \right) dr \right) \exp\left( - \int_{t_0}^t f(s) ds \right)$$

and let $$x_2(t)$$ be any other solution to the inhomogeneous problem under consideration. Then $$x_C - x_2$$ solves the homogeneous problem, for
 * $$x_C'(t) - x_2'(t) + f(t) (x_C(t) - x_2(t)) = x_C'(t) + f(t) x_C(t) - (x_2'(t) + f(t) x_2(t)) = g(t) - g(t) = 0$$.

Thus, if we prove that all the homogeneous solutions (and in particular the difference $$x_C - x_2$$) are of the form
 * $$C \exp\left( - \int_{t_0}^t f(s) ds \right)$$,

then we may subtract
 * $$D \exp\left( - \int_{t_0}^t f(s) ds \right)$$

from $$x_C - x_2$$ for an appropriate $$D \in \mathbb R$$ to obtain zero; hence $$x_2 = x_C - D \exp\left( - \int_{t_0}^t f(s) ds \right) = x_{C - D}$$ is of the desired form.

Thus, let $$x_h$$ be any solution to the homogeneous problem. Consider the function
 * $$t \mapsto x_h(t) \cdot \exp\left( \int_{t_0}^t f(s) ds \right)$$.

We differentiate this function and obtain by the product rule
 * $$x_h'(t) \exp\left( \int_{t_0}^t f(s) ds \right) + f(t) \exp\left( \int_{t_0}^t f(s) ds \right) x_h(t) = -f(t) x_h(t) \exp\left( \int_{t_0}^t f(s) ds \right) + f(t) \exp\left( \int_{t_0}^t f(s) ds \right) x_h(t) = 0$$

since $$x_h$$ is a solution to the homogeneous problem. Hence, the function is constant (that is, equal to a constant $$C \in \mathbb R$$), and solving
 * $$x_h(t) \cdot \exp\left( \int_{t_0}^t f(s) ds \right) = C$$

for $$x_h$$ gives the claim.

We have thus arrived at:

Theorem 3.1: The solutions of the ODE $$x'(t) + f(t) x(t) = g(t)$$ are exactly the functions
 * $$x_C(t) = C \exp\left( - \int_{t_0}^t f(s) ds \right) + \left( \int_{t_0}^t g(r) \exp\left( \int_{t_0}^r f(s) ds \right) dr \right) \exp\left( - \int_{t_0}^t f(s) ds \right)$$ ($$C \in \mathbb R$$ arbitrary).

Note that imposing a condition $$x(t_0) = x_0$$ for some $$x_0 \in \mathbb R$$ enforces $$C = x_0$$ (the integral term vanishes at $$t = t_0$$ and the exponential equals $$1$$ there), whence we get a unique solution for each initial condition.
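This uniqueness under an initial condition can be illustrated with sympy's dsolve; the IVP below, $$x' + x = t$$ with $$x(0) = 2$$, is an illustrative choice of ours, not from the text:

```python
import sympy as sp

t = sp.symbols('t')
x = sp.Function('x')

# Illustrative IVP (not from the text): x' + x = t with x(0) = 2.
ode = sp.Eq(x(t).diff(t) + x(t), t)
sol = sp.dsolve(ode, x(t), ics={x(0): 2})

# The initial condition pins down the constant; the unique solution
# is x(t) = t - 1 + 3*exp(-t).
print(sp.simplify(sol.rhs - (t - 1 + 3 * sp.exp(-t))))   # 0
```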

Exercises

 * Exercise 3.2.1: First prove that $$\frac{d}{dt} \ln(t^2) = \frac{2}{t}$$. Then solve the ODE $$x'(t) + \frac{2}{t} x(t) = \frac{1}{t^2}$$ for a function defined on $$[1, \infty)$$ such that $$x(1) = c$$ for $$c \in \mathbb R$$ arbitrary. Use that a similar version of theorem 3.1 holds when $$f, g$$ are only defined on a proper subset of $$\mathbb R$$; this is because the proof carries over.

Clever Ansatz for polynomial RHS
First note that RHS means "Right Hand Side". Let's consider the special case of a 1-dim. first-order linear ODE
 * $$x'(t) + c x(t) = a_i t^i$$ ($$c \in \mathbb R$$ arbitrary),

where we used the Einstein summation convention; that is, $$a_i t^i$$ stands for $$\sum_{i=0}^m a_i t^i$$ for some $$m \in \mathbb N$$. In the notation from above, we have $$f(t) \equiv c$$ and $$g(t) = a_i t^i$$.

Using separation of variables, the solution to the corresponding homogeneous problem ($$g \equiv 0$$) is easily seen to equal $$x_h(t) = C \exp(-c t)$$ for some constant $$C \in \mathbb R$$.

To find a particular solution $$x_p$$, we proceed as follows. We make the Ansatz that $$x_p$$ is simply a polynomial; that is,
 * $$x_p(t) = b_i t^i$$

for certain coefficients $$b_i$$. Inserting this Ansatz into the ODE and comparing the coefficients of equal powers of $$t$$ yields a linear system for the $$b_i$$, which is solvable whenever $$c \neq 0$$ (starting with $$b_m = a_m / c$$ and working downwards).
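The coefficient comparison behind this Ansatz can be carried out with sympy; the right-hand side below is an illustrative choice of ours, not from the text:

```python
import sympy as sp

t = sp.symbols('t')
b0, b1, b2 = sp.symbols('b0 b1 b2')

# Illustrative case (not from the text): x' + 2x = 4t^2 + 1 with the
# Ansatz x_p = b2*t^2 + b1*t + b0; comparing coefficients of each power
# of t gives a linear system for the b_i.
c = 2
g = 4 * t**2 + 1
x_p = b2 * t**2 + b1 * t + b0

residual = sp.expand(sp.diff(x_p, t) + c * x_p - g)
coeffs = sp.solve([residual.coeff(t, k) for k in range(3)], [b0, b1, b2])
print(coeffs[b2], coeffs[b1], coeffs[b0])   # 2 -2 3/2
```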

Exercises

 * Exercise 3.3.1: Find all solutions to the ODE $$x'(t) + 2x(t) = 2t^2 + 4t + 3$$. (Hint: What does theorem 3.1 say about the number of solutions to that problem with a given fixed initial condition?)