Numerical Methods Qualification Exam Problems and Solutions (University of Maryland)/August 2009

Symmetry
$$ \begin{align} \langle p, q \rangle &= ( p(A)v_1, q(A)v_1 )\\ &= ( q(A)v_1, p(A)v_1) \\ &= \langle q, p \rangle \end{align} \!\,$$

Linearity of 1st Argument
Let $$\alpha \in \mathbb{R} \!\,$$

$$ \begin{align} \langle \alpha p, q \rangle &= ( \alpha p(A)v_1, q(A)v_1) \\ &= \alpha (p(A)v_1, q(A)v_1) \\ &= \alpha \langle p, q \rangle \end{align} \!\,$$

$$ \begin{align} \langle p+r, q \rangle &= ( (p(A)+r(A))v_1, q(A)v_1 ) \\ &= ( p(A)v_1, q(A)v_1) + (r(A)v_1, q(A)v_1) \\ &= \langle p, q \rangle + \langle r, q \rangle \end{align} \!\,$$

Positive Definiteness
$$ \begin{align} \langle p, p \rangle &= ( \underbrace{p(A)v_1}_{\hat{v}}, \underbrace{p(A)v_1}_{\hat{v}} ) \mbox{ where } \hat{v} \in \mathbb{R}^n \\ &= (\hat{v}, \hat{v} ) \\ &= \sum_{i=1}^n (\hat{v}_i)^2 \geq 0 \end{align} \!\,$$

"Zeroness"
We also need to show that $$ \langle p, p \rangle =0 \!\,$$ if and only if $$p =0 \!\,$$.

Forward Direction (alt)
Suppose $$ p\neq 0$$. It suffices to show $$\langle p,p\rangle\neq 0$$. This is a consequence of the fact that $$ p(A)v_1\neq 0$$, which holds because $$ p(A)\neq 0$$ for $$p\neq 0$$ of degree less than $$n$$, and because $$ v_1$$ does not lie in the orthogonal complement of any of the $$ n$$ distinct eigenvectors of $$ A$$.

Forward Direction
Claim: If $$\langle p, p \rangle =0 \!\,$$, then $$p=0 \!\,$$.

By hypothesis,

$$v_1 = \sum_{i=1}^n \alpha_i u_i \!\,$$

where $$u_i \!\,$$ are the orthogonal eigenvectors of $$A \!\,$$ and all $$\alpha_i \!\,$$ are non-zero

$$ \begin{align} \langle p, p \rangle &= (p(A)v_1, p(A)v_1) \\ &= (p(A) \sum_{i=1}^n \alpha_i u_i, p(A) \sum_{i=1}^n \alpha_i u_i) \\ &= ( \sum_{i=1}^n \beta_i u_i, \sum_{i=1}^n \beta_i u_i) \mbox{ (since } u_i \mbox{ are eigenvectors )} \\ &= 0 \mbox{ (by hypothesis) } \end{align} \!\,$$

Notice that $$\beta_i = \alpha_i p(m_i) \!\,$$, which depends on $$\alpha_i \!\,$$, the coefficients of the polynomial $$p \!\,$$, and the eigenvalue $$m_i \!\,$$ associated with the eigenvector $$u_i \!\,$$, i.e. $$A u_i = m_i u_i \!\,$$.

Since the $$u_i \!\,$$ are orthogonal and nonzero, every $$\beta_i = \alpha_i p(m_i) \!\,$$ must vanish, and since $$\alpha_i \neq 0 \!\,$$ for all $$i \!\,$$, $$p \!\,$$ vanishes at the $$n \!\,$$ distinct eigenvalues $$m_i \!\,$$. A nonzero polynomial of degree less than $$n \!\,$$ cannot have $$n \!\,$$ distinct roots, so $$p =0 \!\,$$.

Reverse Direction
If $$p=0 \!\,$$, then $$\langle p, p \rangle = (\underbrace{p(A)}_0v_1, \underbrace{p(A)}_0v_1)=0 \!\,$$
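These inner-product axioms are easy to confirm numerically. The sketch below is my own illustration (the diagonal test matrix, the vector, and the helper names `apply_poly` and `inner` are all hypothetical choices, not part of the exam): it represents a polynomial by its coefficient list and evaluates $$\langle p, q \rangle = (p(A)v_1, q(A)v_1)$$ directly.

```python
import numpy as np

# Hypothetical small example: symmetric A with distinct eigenvalues,
# and v1 with a nonzero component along every eigenvector.
A = np.diag([1.0, 2.0, 3.0])
v1 = np.array([1.0, 1.0, 1.0])

def apply_poly(coeffs, A, v):
    """Evaluate p(A) v where p(t) = c0 + c1 t + c2 t^2 + ... via repeated products."""
    result = np.zeros_like(v)
    power = v.copy()                 # holds A^k v
    for c in coeffs:
        result = result + c * power
        power = A @ power
    return result

def inner(p, q):
    """<p, q> = (p(A) v1, q(A) v1)."""
    return apply_poly(p, A, v1) @ apply_poly(q, A, v1)

p, q = [1.0, 2.0], [0.0, 1.0, 1.0]   # p(t) = 1 + 2t, q(t) = t + t^2
assert np.isclose(inner(p, q), inner(q, p))               # symmetry
assert np.isclose(inner([2.0, 4.0], q), 2 * inner(p, q))  # linearity in first argument
assert inner(p, p) > 0                                    # positive definiteness (p != 0)
assert np.isclose(inner([0.0], [0.0]), 0.0)               # <0, 0> = 0
```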

Solution 1b
By induction.

Base Case
$$ \begin{align} \beta_2 v_2 &= Av_1 - \alpha_1 v_1 - \beta_1 \underbrace{v_0}_0 \\ v_2 &= \frac{1}{\beta_2} [Av_1 -\alpha_1 v_1 ] \\ v_2 &= p_1 (A) v_1 \quad p_1(t)=\frac{1}{\beta_2}[t-\alpha_1] \end{align} \!\,$$

$$ \begin{align} \beta_3 v_3 &= Av_2 - \alpha_2 v_2 - \beta_2 v_1 \\ v_3 &= \frac{1}{\beta_3}[ \underbrace{Av_2}_{= A p_1(A)v_1} - \underbrace{\alpha_2 v_2}_{= \alpha_2 p_1(A)v_1} - \underbrace{\beta_2 v_1}_{= \beta_2 p_0(A)v_1}] \\ &= p_{2}(A)v_1 \mbox{ for a polynomial } p_2 \mbox{ of degree } 2 \end{align} \!\,$$

Induction Step
Claim: $$v_j = p_{j-1}(A) v_1 \!\,$$

Hypothesis:

Suppose

$$ v_{j} = p_{j-1}(A)v_1 \!\,$$

$$ v_{j-1} = p_{j-2}(A)v_1 \!\,$$

where $$p_{j-1}$$ (respectively $$p_{j-2}$$) has degree $$j-1$$ (respectively $$j-2$$). Then for $$\beta_{j+1}\neq 0$$

$$ v_{j+1} = \frac{A p_{j-1}(A) - \alpha_j p_{j-1}(A) - \beta_j p_{j-2}(A)}{\beta_{j+1}} v_1 $$

which is $$p_j(A)v_1 \!\,$$ for a polynomial $$p_j \!\,$$ of degree $$j \!\,$$, as desired.
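The claim $$v_j = p_{j-1}(A)v_1$$ can be checked numerically by running the three-term recurrence and testing that each $$v_j$$ lies in the Krylov space $$\mathrm{span}\{v_1, Av_1, \ldots, A^{j-1}v_1\}$$. The matrix, seed, and sizes below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n))
A = M + M.T                        # hypothetical symmetric test matrix
v1 = rng.standard_normal(n)
v1 = v1 / np.linalg.norm(v1)

# Three-term recurrence: beta_{j+1} v_{j+1} = A v_j - alpha_j v_j - beta_j v_{j-1}
V = [np.zeros(n), v1]              # V[0] plays the role of v_0 = 0
betas = [0.0, 0.0]                 # betas[j] = beta_j; beta_1 multiplies v_0 = 0
for j in range(1, n):
    w = A @ V[j]
    alpha = w @ V[j]               # alpha_j = (A v_j, v_j)
    w = w - alpha * V[j] - betas[j] * V[j - 1]
    betas.append(np.linalg.norm(w))
    V.append(w / betas[-1])

# v_j = p_{j-1}(A) v1 means v_j is in span{v1, A v1, ..., A^{j-1} v1}
for j in range(1, n + 1):
    K = np.column_stack([np.linalg.matrix_power(A, k) @ v1 for k in range(j)])
    c, *_ = np.linalg.lstsq(K, V[j], rcond=None)
    assert np.linalg.norm(K @ c - V[j]) < 1e-6   # zero residual: v_j in the span
```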

Solution 1c
Since $$v_j = p_{j-1}(A)v_1 \!\,$$ and $$\langle p, q \rangle \equiv (p(A)v_1, q(A)v_1) \!\,$$, it is equivalent to show that $$(v_i, v_j)=0 \!\,$$ for $$i \neq j \!\,$$.

Since


 * $$v_{j+1}=\frac{1}{\beta_{j+1}}[Av_j - \alpha_j v_j - \beta_j v_{j-1}] \!\,$$,

it is then sufficient to show that

$$(v_{j+1}, v_j)=(v_{j+1},v_{j-1})=0 \!\,$$

Claim $$ (v_{j+1},v_j)=0 \quad \forall j \!\,$$
By induction.

Base Case
$$ \begin{align} v_2 &= \frac{1}{\beta_2}(Av_1 - \alpha_1 v_1) \\ (v_2,v_1) &= \frac{1}{\beta_2}[(Av_1,v_1)-\underbrace{\alpha_1}_{(Av_1,v_1)} \underbrace{(v_1,v_1)}_1] \\ &= 0 \end{align} \!\,$$

Induction Step
Assume: $$(v_j,v_{j-1})=0 \!\,$$

Claim: $$(v_{j+1},v_j)=0 \!\,$$

$$ \begin{align} v_{j+1} &=\frac{1}{\beta_{j+1}}[Av_j-\alpha_jv_j - \beta_jv_{j-1}] \\ (v_{j+1},v_j)&=\frac{1}{\beta_{j+1}}[(Av_j,v_j)-\underbrace{\alpha_j}_{(Av_j,v_j)}\underbrace{(v_j,v_j)}_1-\beta_j \underbrace{(v_{j-1},v_j)}_0]\\ &=0 \end{align} \!\,$$

Claim $$(v_{j+1},v_{j-1})=0 \!\,$$
By induction.

Base Case
$$ \begin{align} v_3 &= \frac{1}{\beta_3}[Av_2-\alpha_2v_2-\beta_2v_1] \\ (v_3,v_1) &= \frac{1}{\beta_3}[(Av_2,v_1)-\alpha_2\underbrace{(v_2,v_1)}_0-\beta_2\underbrace{(v_1,v_1)}_1] \\ &= \frac{1}{\beta_3}[(Av_2,v_1)-\beta_2] \\ &= \frac{1}{\beta_3}[(v_2,Av_1)-\beta_2] \mbox{ (because A symmetric) } \\ &= \frac{1}{\beta_3}[ \beta_2 -\beta_2] =0 \mbox{ (see below) } \end{align} \!\,$$

$$ \begin{align} \beta_2v_2 &= Av_1-\alpha_1 v_1 \\ \beta_2\underbrace{(v_2,v_2)}_1 &= (Av_1,v_2) -\alpha_1 \underbrace{(v_1, v_2)}_0 \\ \beta_2 &= (Av_1,v_2) \end{align} \!\,$$

Induction Step
Assume: $$(v_j,v_{j-2})=0 \!\,$$

Claim: $$(v_{j+1},v_{j-1})=0\!\,$$

$$ \begin{align} (v_{j+1},v_{j-1}) &= \frac{1}{\beta_{j+1}}[ (Av_j,v_{j-1})-\alpha_j \underbrace{(v_j,v_{j-1})}_0 - \beta_j \underbrace{(v_{j-1},v_{j-1})}_1 ] \\ &= \frac{1}{\beta_{j+1}} [ (Av_j,v_{j-1})-\beta_j ] \\ &= \frac{1}{\beta_{j+1}} [ (v_j, Av_{j-1})-\beta_j ] \mbox{ (because } A \mbox{ symmetric)} \\ &= \frac{1}{\beta_{j+1}} [ \beta_j - \beta_j ] = 0 \mbox{ (see below) } \end{align} \!\,$$

$$ \begin{align} \beta_jv_j &= Av_{j-1}-\alpha_{j-1}v_{j-1}-\beta_{j-1}v_{j-2} \\ \beta_j \underbrace{(v_j,v_j)}_1 &= (Av_{j-1},v_j) -\alpha_{j-1}\underbrace{(v_{j-1},v_j)}_0 - \beta_{j-1}\underbrace{(v_{j-2},v_j)}_0 \\ \beta_j &= (Av_{j-1},v_j) \end{align} \!\,$$
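The orthogonality just proved can be observed in floating point as well. This sketch (random symmetric matrix and seed are arbitrary choices) runs the recurrence for $$n$$ steps and checks that the resulting vectors are pairwise orthogonal and unit length:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
M = rng.standard_normal((n, n))
A = M + M.T                       # hypothetical symmetric test matrix
v = rng.standard_normal(n)
v = v / np.linalg.norm(v)

V = [np.zeros(n), v]              # v_0 = 0, v_1 normalized
beta_prev = 0.0
for j in range(1, n):
    w = A @ V[j]
    alpha = w @ V[j]              # alpha_j = (A v_j, v_j)
    w = w - alpha * V[j] - beta_prev * V[j - 1]
    beta_prev = np.linalg.norm(w)
    V.append(w / beta_prev)

W = np.column_stack(V[1:])        # columns v_1, ..., v_n
# (v_i, v_j) = 0 for i != j and (v_j, v_j) = 1
assert np.allclose(W.T @ W, np.eye(n), atol=1e-8)
```

(In exact arithmetic the Gram matrix is exactly the identity; in floating point, classical Lanczos slowly loses orthogonality, which is why the tolerance is loose.)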

Rewrite given equation on specific interval
For a specific interval $$[t_k, t_{k+1}] \!\,$$, we have from hypothesis

$$(\frac{t_{k+1}-t_k}{2}) (f(t_k)+f(t_{k+1}))- \int_{t_k}^{t_{k+1}} f(t) dt = \int_{t_k}^{t_{k+1}} G(t) f'(t) dt \!\,$$.

Distributing and rearranging terms gives

$$ (1) \qquad (\frac{t_{k+1}-t_k}{2})f(t_k)+(\frac{t_{k+1}-t_k}{2})f(t_{k+1})-\int_{t_k}^{t_{k+1}} f(t) dt = \int_{t_k}^{t_{k+1}}G(t)f'(t) dt \!\,$$

Apply Hint
Starting with the hint and applying product rule, we get

$$ \int_{t_k}^{t_{k+1}} (f(t)G(t))'dt = \int_{t_k}^{t_{k+1}} f'(t)G(t)dt + \int_{t_k}^{t_{k+1}} f(t)G'(t)dt \!\,$$.

Also, we know from the Fundamental Theorem of Calculus

$$ \int^{t_{k+1}}_{t_k} (f(t)G(t))'dt = f(t_{k+1})G(t_{k+1}) - f(t_k)G(t_k) \!\,$$.

Setting the above two equations equal to each other and solving for $$ \int_{t_k}^{t_{k+1}} f'(t) G(t) dt \!\,$$ yields

$$ (2) \qquad - G(t_k)f(t_k) + G(t_{k+1})f(t_{k+1}) - \int_{t_k}^{t_{k+1}} G'(t)f(t)dt = \int_{t_k}^{t_{k+1}} G(t)f'(t) dt\!\,$$

Choose G'(t)
Let $$ G'(t) = 1 \!\,$$. Then $$G(t) \!\,$$ is linear:

$$ (3) \qquad G(t) = t + b \!\,$$

By comparing equations (1) and (2) we see that

$$ G(t_{k+1}) = \frac{t_{k+1}-t_{k}}{2}\!\,$$ and

$$G(t_k) = - \frac{t_{k+1}-t_{k}}{2}\!\,$$.

Plugging either $$ G(t_{k})\!\,$$ or $$ G(t_{k+1})\!\,$$ into equation (3), we get that

$$ b = -\frac{t_{k+1} + t_k}{2}\!\,$$

Hence

$$ G(t) = t - \frac{t_{k+1} + t_k}{2}\!\,$$

Solution 3a
$$ \begin{align} &\quad \int_a^b \rho(x) r_j(x) r_k(x) dx \\ &= \int_a^b \rho(x) \prod_{i\neq j} \frac{x-x_i}{x_j-x_i} \prod_{i \neq k} \frac{x-x_i}{x_k-x_i} dx \\ &= \int_a^b \overbrace{\frac{\prod_{i\neq k, i\neq j} (x-x_i) }{\prod_{i \neq j} (x_j-x_i)}}^{Q_{n-2}} \overbrace{\frac{\prod_{i=1}^n (x-x_i)}{\prod_{i \neq k} (x_k -x_i)}}^{Q_n} dx\\ &= 0 \mbox{ (since } Q_n \mbox{ is orthogonal to all polynomials of degree less than } n \mbox{ with respect to } \rho \mbox{)} \end{align} \!\,$$
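A concrete numerical instance (my choice, not part of the problem): take $$\rho(x) = 1/\sqrt{1-x^2}$$ on $$[-1,1]$$ and $$Q_n = T_4$$, the degree-4 Chebyshev polynomial, whose roots supply the nodes $$x_i$$. Substituting $$x = \cos\theta$$ cancels the weight's endpoint singularity, so the weighted integrals become plain integrals over $$[0,\pi]$$:

```python
import numpy as np

n = 4
# Roots of the Chebyshev polynomial T_4 (orthogonal for rho(x) = 1/sqrt(1-x^2))
nodes = np.cos((2 * np.arange(1, n + 1) - 1) * np.pi / (2 * n))

def r(k, x):
    """Lagrange cardinal polynomial for node k: r_k(x_i) = 1 if i == k else 0."""
    out = np.ones_like(x)
    for i, xi in enumerate(nodes):
        if i != k:
            out = out * (x - xi) / (nodes[k] - xi)
    return out

# With x = cos(theta), dx = -sin(theta) dtheta cancels rho, so
# int rho r_j r_k dx = int_0^pi r_j(cos t) r_k(cos t) dt
theta = np.linspace(0.0, np.pi, 100001)
x = np.cos(theta)
dt = np.diff(theta)
for j in range(n):
    for k in range(j + 1, n):
        y = r(j, x) * r(k, x)
        integral = np.sum((y[:-1] + y[1:]) / 2 * dt)   # composite trapezoid
        assert abs(integral) < 1e-8                     # weighted orthogonality
```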

Solution 3b
$$ \begin{align} &\quad \sum_{k=1}^n \int_a^b \rho(x) (r_k(x))^2 dx \\ &= \int_a^b \rho(x) \sum_{k=1}^n (r_k(x))^2 dx \\ &= \int_a^b \rho(x) (\sum_{k=1}^n r_k (x) )^2 dx \mbox{ (from part a) }\\ &= \int_a^b \rho(x) (1 )^2 dx \mbox{ (from claim) }\\ &= \int_a^b \rho(x) dx \end{align} \!\,$$

Claim
$$ \sum_{k=1}^n r_k(x) =1 \!\,$$

Proof
Since $$r_k \!\,$$ is a polynomial of degree $$n-1 \!\,$$ for all $$k \!\,$$, $$\sum_{k=1}^n r_k(x) \!\,$$ is a polynomial of degree $$n-1 \!\,$$.

Notice that $$\sum_{k=1}^n r_k(x_i)=1 \!\,$$ for $$i=1,2,\ldots,n \!\,$$, where the $$x_i \!\,$$ are the $$n \!\,$$ distinct roots of $$Q_n \!\,$$. The polynomial $$\sum_{k=1}^n r_k(x) - 1 \!\,$$ has degree at most $$n-1 \!\,$$ but vanishes at $$n \!\,$$ distinct points, so it is identically zero and

$$ \sum_{k=1}^n r_k(x) =1 \!\,$$
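The claim is easy to observe numerically. In this sketch the nodes are an arbitrary set of distinct points (my choice for illustration); the sum of the cardinal polynomials is 1 everywhere, not just at the nodes:

```python
import numpy as np

nodes = np.array([0.0, 0.3, 0.7, 1.0])   # any n distinct nodes x_i

def r(k, x):
    """Lagrange cardinal polynomial r_k of degree n-1: r_k(x_i) = delta_{ik}."""
    out = np.ones_like(x)
    for i, xi in enumerate(nodes):
        if i != k:
            out = out * (x - xi) / (nodes[k] - xi)
    return out

# sum_k r_k should be identically 1, including far from the nodes
x = np.linspace(-1.0, 2.0, 301)
total = sum(r(k, x) for k in range(len(nodes)))
assert np.allclose(total, 1.0)
```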