12

Question: Let $V$ be the vector space of polynomials over $\mathbf{R}$ of degree less than or equal to 3, with the inner product $$ (f|g) = \int_0^1 f(t)g(t)\, dt. $$ If $t$ is a real number, find the polynomial $g_t$ in $V$ such that $(f|g_t) = f(t)$ for all $f$ in $V$.

My Attempt: I let $f(x) = a_0 + a_1x + a_2x^2 + a_3x^3$ and $g_t(x) = b_0 + b_1x + b_2x^2 + b_3x^3$. Since $\int_0^1 x^j x^k\, dx = \frac{1}{1+j+k}$, $$(f|g_t) = \sum_{j, k} \frac{1}{1 + j + k} a_j b_k. $$

Since $(f|g_t) = f(t) = \sum_j a_j t^j$ must hold for every choice of the $a_j$, comparing the coefficient of each $a_j$ gives $$t^j = \sum_k \frac{1}{1 + j + k}b_k.$$

Let $A$ be the (symmetric) matrix with entries $A_{kj} = \frac{1}{1 + j + k}$, so $$ (b_0, b_1, b_2, b_3)A = (1, t, t^2, t^3). $$

Thus $$(b_0, b_1, b_2, b_3) = (1, t, t^2, t^3)A^{-1}.$$

I could compute $A^{-1}$, which I believe would give the answer, but it seems like a lot of work, and it would not use any of the material from the chapter. I am assuming there is a much easier way to do this.
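For what it's worth, this route can be carried out exactly in rational arithmetic, since $A$ is the $4\times 4$ Hilbert matrix. A minimal sketch in Python (the helper names `invert`, `g_coeffs`, and `inner` are mine, not from the book):

```python
from fractions import Fraction

# The matrix from the question: A[j][k] = 1/(1 + j + k),
# which is exactly the 4x4 Hilbert matrix.
n = 4
A = [[Fraction(1, 1 + j + k) for k in range(n)] for j in range(n)]

def invert(M):
    """Invert a rational matrix by Gauss-Jordan elimination (exact)."""
    m = len(M)
    aug = [row[:] + [Fraction(int(i == r)) for i in range(m)]
           for r, row in enumerate(M)]
    for col in range(m):
        piv = next(r for r in range(col, m) if aug[r][col] != 0)
        aug[col], aug[piv] = aug[piv], aug[col]
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        for r in range(m):
            if r != col:
                factor = aug[r][col]
                aug[r] = [x - factor * y for x, y in zip(aug[r], aug[col])]
    return [row[m:] for row in aug]

Ainv = invert(A)

def g_coeffs(t):
    """Coefficients (b_0, ..., b_3) of g_t, i.e. b = A^{-1}(1, t, t^2, t^3)."""
    v = [Fraction(t) ** j for j in range(n)]
    return [sum(Ainv[k][j] * v[j] for j in range(n)) for k in range(n)]

def inner(f, g):
    """(f|g) = int_0^1 f(t)g(t) dt for polynomials as coefficient lists."""
    return sum(f[j] * g[k] * Fraction(1, 1 + j + k)
               for j in range(len(f)) for k in range(len(g)))

# Spot check: (f|g_t) should equal f(t); try f(x) = 1 + 2x - x^3 at t = 1/2.
t = Fraction(1, 2)
f = [Fraction(c) for c in (1, 2, 0, -1)]
b = g_coeffs(t)
assert inner(f, b) == sum(c * t ** j for j, c in enumerate(f))
```

Because everything stays in `Fraction`, the defining property $(f|g_t) = f(t)$ can be checked exactly, with no floating-point tolerance.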

The chapter is called "Linear Functionals and Adjoints" from Linear Algebra by Hoffman and Kunze.

EDIT: I think the way the chapter wanted me to do this was the following.

Find an orthonormal basis using Gram-Schmidt, say $f_1, f_2, f_3, f_4$. Then let $L_t(f) = f(t)$.

We can then let $$g_t = L_t(f_1)f_1 + L_t(f_2)f_2 + L_t(f_3)f_3 + L_t(f_4)f_4.$$

Then write $f = a_1f_1 + a_2f_2 + a_3f_3 + a_4f_4$. Since the $f_i$ are orthonormal, all cross terms in $(f|g_t)$ vanish:

$$ \begin{align*} (f| g_t) &= a_1L_t(f_1)(f_1| f_1) + a_2L_t(f_2)(f_2| f_2) + a_3L_t(f_3)(f_3| f_3) + a_4L_t(f_4)(f_4| f_4) \\ &= L_t(a_1f_1 + a_2f_2 + a_3f_3 + a_4f_4) = L_t(f) = f(t). \end{align*}$$
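This recipe can be checked mechanically. One small twist: normalizing in Gram-Schmidt introduces square roots, so the sketch below keeps the basis merely orthogonal and divides by $(e_i|e_i)$ at the end, which yields the same $g_t$ while staying in exact rational arithmetic (Python; the helper names are my own, not from the book):

```python
from fractions import Fraction

def inner(f, g):
    """(f|g) = int_0^1 f(t)g(t) dt for polynomials as coefficient lists."""
    return sum(f[j] * g[k] * Fraction(1, 1 + j + k)
               for j in range(len(f)) for k in range(len(g)))

def evaluate(p, t):
    """Evaluate a coefficient list at t; this is the functional L_t."""
    return sum(c * t ** j for j, c in enumerate(p))

# Gram-Schmidt on {1, x, x^2, x^3}, skipping normalization so that
# everything stays rational.
basis = [[Fraction(int(i == j)) for i in range(4)] for j in range(4)]
ortho = []
for v in basis:
    w = v[:]
    for e in ortho:
        c = inner(v, e) / inner(e, e)
        w = [wi - c * ei for wi, ei in zip(w, e)]
    ortho.append(w)

def g_t(t):
    """g_t = sum_i L_t(e_i) e_i / (e_i|e_i), same as the orthonormal formula."""
    coeffs = [Fraction(0)] * 4
    for e in ortho:
        c = evaluate(e, t) / inner(e, e)
        coeffs = [gi + c * ei for gi, ei in zip(coeffs, e)]
    return coeffs

# Check the defining property (f|g_t) = f(t) on a sample f and t.
t = Fraction(1, 3)
f = [Fraction(c) for c in (2, -1, 0, 5)]
assert inner(f, g_t(t)) == evaluate(f, t)
```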

The computation is still more than I want to do, but the ideas are all there. I guess this approach draws on the linear-functional part of the chapter rather than the adjoint part.

zrbecker

Comments:
  • The trick is that you (hopefully) know the determinant of a Cauchy matrix. Now, not only the matrix $A$ itself, but also each of its minors is a Cauchy matrix, and thus computing the adjoint of $A$ is easy using the formula for the determinant of a Cauchy matrix. Now that you have the adjoint of $A$, you can get the inverse $A^{-1}$ by dividing it through the determinant of $A$ (which is, as I said, a determinant of a Cauchy matrix as well). This simplifies the computation of $A^{-1}$ a lot. Of course, nobody forces you to use adjoints (unless you do the degree $n$ generalization!). – darij grinberg Dec 03 '11 at 23:35
  • I'll read up on Cauchy matrices. I have never heard of them before. – zrbecker Dec 03 '11 at 23:38
  • More specifically, what you have is a special case of a Cauchy matrix, called the Hilbert matrix. – J. M. ain't a mathematician Dec 04 '11 at 03:32
  • Yes, but the minors won't be Hilbert matrices (in general). – darij grinberg Dec 04 '11 at 05:26
  • Thanks, I found another way to go about the problem. I think my mistake was focusing too much on the chapter title about adjoints. I am sure I can go about it the Cauchy matrix way, but I don't really understand it as of right now. – zrbecker Dec 04 '11 at 08:07

1 Answer

3

This is a classic application of the Riesz Representation Theorem in a finite dimensional setting. For clarity, let's restate the theorem in this context. (Google for more general versions.)

Riesz Representation Theorem: Let $V$ be a finite dimensional vector space over $\mathbb{R}$ and $\langle\cdot,\cdot\rangle$ be an inner product on $V$. Then for every linear functional $\ell:V\to\mathbb{R}$, there is a unique $g_\ell\in V$ such that $\ell(f)=\langle f,g_\ell\rangle$ for all $f\in V$.

In other words, under certain assumptions, every linear functional can be "represented" (uniquely) as an inner product of the input against some "special" (but fixed) member of $V$.

In your problem, $V=\mathbb{P}_3$ with the $L^2$ inner product on $[0,1]$ given above, and you are looking for the Riesz "representer" $g_\ell$ for the so-called evaluation functional $\ell(f):=f(t_0)$, where $t_0\in\mathbb{R}$ is arbitrary but fixed.

So how do we determine the unique Riesz representer $g_\ell$? To answer this, let $\{e_1,\dots,e_n\}$ be an orthonormal basis for $V$. (For example, pick your favorite basis for $V$, then Gram-Schmidt it to obtain an orthonormal basis.)

Claim: $g_\ell=\sum_{i=1}^n \ell(e_i)e_i$ is the (unique) Riesz representer for the linear functional $\ell$, i.e., $\ell(f)=\langle f,g_\ell \rangle$ for all $f\in V$.

To verify the claim, let $f\in V$. Then we can write $f=\sum_{i=1}^n c_ie_i$, and since $\langle e_i,e_j\rangle=\delta_{ij}$, $$\langle f,g_\ell\rangle = \left\langle \sum_{i=1}^n c_i e_i,\sum_{j=1}^n \ell(e_j)e_j\right\rangle= \sum_{i,j=1}^n c_i \ell(e_j)\langle e_i,e_j\rangle = \sum_{i=1}^n c_i \ell(e_i) = \ell\left(\sum_{i=1}^n c_ie_i\right) = \ell(f).$$ (I'll leave uniqueness to you.)

Again, to bring all of this back to your particular context: you will need an orthonormal basis for $\mathbb{P}_3$ (for example, the shifted Legendre polynomials), and you need to recognize that the functional in your question is precisely the evaluation functional.
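To make that last step concrete, here is a small Python sketch using the shifted Legendre polynomials on $[0,1]$, namely $1$, $2x-1$, $6x^2-6x+1$, $20x^3-30x^2+12x-1$, whose squared norms are $1/(2n+1)$; scaling by $\sqrt{2n+1}$ makes them orthonormal (the helper names are mine):

```python
import math

# Shifted Legendre polynomials on [0,1], as coefficient lists (lowest degree
# first); multiplying the n-th one by sqrt(2n+1) gives an orthonormal basis
# for (f|g) = int_0^1 f(t)g(t) dt.
shifted_legendre = [
    [1.0],
    [-1.0, 2.0],
    [1.0, -6.0, 6.0],
    [-1.0, 12.0, -30.0, 20.0],
]
onb = [[math.sqrt(2 * n + 1) * c for c in p]
       for n, p in enumerate(shifted_legendre)]

def evaluate(p, t):
    """Evaluate a coefficient list at t; this is the functional L_t."""
    return sum(c * t ** j for j, c in enumerate(p))

def inner(f, g):
    # (f|g) computed via the monomial moments int_0^1 x^(j+k) dx = 1/(j+k+1).
    return sum(f[j] * g[k] / (1 + j + k)
               for j in range(len(f)) for k in range(len(g)))

def riesz_representer(t):
    """g_t = sum_n L_t(e_n) e_n, exactly the claim in the answer."""
    g = [0.0] * 4
    for e in onb:
        c = evaluate(e, t)
        for j, ej in enumerate(e):
            g[j] += c * ej
    return g

# Check (f|g_t) = f(t) for f(x) = 1 - x + 4x^3 at t = 0.7.
t = 0.7
f = [1.0, -1.0, 0.0, 4.0]
g = riesz_representer(t)
assert abs(inner(f, g) - evaluate(f, t)) < 1e-9
```

This is floating point, so the check uses a tolerance; the exact version would run the same construction over the rationals.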

Hope that helps.

JohnD