
The spectrum of a linear operator $L: \mathcal{D}(L) \rightarrow \mathcal{X}$ is generally defined for $\mathcal{X}$ a Banach space (as seen, for example, in the Wikipedia article linked above, in the Wikipedia article on spectral decomposition, or in this question or this answer).

Why is this? Why don't we define the spectrum more generally for operators between normed spaces? Where do we need the completeness?

glS
    I don't see how one would define the spectrum of an operator $L\colon \mathcal{X}\to\mathcal{Y}$ unless $\mathcal{X}\subset\mathcal{Y}$, for you consider $\lambda\operatorname{id} - L$ to define the spectrum. Anyway, without completeness, a continuous bijection need not have a continuous inverse, and a linear map with closed graph need not be continuous. I guess these things would make the theory for incomplete spaces awkward. – Daniel Fischer Sep 29 '14 at 22:09
    To add on to @DanielFischer's comment: you could have a bounded operator on a Hilbert space that is invertible on a dense subset but not on the whole Hilbert space (this is very intimately related to the continuous spectrum). So in some sense you could have false negatives for spectral values. – Cameron Williams Sep 29 '14 at 22:29
  • corrected the post to have $L$ defined on a single space. @CameronWilliams could you show an example of such an operator? – glS Sep 30 '14 at 05:13

2 Answers


Set aside the spectrum of unbounded operators, which is a very special definition that doesn't quite fit into the general framework. The general definition of the spectrum goes as follows...


The basic ingredient for spectral theory in general is a unital algebra $1\in\mathcal{A}$.

Denote the set of invertible elements by $\mathcal{A}^\ast$.

Then, the spectrum of an element is nothing but: $$A\in\mathcal{A}:\quad\sigma(A):=\{\lambda\in\mathbb{C}:A-\lambda 1\notin\mathcal{A}^\ast\}$$
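For a concrete instance of this abstract definition (a standard finite-dimensional example, added for illustration): take $\mathcal{A}=M_n(\mathbb{C})$, the unital algebra of $n\times n$ matrices. A matrix $A-\lambda 1$ is invertible exactly when its determinant is nonzero, so the spectrum is the familiar set of eigenvalues: $$\sigma(A)=\{\lambda\in\mathbb{C}:\det(A-\lambda 1)=0\}$$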


Now, the bounded linear operators on a Banach space form a unital Banach algebra: $$1\in\mathcal{B}(E):=\{T:E\to E:T\text{ bounded, linear}\}$$

It is important, though, to have an identity operator; that forces the target space to agree with the domain! The bounded linear operators between two different Banach spaces form a Banach space, but composition is not defined on them, so they do not form an algebra and, in particular, have no identity: $$1\notin\mathcal{B}(E,F)$$

Note also that it is not the Banach space itself that is the structure being studied, but rather the algebra of operators acting on the Banach space...

Of course, one could just as well consider the bounded linear operators on a normed space: $$1\in\mathcal{B}(X):=\{T:X\to X:T\text{ bounded, linear}\}$$ or one could even consider the merely linear operators on a vector space: $$1\in\mathcal{L}(V):=\{T:V\to V:T\text{ linear}\}$$

However, the former lacks completeness, which turns out to be tremendously important, and the latter has no topological structure at all.
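To make explicit where completeness enters (a standard sketch of the C. Neumann series argument, added for illustration): if $E$ is a Banach space then $\mathcal{B}(E)$ is complete, and for $\|T\|<1$ the partial sums of the geometric series satisfy $$\left\|\sum_{n=N}^{M}T^n\right\|\le\sum_{n=N}^{M}\|T\|^n\longrightarrow 0\quad(N,M\to\infty)$$ so they form a Cauchy sequence and converge in $\mathcal{B}(E)$, yielding $$(1-T)^{-1}=\sum_{n=0}^{\infty}T^n$$ Consequently, for $|\lambda|>\|T\|$ one has $(\lambda 1-T)^{-1}=\lambda^{-1}(1-T/\lambda)^{-1}$, which shows the spectrum lies in the closed disk of radius $\|T\|$; the same series argument shows the resolvent set is open and the resolvent is analytic there. Without completeness of $\mathcal{B}(E)$ these Cauchy sequences need not converge, and all of these conclusions fail.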

C-star-W-star
    Ok, so if I understand what you are saying: we usually require the spaces to be complete for lack of interesting results when this is not the case, not because the definition itself is not meaningful in that case.

    The case of unbounded operators is very special because there we don't even properly have an algebra (due e.g. to the domain problems), and certainly not a normed one. In this case we therefore make an ad hoc definition, and again we require completeness to obtain interesting results.

    Is this correct?

    – glS Sep 30 '14 at 10:51
  • @glance: Yes, this is very correct!!!!! – C-star-W-star Sep 30 '14 at 10:55
  • @glance: By the way, the definition for the unbounded operators follows some very precise principles I can explain to you later if you wish - but unfortunately I gotta go now... – C-star-W-star Sep 30 '14 at 10:58
  • Thank you very much. I also appreciated the note that the requirement of having the target space agree with the domain is a consequence of the identity operator used in the definition of the spectrum. One more thing: could you provide one (or some) example(s) of crucial results needing completeness, and therefore lacking in an "only normed spaces" theory? – glS Sep 30 '14 at 10:59
  • @glance: Yes, sure, but later in the evening... – C-star-W-star Sep 30 '14 at 12:11
  • @glance: Basically everything that is related to the C. Neumann series breaks down and that is reeeaaally a lot, e.g. "the resolvent is continuous, differentiable, analytic etc.", "the spectrum is nonempty", "the spectrum is closed", "the spectrum is contained in a disk of radius blah" and so on... (One should of course bear in mind that the unbounded operators again constitute a special case for which at least around every second statement still applies.) – C-star-W-star Sep 30 '14 at 16:52
  • @glance: A nice example is the unital but not normable algebra of complex polynomials. There the resolvent set is immediately empty -quite bad!- as soon as the polynomial is nonconstant. – C-star-W-star Sep 30 '14 at 16:52

Spectral Theory evolved with the thought in mind of expanding functions in eigenfunctions of some operator. Eigenfunctions and eigenvalues make no sense if you consider $L : X\rightarrow Y$ because $Lf=\lambda f$ makes no sense if $X\ne Y$. You can't have discrete eigenvalues and discrete eigenfunction expansions, or approximate eigenvalues and continuous (integral) eigenfunction expansions, because the statements no longer make sense if $X\ne Y$.

J. Dieudonné in A History of Functional Analysis details how Spectral Theory came out of looking at the ordinary differential equations arising from Fourier's separation of variables. The separation parameter gave an eigenvalue equation $$ Lf=\lambda f. $$ For finite intervals one had discrete values $\lambda_{n}$ for which non-zero solutions $f_{n}$ existed, and it was found that these eigenfunctions were mutually 'orthogonal' in an integral sense whenever the eigenvalues were different. This led to a general expansion $$ f = \sum_{n} \frac{(f,f_{n})}{(f_{n},f_{n})}f_{n} $$ where the 'inner-product' pairing was $$ (f,g) = \int_{a}^{b}f(t)g(t)\,w(t)\,dt $$ for some weight $w$. Using such expansions greatly simplified the process of solving the original partial differential equations; using such functions amounted to a diagonalization of the operator. (By the way, systematic matrix diagonalization came out of these ODE theories, not the other way around.)
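As a concrete illustration of this scheme (the standard textbook example, not taken from Dieudonné's account): for $$L=-\frac{d^{2}}{dt^{2}}$$ on $[0,\pi]$ with boundary conditions $f(0)=f(\pi)=0$ and weight $w\equiv 1$, the eigenvalue equation $Lf=\lambda f$ has nonzero solutions exactly for $$\lambda_{n}=n^{2},\qquad f_{n}(t)=\sin(nt),\qquad n=1,2,3,\dots$$ Since $(f_{n},f_{n})=\int_{0}^{\pi}\sin^{2}(nt)\,dt=\pi/2$, the general expansion above becomes the classical Fourier sine series $$f=\sum_{n=1}^{\infty}\frac{2}{\pi}\left(\int_{0}^{\pi}f(t)\sin(nt)\,dt\right)\sin(nt).$$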

Disintegrating By Parts
  • Interesting last remark ;) – C-star-W-star Sep 30 '14 at 16:54
    @Freeze_S : Yes, it's an odd thing that infinite-dimensional techniques evolved first, including generalized Fourier series in orthogonal functions. It took about 50 years after Sturm-Liouville Theory to come to a general notion of inner-product and orthogonality. – Disintegrating By Parts Sep 30 '14 at 17:00
    But somehow it makes sense - you also wouldn't develop group theory in order to count your cows ;) – C-star-W-star Sep 30 '14 at 17:27
  • But you might have thought that the general Cauchy-Schwarz inequality would have come to light before the 1880's, especially after playing with orthogonality, Parseval's equality, general weighted orthogonality conditions and orthogonal function expansions since the early 1800's. But then again, it took 50 years to figure out to put the spout on the bottom of the ketchup bottle. Abstraction is slow. Other times we get surprised by the genius of a few people who stand head and shoulders above the rest. – Disintegrating By Parts Sep 30 '14 at 17:46
  • Hmm yes that's weird... I think understanding the essential points is crucial for the right abstraction (not just the systematic one). So it will always take some time. By the way who did introduce the scalar product - Hilbert? – C-star-W-star Sep 30 '14 at 18:10
  • @Freeze_S : Fourier and Sturm were using the scalar inner-product to isolate Fourier coefficients by orthogonality. Fourier used this extensively and rather generally in his ~1805 work. Mathematicians throughout the 1800's were using the same ideas. Sums of squares and square integrability naturally arose. Bessel's inequality and Parseval's equality were studied by 1830(?). Cauchy(-Schwarz) was known by Cauchy for $\mathbb{R}^{n}$, but not for integrals until Schwarz 1885. Hilbert may deserve most credit for studying functions as points in a space with geometry and similar generalizations. – Disintegrating By Parts Sep 30 '14 at 18:54
  • I seem to remember reading that Bunyakowsky treated the inequality for integrals by the 1850s... whence my preference for "Cauchy-Schwarz-Bunyakowsky", despite its length. Probably we should mention Steklov and Bocher's work in the 1890s, that in J. Lutzen's essay on "Sturm-Liouville theory" is asserted to be the point that all the gaps/issues in basic "Sturm-Liouville theory" were taken care of. – paul garrett Jun 15 '15 at 19:24
  • @paulgarrett The inequality for integrals was treated around 1859 by Buniakowsky, but because of a lack of application, it went unnoticed. Then H. A. Schwarz rediscovered and used the inequality in his studies to solve PDEs through variational methods. The integral inequality was treated by both authors as a generalization of Cauchy's inequality for finite sums, but Schwarz' work was remembered because he used the inequality for something significant. – Disintegrating By Parts Jun 16 '15 at 08:15
  • @TrialAndError, that would account for it, indeed. Do you happen to know, then, what-if-anything Bunyakowsky himself did with the inequality? Or just treated it as a generalization of the sum scenario? – paul garrett Jun 16 '15 at 12:29
  • @paulgarrett : It is my recollection that he did not do anything with the inequality, but my memory is not always reliable. – Disintegrating By Parts Jun 16 '15 at 12:44