
Exercise 4 after $\S 91$ of Paul R. Halmos's Finite-Dimensional Vector Spaces (second edition) invites the reader to prove or disprove the following assertion.

> If $A$ is a linear transformation (operator) on a finite-dimensional unitary space, then a necessary and sufficient condition that $A^n \rightarrow \mathbb 0$ is that all the proper values (Eigenvalues) of $A$ be (strictly) less than $1$ in absolute value.

For reference, $\S 91$ of the book identifies the symbolic expression $\Vert A_n - A \Vert \rightarrow 0$ with the verbal assertion that "a sequence $(A_n)$ of linear transformations converges to a fixed linear transformation $A$". Also, $\S 88$ of the book defines the norm $\Vert \cdot \Vert$ of a linear transformation $A$ on an inner product space $\mathcal V$ by $\Vert A \Vert = \sup\big\{\Vert Ax \Vert : x \in \mathcal V,\ \Vert x \Vert = 1\big\}$.
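
(Not part of Halmos's text: a quick numerical sanity check of this definition that I ran, with an arbitrary matrix of my own. For the norm of $\S 88$ on a finite-dimensional space, the supremum over unit vectors equals the largest singular value, which is what numpy's matrix 2-norm computes, so the two numbers below should agree.)

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0.5, 1.0],
              [0.0, 0.3]])  # arbitrary example matrix

# Estimate sup{ ||Ax|| : ||x|| = 1 } by sampling many random unit vectors.
xs = rng.normal(size=(2, 100_000))
xs /= np.linalg.norm(xs, axis=0)            # normalise each column to unit length
sup_estimate = np.linalg.norm(A @ xs, axis=0).max()

# For this norm the supremum equals the largest singular value of A.
print(sup_estimate, np.linalg.norm(A, 2))   # the two values should be close
```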


I am able to prove the necessity (hopefully correctly), but I am stuck on the sufficiency and would appreciate help.

Proof of the necessity: Let $\mathcal V$ be the unitary space in question, let $m$ be the dimension of $\mathcal V$, and let $\lambda_1, \cdots, \lambda_m$ be the Eigenvalues (not necessarily distinct) of $A$. It is clear from a consideration of the triangular form of $A$ w.r.t. a suitable basis of $\mathcal V$ that $\lambda_1^n, \cdots, \lambda_m^n$ are the Eigenvalues (not necessarily distinct) of $A^n$. If $A^n \rightarrow \mathbb 0$ (operator), then $\left\Vert A^n \right\Vert \rightarrow 0$ (scalar). Because the magnitude of an Eigenvalue of a linear transformation $B$ is at most the norm of $B$, we have $\left\vert \lambda_i \right\vert^n = \left\vert \lambda_i^n \right\vert \leq \left\Vert A^n \right\Vert$ for $i = 1, \cdots, m$. It follows that $\left\vert \lambda_i \right\vert^n \rightarrow 0$, which forces $\left\vert \lambda_i \right\vert < 1$.
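
(Again not from the book, just a numerical check I ran of the inequality $\left\vert \lambda_i^n \right\vert \leq \left\Vert A^n \right\Vert$ used above; the matrix is an arbitrary upper triangular example of my own.)

```python
import numpy as np

# Arbitrary example whose Eigenvalues 0.9 and 0.8 have modulus < 1.
A = np.array([[0.9, 2.0],
              [0.0, 0.8]])
eigvals = np.linalg.eigvals(A)

An = np.eye(2)
for n in range(1, 60):
    An = An @ A                         # now An = A^n
    norm_An = np.linalg.norm(An, 2)     # operator (spectral) norm of A^n
    # |lambda_i^n| = |lambda_i|^n is bounded by ||A^n|| for every i
    assert all(abs(lam) ** n <= norm_An + 1e-12 for lam in eigvals)
print("n =", n, " ||A^n|| =", norm_An)  # small, consistent with A^n -> 0
```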

Unsuccessful attempt at the sufficiency: Keeping the notation of the above proof, suppose now that $\left\vert \lambda_i \right\vert < 1$ for $i = 1, \cdots, m$. A consideration of the triangular form of $A$ w.r.t. a suitable basis of $\mathcal V$ suggests that $A^n \rightarrow N$ for some transformation $N$ all of whose Eigenvalues are zero, so that $N$ would be nilpotent; but I am unable to justify that $A^n$ converges at all.

  • Halmos wrote "strictly less than $1$". You cannot change the condition $|\lambda|<1$ to $|\lambda|\le1$. This should be obvious: if $Av=\lambda v\ne0$ with $|\lambda|=1$, then $\|A^kv\|=\|v\|$ for every positive integer $k$; therefore $A^kv$ does not converge to zero and neither does $A^k$. – user1551 Nov 16 '21 at 17:50
  • The statement in the block quote is true. For sufficiency, see my answer to another question. – user1551 Nov 16 '21 at 20:27
  • @user1551 Yes, it was a mistake to write $\vert \lambda \vert \leq 1$ in the title of the post. Thanks for catching it. – AMathStudent Nov 17 '21 at 00:44

1 Answer


EDIT: It has been pointed out to me that you are interested in operators on complex inner product spaces (I missed the adjective "unitary"), where we consider the full complex spectrum. In this case you can use the Schur decomposition, as done in this answer, to obtain the sufficiency.
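
(A rough numerical sketch of why that works, not a proof; the matrix below is an arbitrary construction of mine. $A$ is built to be unitarily similar to an upper triangular $T$ whose diagonal entries, i.e. the eigenvalues, all have modulus $<1$, and $\|A^n\|$ decays even though $\|A\|$ itself typically exceeds $1$.)

```python
import numpy as np

rng = np.random.default_rng(1)

# Upper triangular T with prescribed diagonal (eigenvalues) of modulus < 1,
# and a random unitary Q from the QR factorisation of a random complex matrix.
T = np.triu(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)), k=1)
np.fill_diagonal(T, [0.9, 0.8j, -0.7, 0.5 + 0.5j])     # all |lambda| < 1
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))
A = Q @ T @ Q.conj().T                                  # A = Q T Q*

print("eigenvalues:", np.round(np.linalg.eigvals(A), 3))
print("||A||      :", np.linalg.norm(A, 2))             # typically > 1
for n in (1, 5, 10, 50, 100, 200):
    print(n, np.linalg.norm(np.linalg.matrix_power(A, n), 2))
# The printed norms decay toward 0, as the sufficiency direction predicts.
```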

I shall keep my counterexample for real spaces and real spectra though, as I think it is informative as to why we need to look at the full complex spectrum. Without further restrictions on $A$, the condition is not sufficient for real vector spaces if we only take real eigenvalues into account. Set $$A=\begin{pmatrix}\cos \theta&-\sin\theta & 0 \\ \sin\theta & \cos\theta & 0\\ 0 & 0 & 1/2\end{pmatrix},$$ where $\theta$ is some small angle that is an irrational multiple of $\pi$. All this transformation does is rotate vectors about the $z$ axis by $\theta$ and shrink their $z$ component by a factor of $1/2$. This means that the only real eigenvalue of $A$ is $\lambda=1/2<1$, with corresponding normalised eigenvector $v=\begin{pmatrix}0 & 0 & 1\end{pmatrix}^T$. However, for any $n\in \mathbb N$ we have $$A^n=\begin{pmatrix}\cos n\theta&-\sin n\theta & 0 \\ \sin n\theta & \cos n\theta & 0\\ 0 & 0 & 1/2^n\end{pmatrix},$$ so $\|A^n\|\geq 1$ for all $n$, because for any nonzero vector $u=(x,y,0)^T$ we have $\|A^nu\|=\|u\|$. Thus $A^n$ does not converge to $0$, even though the only real eigenvalue of $A$ lies strictly inside the unit disc.
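
(A quick numerical check of this; the specific value of $\theta$ is my own arbitrary choice for illustration. The real eigenvalue is $1/2$ while the complex pair $e^{\pm i\theta}$ sits on the unit circle, and $\|A^n\|$ never drops below $1$.)

```python
import numpy as np

theta = 0.3                          # any nonzero angle illustrates the point
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           0.5]])

print(np.linalg.eigvals(A))          # 0.5 and the complex pair exp(+-i*theta)
for n in (1, 10, 100, 1000):
    An = np.linalg.matrix_power(A, n)
    print(n, np.linalg.norm(An, 2))  # stays at 1 (up to rounding): A^n does not -> 0
```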

This isn't a counterexample to the general statement, because $e^{\pm i\theta}$ are complex eigenvalues of $A$ lying on the unit circle, and hence there is no contradiction.

K.Power
  • The vector space in the problem statement is a unitary space, that is, a complex inner product space. Your counterexample doesn't work in this setting. – user1551 Nov 16 '21 at 18:27
  • @user1551 Thank you! I did not spot this. I shall keep my answer, because I quite like it as an example showing why we need to consider the full complex spectrum and not just its real part. I have edited the post accordingly, and linked to your answer on the other post. – K.Power Nov 17 '21 at 00:28