
I am trying to see why this is true. A book I am reading has this claim without any verification and I'm trying to see why it is true.

Let $G$ be an $n\times n$ matrix all of whose eigenvalues have nonzero real part. Let $x\in \mathbb R^n \setminus \{0\}$, then the function $t \mapsto \lvert \exp(tG)x \rvert$ is unbounded for $t\in \mathbb R$. (Note: $\exp(tG)$ is the matrix exponential)

Here is what I thought: Well, since $G$ has all its eigenvalues with nonzero real part it follows that I can write $G$ in a Jordan form (with obvious rearranging) so that

$$ G= \begin{pmatrix} G_s & 0\\ 0 & G_u \end{pmatrix} $$

where $G_s$ is a matrix all of whose eigenvalues have negative real part and $G_u$ is a matrix all of whose eigenvalues have positive real part. Then using properties of matrix exponentials I tried to obtain a lower bound of the form $Ce^t|x|$ for $\lvert \exp(tG)x \rvert$ so that as $t\to \infty$ we have the required unboundedness.

'Intuitively' the positive real parts of the eigenvalues of $G_u$ should make this map unbounded (but my intuition may be wrong), yet I can't seem to write this precisely. Am I even on the right track? Can anyone give some pointers as to what I should do?
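For what it's worth, a quick numerical sanity check supports the claim. The sketch below (pure Python; the diagonal matrix $G = \operatorname{diag}(-1, 2)$ and the test vectors are my own toy choices, not from the book) shows that the norm blows up forwards in time whenever $x$ has any component in the unstable direction, and backwards in time otherwise:

```python
import math

# Toy example: diagonal G = diag(-1, 2), so exp(tG) = diag(e^{-t}, e^{2t})
# and |exp(tG) x|^2 = e^{-2t} x_1^2 + e^{4t} x_2^2.
def norm_exp_tG_x(t, x, eigs=(-1.0, 2.0)):
    return math.sqrt(sum((math.exp(t * lam) * xi) ** 2
                         for lam, xi in zip(eigs, x)))

# Even a tiny component along the unstable eigenvector blows up as t -> +oo:
print(norm_exp_tG_x(30.0, (1.0, 1e-6)))   # huge
# A vector purely in the stable direction decays as t -> +oo,
# but blows up as t -> -oo:
print(norm_exp_tG_x(-30.0, (1.0, 0.0)))   # huge
```

So unboundedness over all of $t \in \mathbb R$ is the right thing to expect: the stable part blows up backwards in time, the unstable part forwards.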

Cousin
  • 3,525

2 Answers


You can find $P$ so that $PGP^{-1}=T=\begin{pmatrix}\lambda_1 & * & *\\&\ddots & *\\&&\lambda_n\end{pmatrix}$ where the $\lambda_i$s are the eigenvalues and $*$ means "whatever".

Then, $\exp(tG)=\exp(tP^{-1}TP)=P^{-1}\exp(tT)P$.

$\exp(tT)=\begin{pmatrix}e^{t\lambda_1} & * & *\\&\ddots & *\\&&e^{t\lambda_n}\end{pmatrix}$

$\exp(tT)x=\begin{pmatrix}e^{t\lambda_1}x_1+\dots+* x_n\\\vdots\\ e^{t\lambda_n}x_n\end{pmatrix}$

Now, take $k$ to be the largest index with $x_k\not= 0$. You get $\exp(tT)x=\begin{pmatrix}e^{t\lambda_1}x_1+\dots+* x_k\\\vdots\\ e^{t\lambda_k}x_k\\0\\\vdots\\ 0\end{pmatrix}$, whose $k$-th component $e^{t\lambda_k}x_k$ is clearly unbounded (with $t\to+\infty$ or $t\to-\infty$ depending on the sign of $\Re(\lambda_k)$). And since $\exp(tG)x=P^{-1}\exp(tT)(Px)$ with $P$ invertible and $Px\ne0$, the same unboundedness carries over to $\lvert\exp(tG)x\rvert$.
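As a numerical illustration of the triangular computation above (pure Python, truncated power series for the matrix exponential; the matrix $T$ below is my own toy example), the diagonal of $\exp(T)$ is exactly $(e^{\lambda_1}, e^{\lambda_2})$ and the result stays upper triangular:

```python
import math

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm_series(A, terms=40):
    """Truncated power series sum_k A^k / k! -- adequate for small matrices."""
    n = len(A)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    term = [row[:] for row in result]                               # holds A^k / k!
    for k in range(1, terms):
        term = [[v / k for v in row] for row in mat_mul(term, A)]
        result = [[result[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return result

T = [[1.0, 1.0],
     [0.0, 2.0]]          # upper triangular, eigenvalues 1 and 2
E = expm_series(T)
print(E)  # diagonal is (e, e^2); the lower-left entry stays 0
```

The lower-left zero is preserved because products of upper triangular matrices are upper triangular, which is exactly why only the diagonal entries of $T^k$ need to be known to read off the diagonal of $\exp(tT)$.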

xavierm02
  • 7,495
  • Well $\exp(tG)=\exp(P^{-1}(tT)P)$. So you're saying that since $T$ is upper triangular it should be easy to find the matrix exponential $\exp(tT)$. But I only know how to 'easily' exponentiate a diagonal matrix. Is there some trick involved in exponentiating a triangular matrix? :) – Cousin Aug 30 '14 at 18:13
  • Compute $T^2$, $T^3$. You should see a pattern on diagonal elements. And unless I'm mistaken, those are the only ones you need to compute. – xavierm02 Aug 30 '14 at 18:14
  • Well $T^k$ is upper triangular with $\lambda_i^k$ on the diagonal. But then how does that help with taking the norm of $\exp(tT)x$? Can you please clarify? – Cousin Aug 30 '14 at 18:32
  • @JackDawkins : Done. – xavierm02 Aug 30 '14 at 18:46

Nota Bene: In my haste and arrogance I misread the question as written, blithely and blindly assuming that the only case of interest was $t \to \infty$; when my error was graciously pointed out by our colleague PhoemueX (see comments), I deleted my answer until such time as I could correct it. Having done so, I present my rectified answer below; however, I have included my original answer as an appendix in penance for my sin of pride; may our Lady the Queen of Sciences, Mathematics, have mercy on my soul!

"Let $G$ be an $n\times n$ matrix all of whose eigenvalues have nonzero real part. Let $x\in \mathbb R^n \setminus \{0\}$, then the function $t \mapsto \lvert \exp(tG)x \rvert$ is unbounded for $t\in \mathbb R$. (Note: $\exp(tG)$ is the matrix exponential)";

To see this, one need merely look at $e^{tB}$ for a Jordan block $B$ corresponding to an eigenvalue $\lambda$. Since

$B = \lambda I_m + N_m, \tag{1}$

where $I_m$ is the $m \times m$ identity matrix and $N_m = [n_{ij}]$, $1 \le i, j \le m$, is the $m \times m$ matrix with

$n_{ij} = 0, \; \; j \ne i + 1, \tag{2}$

$n_{ij} = 1, \; \; j = i + 1, \tag{3}$

we have, since $[\lambda I_m, N_m] = 0$,

$e^{tB} = e^{t\lambda I_m + tN_m} = e^{t\lambda I_m} e^{tN_m} = e^{t\lambda} e^{tN_m}, \tag{4}$

and thus

$\vert e^{tB}x \vert = \vert e^{\lambda t} e^{tN_m}x \vert = \vert e^{(\Re(\lambda) + i\Im(\lambda)) t} e^{tN_m}x \vert = \vert e^{(\Re(\lambda) + i\Im(\lambda)) t} \vert \vert e^{tN_m}x \vert = \vert e^{\Re(\lambda)t} \vert \vert e^{i\Im(\lambda) t} \vert \vert e^{tN_m}x \vert = e^{t \Re(\lambda)} \vert e^{tN_m} x \vert, \tag{5}$

since $\vert e^{i\Im(\lambda) t} \vert = 1$ and $e^{t \Re(\lambda)}$ is positive real for all values of $\Re(\lambda)t$. The components of the vector $e^{tN_m}x$ are polynomials in $t$; this follows from the fact that $N_m$ is nilpotent, indeed $N_m^m = 0$, so the entries of $e^{tN_m}$ are themselves polynomials in $t$, as are the components of $e^{tN_m}x$. Thus $\vert e^{tN_m}x\vert^2$ is a polynomial in $t$ (here I assume $\vert y \vert^2 = \langle y, y \rangle$ for $y \in \Bbb C^n$, i.e. that $\vert \cdot \vert$ is the ordinary Hermitian norm, the $L^2$ norm on $\Bbb C^n$ for those who like the lingo of functional analysis). Furthermore, $e^{N_m t} x \ne 0$ for $x \ne 0$, by virtue of the fact that $e^{-N_m t}$ is the inverse of $e^{N_m t}$; thus $\vert e^{N_m t}x \vert^2$ is in fact a non-vanishing polynomial, and as such $\lim_{t \to \pm \infty} \vert e^{N_m t}x \vert = c > 0$, a constant, where we allow $c$ to take the value $+\infty$. In the light of these observations, and as a consequence of (5), we have that

$\vert e^{Bt}x \vert = e^{\Re(\lambda)t} \vert e^{N_m t}x \vert \to \infty \;\; \text{as} \; \; t \to \text{sign}(\Re(\lambda)) \infty, \tag{6}$

thus $\vert e^{Bt} x \vert$ is unbounded when $\Re(\lambda) \ne 0$.
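The nilpotent-part claim is easy to check concretely. In the sketch below (pure Python; the $3 \times 3$ block is my own example), $N_3^3 = 0$ forces the exponential series to terminate, so $e^{tN_3} = I + tN_3 + \tfrac{t^2}{2}N_3^2$ exactly and every entry is a polynomial in $t$; the one-parameter-group identity $e^{sN}e^{tN} = e^{(s+t)N}$ can then be verified directly:

```python
# exp(tN) for the 3x3 nilpotent Jordan block N (ones on the superdiagonal):
# N^3 = 0, so the exponential series terminates after the t^2 term and
# every entry is a polynomial in t.
def exp_tN(t):
    return [[1.0, t, t * t / 2.0],
            [0.0, 1.0, t],
            [0.0, 0.0, 1.0]]

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

print(mat_mul(exp_tN(1.0), exp_tN(2.0)))  # equals exp_tN(3.0)
```

The invertibility used above is also visible here: $e^{-tN}$ is the inverse of $e^{tN}$, so $e^{tN}x$ can never vanish for $x \ne 0$.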

Since (6) holds for any Jordan block, $\vert e^{Ft} x \vert$ is unbounded as $t \to \pm \infty$ for any matrix $F$ in Jordan form whose eigenvalues $\mu$ all satisfy $\Re(\mu) \ne 0$. We now invoke the fact that our $G$ is similar to such an $F$: $G = SFS^{-1}$ for some nonsingular $S$, so $e^{Gt} = Se^{Ft}S^{-1}$ and $\vert e^{Gt}x \vert \ge \Vert S^{-1} \Vert^{-1} \vert e^{Ft}S^{-1}x \vert$ with $S^{-1}x \ne 0$; thus we may conclude that $\vert e^{Gt}x \vert$ is unbounded as $t$ ranges over $\Bbb R$.

Note: the fact that $e^{Bt} = e^{\lambda I_m t}e^{N_m t}$ follows from $[\lambda I_m, N_m] = 0$ is very well-known; for details, see my answer to this question.

QED.

Appendix: Original Answer.

I hate to be the bearer of evil tidings, but anyone who tries to prove the assertion

"Let $G$ be an $n\times n$ matrix all of whose eigenvalues have nonzero real part. Let $x\in \mathbb R^n \setminus \{0\}$, then the function $t \mapsto \lvert \exp(tG)x \rvert$ is unbounded for $t\in \mathbb R$. (Note: $\exp(tG)$ is the matrix exponential)"

will have a very rough go of it, because it is false.

Counterexample: take

$G = - I_n, \tag{1}$

where $I_n$ is the $n \times n$ identity matrix. Then for all $x \in \Bbb R^n$,

$\lim_{t \to \infty} \vert e^{tG}x\vert = \lim_{t \to \infty} \vert e^{-tI_n}x \vert = \lim_{t \to \infty} e^{-t} \vert x \vert = 0. \tag{2}$

(2) may be generalized to any $G$ such that $\Re(\lambda) < 0$ for all eigenvalues $\lambda$ of $G$; to see this, one need merely look at $e^{tB}$ for a Jordan block $B$ corresponding to such a $\lambda$. Since

$B = \lambda I_m + N_m, \tag{3}$

where $N_m = [n_{ij}]$, $1 \le i, j \le m$, is the $m \times m$ matrix with

$n_{ij} = 0, \; \; j \ne i + 1, \tag{4}$

$n_{ij} = 1, \; \; j = i + 1, \tag{5}$

we have, since $[\lambda I_m, N_m] = 0$,

$e^{tB} = e^{t\lambda I_m + tN_m} = e^{t\lambda I_m} e^{tN_m} = e^{t\lambda} e^{tN_m}, \tag{6}$

and thus

$\lim_{t \to \infty} \vert e^{tB}x \vert = \lim_{t \to \infty} \vert e^{\lambda t} e^{tN_m}x \vert = \lim_{t \to \infty} e^{t \Re(\lambda)} \vert e^{tN_m} x \vert = 0, \tag{7}$

since the components of the vector $e^{tN_m}x$ are polynomials in $t$; this follows from the fact that $N_m$ is nilpotent, indeed $N_m^m = 0$, so the entries of $e^{tN_m}$ are themselves polynomials in $t$, as are the components of $e^{tN_m}x$. Thus $\vert e^{tN_m}x\vert^2$ is a polynomial in $t$ (here I assume $\vert y \vert^2 = \langle y, y \rangle$ for $y \in \Bbb R^n$, i.e. that $\vert \cdot \vert$ is the ordinary Euclidean norm, the $L^2$ norm on $\Bbb R^n$ for those who like the lingo of functional analysis). Hence for $\Re(\lambda) < 0$, $e^{2\Re(\lambda)t}\vert e^{tN_m}x \vert^2 \to 0$ as $t \to \infty$, since exponentials dominate polynomials as far as behavior at large $t$ is concerned; but then $e^{\Re(\lambda)t}\vert e^{tN_m}x \vert \to 0$ for large $t$ as well. So (7) holds for any Jordan block; thus $\vert e^{Ft} x \vert \to 0$ as $t \to \infty$ for any matrix $F$ in Jordan form whose eigenvalues $\mu$ all satisfy $\Re(\mu) < 0$. We now invoke the fact that our $G$ is similar to such an $F$ provided all the eigenvalues $\lambda$ of $G$ satisfy $\Re (\lambda) < 0$; thus we may conclude that $\lim_{t \to \infty} \vert e^{Gt}x \vert = 0$ whenever every eigenvalue $\lambda$ of $G$ has $\Re(\lambda) < 0$.
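The "exponentials dominate polynomials" step is easy to see numerically. In the sketch below (pure Python; the choice $\Re(\lambda) = -1$ and the $3 \times 3$ nilpotent block are my own), $\vert e^{tB}x \vert = e^{t\Re(\lambda)} \vert e^{tN}x \vert$ with $\vert e^{tN}x \vert$ only polynomial in $t$, so the product is crushed by the decaying exponential:

```python
import math

# |e^{tB} x| = e^{t Re(lambda)} |e^{tN} x| for B = lambda*I + N.
# With Re(lambda) = -1 and the 3x3 nilpotent block N, e^{tN} x has
# polynomial components, so the product tends to 0 as t -> +oo.
def norm_exp_tB_x(t, x, re_lambda=-1.0):
    y = (x[0] + t * x[1] + (t * t / 2.0) * x[2],   # e^{tN} x, computed exactly
         x[1] + t * x[2],
         x[2])
    return math.exp(re_lambda * t) * math.sqrt(sum(v * v for v in y))

for t in (0.0, 10.0, 50.0):
    print(t, norm_exp_tB_x(t, (1.0, 1.0, 1.0)))  # tends to 0
```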

Note: the fact that $e^{Bt} = e^{\lambda I_m t}e^{N_m t}$ follows from $[\lambda I_m, N_m] = 0$ is very well-known; for details, see my answer to this question.

Of course, if $G$ has eigenvalues $\lambda$ with $\Re(\lambda) > 0$, then there will exist $x$ with $\vert e^{Gt} x \vert \to \infty$ as $t \to \infty$; this may be demonstrated by arguments similar to the above. But the assertion in the question is not valid as stated.

End: Original Answer.

Hope this helps. Cheers,

and as always,

Fiat Lux!!!

Robert Lewis
  • 71,180