
I want to calculate the norm of the matrix $$A = \left(\begin{array}{cc} 1&1 \\ 0&1\end{array}\right).$$ The norm is $$\Vert A \Vert_2 = \sup_{\Vert v \Vert = 1}\Vert Av \Vert.$$ I can show that $\Vert A \Vert_2$ is the largest singular value of $A$, so the norm is easy to find that way. But I would like to know how to calculate it explicitly, using only the definition of $\Vert A \Vert_2$.

If all the eigenvectors are genuine, then we can take an orthonormal basis of eigenvectors (by Gram–Schmidt) and write $$\Vert Av \Vert = \Vert A(c_1e_1 + c_2e_2) \Vert = \Vert c_1Ae_1 + c_2Ae_2\Vert = \Vert c_1\lambda_1e_1 + c_2\lambda_2e_2\Vert.$$ But $\Vert v \Vert = 1$ implies $$1 = \Vert c_1e_1 + c_2e_2 \Vert^2 = c_1^2\Vert e_1 \Vert^2 + c_2^2\Vert e_2 \Vert^2 = c_1^2 + c_2^2,$$ since $e_1$ and $e_2$ are orthogonal.

If we have only one genuine eigenvector (which is the case here), I think we can use similar arguments with some power $A^n$.

Anyway, I could not go any further. I appreciate any help!

Edit. By the comments below, my approach seems not to work.

Greg
    "If all the eigenvectors are genuine, so we can take an orthonormal basis (by Gram-Schmidt) of eigenvectors" Careful! Even if a matrix is not defective, it may have eigenvectors that are not orthogonal, and if you orthogonalize then they may not remain eigenvectors. –  Nov 06 '19 at 13:15
    For a $2 \times 2$ matrix $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$ you can maximize $(av_1 + bv_2)^2 + (cv_1 + dv_2)^2$ under the constraint $v_1^2 + v_2^2 = 1$ using the Lagrange multiplier method. It is sort of tedious, though. – Klaus Nov 06 '19 at 13:18
  • @Rahul thanks for pointing out the error! – Greg Nov 06 '19 at 13:20

2 Answers


A possible way:

We parametrize the set of unit vectors of $\mathbb{R}^2$ (for the Euclidean norm $\|\cdot\|$) by $t \mapsto (\cos t, \sin t)$, $t \in [0,2\pi]$.

Hence:

$$\|A\|_2^2 = \max_{t \in [0,2\pi]} (\cos t + \sin t)^2+\sin^2 t = \max_{t \in [0,2\pi]} \left(1 + \sin 2t+\sin^2 t\right).$$

The problem is then reduced to finding the maximum of a function of one variable over $[0,2\pi]$. It requires some computation in this case, though: writing $\sin^2 t = \frac{1-\cos 2t}{2}$ gives $\frac{3}{2} + \sin 2t - \frac{1}{2}\cos 2t$, whose maximum is $\frac{3}{2} + \frac{\sqrt{5}}{2} = \frac{3+\sqrt{5}}{2}$.
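This maximum is easy to check numerically; a minimal sketch (assuming NumPy is available) that evaluates $\|Av\|^2$ over a fine grid of angles:

```python
import numpy as np

# ||Av||^2 for v = (cos t, sin t) is (cos t + sin t)^2 + sin^2 t
t = np.linspace(0, 2 * np.pi, 1_000_001)
f = (np.cos(t) + np.sin(t)) ** 2 + np.sin(t) ** 2

norm_sq = f.max()
print(norm_sq)                  # ≈ 2.6180339887, i.e. (3 + sqrt(5)) / 2
print((3 + np.sqrt(5)) / 2)
```

The grid maximum agrees with the closed-form value $(3+\sqrt{5})/2$ to many decimal places.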

nicomezi

It is known that $$\Vert A^{*}A \Vert= \Vert A \Vert^{2}.$$ We have $$ A^{*}A = \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix},$$ and since $A^{*}A$ is self-adjoint, $\Vert A^{*}A \Vert$ equals the modulus of its largest eigenvalue, which is $\dfrac{3+\sqrt{5}}{2}$. Hence $\Vert A \Vert_2 = \sqrt{\dfrac{3+\sqrt{5}}{2}}$.
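This eigenvalue computation can be verified numerically; a short NumPy sketch (not part of the original answer) that also cross-checks against the singular-value characterization from the question:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Largest eigenvalue of A* A; eigvalsh returns eigenvalues in ascending order
AtA = A.T @ A                              # equals [[1, 1], [1, 2]]
lam_max = np.linalg.eigvalsh(AtA)[-1]
print(lam_max)                             # ≈ 2.6180339887, i.e. (3 + sqrt(5)) / 2

# Cross-check: ||A||_2 is the largest singular value, so ||A||_2^2 = lam_max
print(np.linalg.norm(A, 2) ** 2)
```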

A. Bag
  • In general, if $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$, then $$\Vert A\Vert^{2}=\dfrac{1}{2}\left(\alpha +\sqrt{\alpha^{2}-4\delta}\right)$$ where $\alpha=\vert a\vert^{2}+\vert b\vert^{2}+\vert c \vert^{2}+\vert d\vert^{2}$ and $\delta =\det(A^{*}A)$ – A. Bag Apr 29 '21 at 01:38
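The closed-form $2 \times 2$ formula in the comment above (with $\alpha = \operatorname{tr}(A^{*}A)$ and $\delta = \det(A^{*}A) = \vert\det A\vert^{2}$) is easy to sanity-check; a small sketch, where `spectral_norm_sq_2x2` is a name introduced here for illustration:

```python
def spectral_norm_sq_2x2(a, b, c, d):
    """||A||_2^2 for A = [[a, b], [c, d]] via the quadratic formula
    for the eigenvalues of A* A."""
    alpha = abs(a) ** 2 + abs(b) ** 2 + abs(c) ** 2 + abs(d) ** 2  # tr(A* A)
    delta = abs(a * d - b * c) ** 2                                # det(A* A) = |det A|^2
    return (alpha + (alpha ** 2 - 4 * delta) ** 0.5) / 2

# For the question's matrix: alpha = 3, delta = 1, giving (3 + sqrt(5)) / 2
print(spectral_norm_sq_2x2(1, 1, 0, 1))   # ≈ 2.6180339887
```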