
Consider the matrix $A \in \mathbb{R}^{n \times n}$ of all ones. Because there is only one linearly independent column, there are $n-1$ zero eigenvalues and one nonzero eigenvalue, which is $n$.

So one eigenvector, $u_1$, can be determined by inspection from the definition of an eigenvector: \begin{align*} Au_1 &= nu_1 \\ \therefore\qquad u_1 &= 1_n \end{align*} (each row of $A$ sums the entries of its argument, and the entries of $1_n$ sum to $n$).
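As a quick numerical sanity check, here is a minimal numpy sketch of these two claims, with the dimension $n = 6$ chosen arbitrarily:

```python
import numpy as np

n = 6                              # arbitrary choice of dimension
A = np.ones((n, n))                # the all-ones matrix

# Symmetric matrix, so eigvalsh is appropriate; eigenvalues come back sorted.
eigvals = np.linalg.eigvalsh(A)
assert np.allclose(eigvals, [0] * (n - 1) + [n])

# The all-ones vector is an eigenvector with eigenvalue n.
u1 = np.ones(n)
assert np.allclose(A @ u1, n * u1)
```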

Since $A$ is symmetric, eigenvectors corresponding to distinct eigenvalues are orthogonal. So all $n-1$ eigenvectors corresponding to the eigenvalue zero are orthogonal to $1_n$ (equivalently, they must all have zero mean).

My question is: how do we succinctly represent these $n-1$ orthogonal vectors? I know that they are linearly independent; I just don't know how to write them down properly.

I think we can pick

\begin{align*} u_2 &= \begin{bmatrix}1 & -1 & 0 & \dots & 0\end{bmatrix}^T\\ u_3 &= \begin{bmatrix}1 & 0 & -1 & 0 & \dots & 0\end{bmatrix}^T\\ &\ \ \vdots\\ u_n &= \begin{bmatrix}1 & 0 & \dots & 0 & -1\end{bmatrix}^T \end{align*}

All of these are linearly independent; I just don't know how to represent them in a formal way.
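For what it's worth, here is a small numpy check (again with $n = 6$ chosen arbitrarily) that these vectors are null vectors of $A$ and are linearly independent:

```python
import numpy as np

n = 6                              # arbitrary choice of dimension
A = np.ones((n, n))

# Columns are u_2, ..., u_n, i.e. e_1 - e_j for j = 2, ..., n.
U = np.zeros((n, n - 1))
U[0, :] = 1
U[np.arange(1, n), np.arange(n - 1)] = -1

# Every column is a null vector of A (eigenvalue 0) ...
assert np.allclose(A @ U, 0)
# ... and the columns are linearly independent.
assert np.linalg.matrix_rank(U) == n - 1
```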


2 Answers


First, the case where $n$ is odd, with $n = 2k + 1$: Let $\theta = 2 \pi /n$. Taking the real and imaginary parts of the columns of the DFT matrix gives us the following nice orthogonal basis for $u_1^\perp$: \begin{align*} c_1 &= [1\ \ \cos \theta \ \ \cdots \ \ \cos ((n-1)\theta)]\\ c_2 &= [1\ \ \cos (2\theta) \ \ \cdots \ \ \cos (2(n-1)\theta)]\\ &\ \ \vdots \\ c_{k} &= [1\ \ \cos (k\theta) \ \ \cdots \ \ \cos (k(n-1)\theta)]\\ s_1 &= [0\ \ \sin \theta \ \ \cdots \ \ \sin ((n-1)\theta)]\\ s_2 &= [0\ \ \sin (2\theta) \ \ \cdots \ \ \sin (2(n-1)\theta)]\\ &\ \ \vdots \\ s_{k} &= [0\ \ \sin (k\theta) \ \ \cdots \ \ \sin (k(n-1)\theta)].\\ \end{align*} (Each $s_j$ begins with $\sin 0 = 0$.) In the case that $n = 2k$ is even, we do essentially the same thing with $c_1, \dots, c_{k-1}$ and $s_1, \dots, s_{k-1}$, but also include the alternating vector $[-1,1,-1,\dots,1].$
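To make this concrete, here is a small numpy check of the odd case (with $n = 7$, i.e. $k = 3$, chosen arbitrarily) that these vectors are orthogonal to $1_n$ and mutually orthogonal:

```python
import numpy as np

n = 7                              # odd case, n = 2k + 1 with k = 3
k = (n - 1) // 2
theta = 2 * np.pi / n
t = np.arange(n)

# Rows are c_1, ..., c_k, s_1, ..., s_k from the answer above.
C = np.array([np.cos(j * theta * t) for j in range(1, k + 1)])
S = np.array([np.sin(j * theta * t) for j in range(1, k + 1)])
B = np.vstack([C, S])              # (n - 1) x n

# Every basis vector is orthogonal to the all-ones vector ...
assert np.allclose(B @ np.ones(n), 0)
# ... and they are mutually orthogonal: the Gram matrix is (n/2) * I.
assert np.allclose(B @ B.T, (n / 2) * np.eye(n - 1))
```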

Ben Grossmann
  • Ah interesting. Is there some implicit requirement that the eigenvectors should be represented as orthogonal / orthonormal basis? What is the motivation in doing that? – roulette01 Aug 13 '20 at 16:53
  • I thought of it as an alternative to the very simple Hadamard basis, but alas that is only possible for rather specific values of the dimension $n$. – Jean Marie Aug 13 '20 at 16:55
  • Well, doing this is always possible as a consequence of the spectral theorem. The end result of finding this orthonormal eigenbasis is that we end up with an "orthogonal diagonalization" $A = UDU^T$, where $U$ is the orthogonal matrix whose columns are the orthonormal eigenbasis. Just as diagonalizing $A$ allows us to easily find $p(A)$ for polynomials and analytic functions $p$, an orthogonal diagonalization allows us to do the same thing with functions of $A$ and $A^T$. – Ben Grossmann Aug 13 '20 at 16:56
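As an illustration of the comment above, here is a minimal numpy sketch of the orthogonal diagonalization $A = UDU^T$ and of evaluating a polynomial of $A$ through it (again $n = 6$ is an arbitrary choice):

```python
import numpy as np

n = 6                              # arbitrary choice of dimension
A = np.ones((n, n))

# Spectral theorem: A = U D U^T with U orthogonal.
eigvals, U = np.linalg.eigh(A)
D = np.diag(eigvals)
assert np.allclose(A, U @ D @ U.T)

# Functional calculus: p(A) = U p(D) U^T, e.g. p(x) = x^3 - 2x.
pA = U @ np.diag(eigvals**3 - 2 * eigvals) @ U.T
assert np.allclose(pA, A @ A @ A - 2 * A)
```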

The other eigenvectors are found from $$Ax = 0$$

Solving this equation, we see that the solutions form a hyperplane through the origin:

$$x_1 + x_2 + \cdots + x_n = 0$$

Here is one example of a set of eigenvectors:

$$\left\{ \pmatrix{\phantom{-}1\\-1\\\phantom{-}0 \\\phantom{-}0\\ \phantom{-}\vdots\\ \phantom{-}0 \\ \phantom{-}0\\\phantom{-}0}, \pmatrix{\phantom{-}0 \\ \phantom{-}1\\-1\\\phantom{-}0 \\ \phantom{-}\vdots\\ \phantom{-}0 \\ \phantom{-}0\\\phantom{-}0}, \pmatrix{\phantom{-}0 \\\phantom{-}0 \\ \phantom{-}1\\-1\\ \phantom{-}\vdots\\ \phantom{-}0 \\\phantom{-}0\\ \phantom{-}0}, \cdots, \pmatrix{\phantom{-}0 \\\phantom{-}0 \\ \phantom{-}0\\\phantom{-}0\\ \phantom{-}\vdots\\ \phantom{-}1 \\- 1\\\phantom{-}0}, \pmatrix{\phantom{-}0 \\ \phantom{-}0 \\ \phantom{-}0\\ \phantom{-}0\\ \phantom{-}\vdots\\ \phantom{-}0\\ \phantom{-}1 \\ -1} \right\}$$

It is not difficult to find an orthonormal set; for example, we can apply the Gram-Schmidt process.
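A minimal numpy sketch of that last step, taking the difference vectors above as columns and carrying out Gram-Schmidt via a thin QR factorization ($n = 6$ is an arbitrary choice):

```python
import numpy as np

n = 6                              # arbitrary choice of dimension
A = np.ones((n, n))

# The difference vectors e_j - e_{j+1} from the set above, as columns.
V = np.zeros((n, n - 1))
V[np.arange(n - 1), np.arange(n - 1)] = 1
V[np.arange(1, n), np.arange(n - 1)] = -1

# Gram-Schmidt, carried out numerically via the thin QR factorization:
# the columns of Q are an orthonormal basis of the zero-sum hyperplane.
Q, _ = np.linalg.qr(V)
assert np.allclose(Q.T @ Q, np.eye(n - 1))   # orthonormal
assert np.allclose(A @ Q, 0)                 # still eigenvectors for 0
```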

mjw
  • Yes, but they do not yet constitute an orthonormal basis... – Jean Marie Aug 13 '20 at 16:50
  • @JeanMarie Do they need to constitute an orthonormal basis? – roulette01 Aug 13 '20 at 16:52
  • It looks to be one of the demands of the text (not in the final sentences). – Jean Marie Aug 13 '20 at 16:53
  • Okay, I missed that in the problem statement. Nice point! ... One way is to take these vectors and apply the Gram-Schmidt process. Actually, I will think about it some more. Probably not difficult for us to come up with something. $(1,-1,0,0,...,0)^T$, $(1,-1,1,-1,0,...,0)^T$, etc. – mjw Aug 13 '20 at 16:54
  • Even the questioner didn't realize we are asked for an orthonormal set of vectors! – mjw Aug 13 '20 at 16:55