
Suppose that $v_1,\ldots,v_m\in\mathbb{R}^n$ satisfy $$\sum_{i=1}^m v_iv_i^T=I_n$$ and $$\|v_i\|^2=\frac{n}{m}$$ for each $i$.

Do such vectors $\{v_i\}_{1\leq i\leq m}$ always exist? If not, what is the condition for their existence?

Note that $[v_1,\ldots,v_m]$ can be viewed as a submatrix formed by $n$ rows of an orthogonal $m\times m$ matrix, all of whose columns have the same length.

Nate

2 Answers


We have $n = \mathrm{rank}\left(I_n\right) = \mathrm{rank}\left(\sum_{i=1}^m v_iv_i^T\right)\leq m,$ since a sum of $m$ rank-one matrices has rank at most $m.$ Therefore, $n\leq m$ is a necessary condition.

We now show that the vectors $v_1,\ldots,v_m$ can be constructed for all $m$ and all $n\leq m.$ If $n$ divides $m,$ there is a trivial construction: repeat each of the vectors $\sqrt{\frac{n}{m}}\,e_1,\ldots,\sqrt{\frac{n}{m}}\,e_n$ exactly $\frac{m}{n}$ times. So let us assume that $m$ is not a multiple of $n.$
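As a quick numerical sanity check, the trivial divisible case can be sketched in Python (the function name is mine, not from the answer):

```python
import numpy as np

def tight_frame_divisible(n, m):
    """When n divides m: repeat each scaled basis vector
    sqrt(n/m) * e_j exactly m/n times, as columns of an n x m matrix."""
    assert m % n == 0
    lam = np.sqrt(n / m)
    return np.hstack([lam * np.eye(n)] * (m // n))

V = tight_frame_divisible(3, 12)
assert np.allclose(V @ V.T, np.eye(3))          # sum of v_i v_i^T = I_n
assert np.allclose((V**2).sum(axis=0), 3 / 12)  # each ||v_i||^2 = n/m
```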

We use induction on $m$ to show that the $v_i$ can be constructed. The base case $m=1$ (which forces $n=1$ and $v_1=\pm 1$) is trivial. For the inductive step, assume that a construction exists for all $m' < m$ and all $n' \leq m'.$ From this, we construct $V=[v_1,\ldots,v_m]$ as follows:

Let $a=\left\lfloor\frac{m-n}{n}\right\rfloor$ and $b=\left\lceil\frac{m-n}{n}\right\rceil.$ Then $b-a=1,$ because $n$ does not divide $m.$ Let $c=bn-(m-n)$ and $d=(m-n)-an.$ Then $c+d=n$ and $ac+bd=m-n.$

Let $\lambda = \sqrt{\frac{n}{m}}.$ Let $e_1,\ldots,e_n$ be the canonical basis of $\mathbb{R}^n.$

Now we set $v_i = \lambda\,e_1$ for the first $a$ columns of $V,$ $v_i = \lambda\,e_2$ for the next $a$ columns, and so on until we have $c$ groups of $a$ identical vectors. Then we set $v_i = \lambda\,e_{c+1}$ for the following $b$ columns of $V,$ $v_i = \lambda\,e_{c+2}$ for the next $b$ columns, and so on until we have $d$ more groups of $b$ identical vectors. At this point, we have filled $V$ except for the last $n$ columns.

Example: $m=11,\;n=4$. Then $a=1,\,b=2,\,c=1,\,d=3$ and $$ V=\begin{pmatrix} \lambda & & & & & & & \vdots & \vdots & \vdots & \vdots \\ & \lambda & \lambda & & & & & v_8 & v_9 & v_{10} & v_{11} \\ & & & \lambda & \lambda & & & \vdots & \vdots & \vdots & \vdots \\ & & & & & \lambda & \lambda & \vdots & \vdots & \vdots & \vdots \end{pmatrix} $$

By the induction assumption (applied with $m'=n$ and $n'=c$), we can construct a $c\times n$ matrix $V_1$ with $V_1V_1^T=I_c$ such that all columns of $V_1$ have length $\sqrt{\frac{c}{n}}.$ We extend the rows of $V_1$ to an orthogonal $n\times n$ matrix, and let $V_2$ be the $d\times n$ matrix formed by the added rows. Then $V_2V_2^T=I_d$ and all columns of $V_2$ have length $\sqrt{\frac{d}{n}}.$

Now we set $\mu_1 = \sqrt{\frac{n+d}{m}}$ and $\mu_2 = \sqrt{\frac{n-c}{m}}$ and $$ \left[v_{m-n+1},\ldots,v_m\right] =\begin{pmatrix} & \mu_1 V_1 & \\ & \mu_2 V_2 & \end{pmatrix} $$ It is now straightforward to verify that $V$ has the required properties: each of the last $n$ columns has squared length $\mu_1^2\frac{c}{n}+\mu_2^2\frac{d}{n}=\frac{(n+d)c+(n-c)d}{mn}=\frac{n}{m},$ the diagonal entries of $VV^T$ are $a\lambda^2+\mu_1^2=\frac{an+n+d}{m}=1$ for the first $c$ rows and $b\lambda^2+\mu_2^2=\frac{bn+n-c}{m}=1$ for the remaining $d$ rows, and the off-diagonal entries vanish because $V_1V_2^T=0.$
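The whole recursion can be turned into a short numerical sketch. Everything below is my own rendering of the construction (the function name is not from the answer), with numpy's complete QR supplying the orthogonal extension of $V_1$:

```python
import numpy as np

def parseval_equal_norm(n, m):
    """n x m matrix V with V V^T = I_n and every column of squared
    norm n/m, following the recursive construction above."""
    lam = np.sqrt(n / m)
    if m % n == 0:                              # trivial case: n divides m
        return np.hstack([lam * np.eye(n)] * (m // n))
    a = (m - n) // n                            # floor((m-n)/n)
    b = a + 1                                   # ceil, since n does not divide m
    c = b * n - (m - n)
    d = (m - n) - a * n
    I = np.eye(n)
    # first m-n columns: c groups of a identical vectors, then d groups of b
    cols = [lam * I[:, j] for j in range(c) for _ in range(a)]
    cols += [lam * I[:, c + j] for j in range(d) for _ in range(b)]
    # last n columns: scaled V_1 (by induction) stacked on its extension V_2
    V1 = parseval_equal_norm(c, n)              # c x n, V1 V1^T = I_c
    Q, _ = np.linalg.qr(V1.T, mode='complete')  # extend rows of V1 to a basis
    V2 = Q[:, c:].T                             # d x n, V2 V2^T = I_d
    mu1 = np.sqrt((n + d) / m)
    mu2 = np.sqrt((n - c) / m)
    last = np.vstack([mu1 * V1, mu2 * V2])      # n x n block
    return np.hstack([np.column_stack(cols), last])

V = parseval_equal_norm(4, 11)                  # the worked example above
assert np.allclose(V @ V.T, np.eye(4))          # sum of v_i v_i^T = I_n
assert np.allclose((V**2).sum(axis=0), 4 / 11)  # each ||v_i||^2 = n/m
```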

Reinhard Meier

As noticed by Reinhard, a necessary and sufficient condition is $m\geq n$. Below is a simple existential proof of this fact.

Consider $B_{m\times m}=\begin{pmatrix}(1-\frac{n}{m})Id_{n\times n} & 0_{n\times (m-n)}\\ 0_{(m-n)\times n} & -\frac{n}{m}Id_{(m-n)\times (m-n)}\end{pmatrix}$.

Notice that $B$ is a traceless real symmetric matrix. Hence there is an orthogonal matrix $U_{m\times m}$ such that $UBU^t=C$ and $C_{ii}=0$ for $i=1,\ldots,m$.

Define $D=\frac{n}{m}Id_{m\times m}+C$.

Notice that $Spec(D)= \stackrel{n}{\overbrace{1,\ldots,1}},\stackrel{m-n}{\overbrace{0,\ldots,0}}$ and $D_{ii}=\frac{n}{m}$ for $i=1,\ldots,m$.

Since $D_{m\times m}$ is a positive semidefinite symmetric matrix of rank $n$, there exists $R_{m\times n}$ such that $D=RR^t$.

Next, $R^tR=Id_{n\times n}$, since $Spec(R^tR)=\stackrel{n}{\overbrace{1,\ldots,1}}$ and $R^tR$ is a real symmetric matrix of order $n$.

Let $R^t=(v_1,\ldots,v_m)$, where $v_i$ is column $i$ of $R^t$.

Then $Id_{n\times n}=R^tR=\sum_{i=1}^mv_iv_i^t$.

Finally, $v_i^tv_i=D_{ii}=\frac{n}{m}$, for $i=1,\ldots,m$. $\square$
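The chain $B \to C \to D \to R$ can be traced numerically. The sketch below is mine (both function names are assumptions, not from the answer); it realizes the zero-diagonal step with the recursion proved in the Edit below, completing $v$ to an orthogonal matrix via QR of randomly extended columns, and then factors $D=RR^t$ from its eigendecomposition:

```python
import numpy as np

rng = np.random.default_rng(0)

def zero_diag_conjugator(S):
    """Orthogonal U with diag(U^t S U) = 0 for traceless symmetric S
    (the lemma from the Edit below, applied recursively)."""
    m = S.shape[0]
    if m == 1:
        return np.eye(1)            # a 1x1 traceless matrix is already zero
    w, E = np.linalg.eigh(S)        # orthonormal eigenvectors as columns
    v = E.sum(axis=1) / np.sqrt(m)  # unit vector with v^t S v = tr(S)/m = 0
    U, _ = np.linalg.qr(np.column_stack([v, rng.standard_normal((m, m - 1))]))
    T = U.T @ S @ U                 # (1,1) entry is 0, trace still 0
    W = np.eye(m)
    W[1:, 1:] = zero_diag_conjugator(T[1:, 1:])
    return U @ W

def equal_norm_parseval(n, m):
    """B -> C (zero diagonal) -> D = (n/m) I + C -> D = R R^t,
    exactly the steps of the proof above."""
    B = np.diag([1 - n / m] * n + [-n / m] * (m - n))
    U = zero_diag_conjugator(B)
    C = U.T @ B @ U                          # zero diagonal, same spectrum as B
    D = n / m * np.eye(m) + C                # Spec(D) = {1 (n times), 0}, D_ii = n/m
    w, E = np.linalg.eigh(D)
    R = E[:, w > 0.5] * np.sqrt(w[w > 0.5])  # m x n factor with D = R R^t
    return R.T                               # the columns are the v_i

V = equal_norm_parseval(3, 7)
assert np.allclose(V @ V.T, np.eye(3))         # R^t R = Id, i.e. sum v_i v_i^t = I_n
assert np.allclose((V**2).sum(axis=0), 3 / 7)  # v_i^t v_i = D_ii = n/m
```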


Edit: Let us prove that every traceless real symmetric matrix $S_{m\times m}$ is orthogonally equivalent to a symmetric matrix with zeros in the diagonal.

Let $v_1,\ldots,v_m$ be an orthonormal basis of eigenvectors of $S$. Define $v=\frac{v_1+\ldots+v_m}{\sqrt{m}}$.

Notice that $v^tSv=\frac{tr(S)}{m}=0,$ since the cross terms $v_i^tSv_j$ with $i\neq j$ vanish and the remaining terms sum to $\frac{1}{m}\sum_i\lambda_i.$ Let $U_{m\times m}$ be any orthogonal matrix such that the first column of $U$ is $v$. So $(U^tSU)_{11}=v^tSv=0.$

Now, since $(U^tSU)_{11}=0$ and $U^tSU$ is traceless, the submatrix $A_{m-1\times m-1}$ occupying the last $m-1$ rows and columns of $U^tSU$ is also a traceless real symmetric matrix.

By induction on $m$, there is an orthogonal matrix $V_{m-1\times m-1}$ such that $V^tAV$ is symmetric with zeros in the diagonal.

So $W=\begin{pmatrix}1 & 0_{1\times m-1}\\ 0_{m-1\times 1} & V_{m-1\times m-1}\end{pmatrix}$ is orthogonal and $W^tU^tSUW$ is symmetric with zeros in the diagonal.

Finally, $UW$ is orthogonal too.

Daniel
  • Nice answer. But I think the foundation of this proof (each square traceless matrix is unitarily similar to a zero-diagonal matrix) is not necessarily a well-known fact. Do you have a reference for that? I have only found this on stackexchange. – Reinhard Meier Mar 23 '21 at 10:43
  • @ReinhardMeier Hi Reinhard. I edited my answer, according to your question. Regards. – Daniel Mar 23 '21 at 16:34