$$
A = \left[\begin{array}{c}
a_1^T\\
a_2^T\\
\vdots \\
a_n^T
\end{array}\right]
$$
$$
B = \left[\begin{array}{cccc}
b_1&
b_2&
\cdots &
b_n
\end{array}\right]
$$
$$
AB = \left[\begin{array}{ccc}
a_1^Tb_1 & a_1^Tb_2 & \cdots \\
a_2^Tb_1 & a_2^Tb_2 & \cdots \\
\vdots & \vdots & \ddots \\
a_n^Tb_1 & a_n^Tb_2 & \cdots
\end{array}\right]\qquad(1)
$$
So for your $V^TV$, take $A = V^T$ and $B = V$ (so that $a_i = b_i = v_i$): the diagonal entries $a_i^Tb_i = v_i^Tv_i$ are ones and the remaining entries $v_i^Tv_j$ ($i \neq j$) are zeros, so the product is the identity matrix.
Thus $AB = I$, and taking transposes, $B^TA^T = I$.
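A quick numerical sanity check of this (just a sketch in NumPy; generating $V$ via a QR factorization and the variable names are my own choices, not from the question):

```python
import numpy as np

# Columns of V are orthonormal (QR of a random square matrix gives one such V).
rng = np.random.default_rng(0)
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Entry (i, j) of V^T V is v_i . v_j: ones on the diagonal, zeros elsewhere.
print(np.allclose(V.T @ V, np.eye(4)))  # True
```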
Now consider,
$$A^TB^T = \left[\begin{array}{cccc}
a_1&
a_2&
\cdots &
a_n
\end{array}\right]
\left[\begin{array}{c}
b_1^T\\
b_2^T\\
\vdots \\
b_n^T
\end{array}\right]
$$
$$
A^TB^T = a_1b_1^T + a_2b_2^T + \cdots + a_nb_n^T \qquad(2)
$$
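To see the column-times-row expansion in (2) concretely, here is a small NumPy sketch (my own illustration with made-up matrices): the product equals the sum of the outer products of the columns of the first factor with the rows of the second.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))  # rows of A are the a_i^T
B = rng.standard_normal((3, 3))  # columns of B are the b_i

# A^T B^T = sum_i a_i b_i^T  (columns of A^T times rows of B^T), as in (2).
outer_sum = sum(np.outer(A[i, :], B[:, i]) for i in range(3))
print(np.allclose(A.T @ B.T, outer_sum))  # True
```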
EDIT:
So the sum $\sum_{i=1}^nv_iv_i^T$ is exactly the expression in (2) with $a_i = b_i = v_i$, i.e. it is $A^TB^T = VV^T$. So if we can show that for a square matrix a left inverse is also a right inverse, i.e. that $B^TA^T = I$ implies $A^TB^T = I$, our work is done.
For this proof, check out the question "If $AB = I$ then $BA = I$".
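Numerically, the square case behaves exactly as claimed (again only a sketch): when $V^TV = I$ for a square $V$, the product in the other order is also the identity.

```python
import numpy as np

rng = np.random.default_rng(2)
V, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # square V with orthonormal columns

# V^T is a left inverse of V; for a square matrix it is then also a right inverse.
print(np.allclose(V.T @ V, np.eye(5)))  # True  (left inverse)
print(np.allclose(V @ V.T, np.eye(5)))  # True  (right inverse, i.e. sum_i v_i v_i^T = I)
```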
Now I know that this is not what you asked: you wanted to understand it in terms of the orthogonality of the eigenvectors. For this I tried to compute $\sum_{i=1}^nv_iv_i^T$ for different matrices and found that when the columns are orthonormal, the rows are also orthonormal. I checked many sources for a proof, but every proof simply uses $Q^TQ = QQ^T = I$, which is what you wanted proved in the first place. Arithmetically everything somehow falls into place to give the identity, but I was unable to formulate why, even after trying many examples.
So I think my explanation above is arithmetically sufficient as a proof, but in order to use the orthonormality of the eigenvectors (i.e. of the columns of $V$), I picture it in the following way:
We know that $v_iv_i^T$ is a projection matrix: it projects onto its own column space, which, since it is a rank-1 matrix, is just the line spanned by the vector $v_i$ itself.
Now $v_1v_1^T$ projects an $n$-dimensional vector $x$ onto $v_1$, $v_2v_2^T$ projects it onto $v_2$, and so on. Because the $v_i$ form an orthonormal basis, adding all $n$ of these projections gives back the original vector $x$. So, in a way, the sum of all these projection matrices projects every vector onto itself, and hence it must be the identity matrix.
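Here is a small sketch of that picture (the orthonormal basis is again generated via QR and the names are my own): projecting an arbitrary vector onto each basis direction and adding the pieces gives back the vector.

```python
import numpy as np

rng = np.random.default_rng(3)
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # columns v_i form an orthonormal basis
x = rng.standard_normal(4)

# Apply each rank-1 projector v_i v_i^T to x, then add all n projections.
pieces = [np.outer(V[:, i], V[:, i]) @ x for i in range(4)]
print(np.allclose(sum(pieces), x))  # True: the projections reassemble x
```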
EDIT2:
Extending this visualization of projections to equations, based on the link provided by Amrit in the comments:
The $n$-dimensional vector $x$ can be written as the sum of all these projections; writing the length of the projection along $v_i$ as $c_i$, we get:
$$
x = c_1v_1+c_2v_2+\cdots+c_nv_n
$$
To find each $c_i$, take the dot product of $x$ with $v_i$ and use orthonormality ($v_j\cdot v_i = 0$ for $j \neq i$ and $v_i\cdot v_i = 1$):
$$
x\cdot v_i = (c_1v_1)\cdot v_i+\cdots + (c_iv_i)\cdot v_i + \cdots + (c_nv_n)\cdot v_i
$$
$$
x\cdot v_i = 0+\cdots +c_i+\cdots +0
$$
So rewriting $x$ in terms of these calculated coefficients,
$$
x = (x\cdot v_1)v_1+\cdots +(x\cdot v_n)v_n
$$
$$
x = \sum_i(v_i^Tx)v_i = \sum_i v_i(v_i^Tx) = \Bigl(\sum_iv_iv_i^T\Bigr)x
$$
Since this holds for every $x$,
$$
\Bigl(\sum_iv_iv_i^T-I_n\Bigr)x=0~\forall x \implies \sum_i v_iv_i^T=I_n
$$
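For completeness, the same conclusion checked numerically (a sketch only, with the orthonormal $v_i$ generated via QR):

```python
import numpy as np

rng = np.random.default_rng(4)
V, _ = np.linalg.qr(rng.standard_normal((6, 6)))

# Sum of the rank-1 projectors v_i v_i^T over an orthonormal basis of R^n.
P = sum(np.outer(V[:, i], V[:, i]) for i in range(6))
print(np.allclose(P, np.eye(6)))  # True: the sum is the n x n identity
```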