If $n=1$, there's nothing to prove, since every $1\times 1$ matrix $(a)$ equals $a I$, a scalar multiple of the identity.
If $n\ge 2$, take two linearly independent vectors $v_1,v_2$ with corresponding eigenvalues $\lambda_1$, $\lambda_2$. Since $v_1$ and $v_2$ are linearly independent, $v_1-v_2 \neq 0$, and since every nonzero vector is an eigenvector, so is $v_1-v_2$, say with eigenvalue $\lambda_3$. Then $A(v_1-v_2) = \lambda_3 (v_1 -v_2) = \lambda_3v_1 - \lambda_3v_2$.
On the other hand, distributing $A$ gives $A(v_1-v_2) = Av_1-Av_2 = \lambda_1v_1 - \lambda_2v_2$, and so $\lambda_1v_1 - \lambda_2v_2 = \lambda_3v_1 - \lambda_3v_2$. Collecting like terms, $(\lambda_1-\lambda_3)v_1 + (\lambda_3-\lambda_2) v_2 = 0$, so, as $v_1$ and $v_2$ are linearly independent, $\lambda_1-\lambda_3 = 0$ and $\lambda_3-\lambda_2 = 0$. Thus $\lambda_1=\lambda_2=\lambda_3$.
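To see what this computation is detecting, here is a quick illustrative check (my own example, not part of the proof) with a matrix that has two distinct eigenvalues: take $A = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$ with $v_1 = e_1$, $v_2 = e_2$, so $\lambda_1 = 1$ and $\lambda_2 = 2$. Then $$A(e_1 - e_2) = \begin{pmatrix} 1 \\ -2 \end{pmatrix}, \quad \text{which is not a scalar multiple of } e_1 - e_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$ So $e_1 - e_2$ fails to be an eigenvector, which is exactly the situation the hypothesis rules out: once every vector is an eigenvector, distinct eigenvalues like this cannot occur.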
Since this works for any two linearly independent vectors, it works in particular for any pair of basis vectors. So letting $\{ v_1,\dots, v_n\}$ be a basis for $\mathbb{R}^n$, the above shows that every basis vector is sent to a scalar multiple of itself, and all of those scalars are the same, say $\lambda$. Thus for any $v \in \mathbb{R}^n$, we can write $v = a_1v_1+\cdots +a_nv_n$ where each $a_i \in \mathbb{R}$, and: $$Av = A\sum_{i=1}^n a_iv_i = \sum_{i=1}^n a_iA v_i = \sum_{i=1}^n a_i\lambda v_i = \lambda\sum_{i=1}^n a_iv_i = \lambda v.$$ Since this holds for every $v \in \mathbb{R}^n$, we have $A = \lambda I$.
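To make the last chain of equalities concrete (a numerical instance with values chosen purely for illustration): suppose $n = 2$, $\lambda = 3$, and $v = 4v_1 + 5v_2$. Then $$Av = 4Av_1 + 5Av_2 = 4(3v_1) + 5(3v_2) = 3(4v_1 + 5v_2) = 3v,$$ exactly as the general computation predicts.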
Notably, we used nothing about $A$ being a matrix, real or otherwise, nor any special property of $\mathbb{R}^n$: only linearity and the eigenvector hypothesis. So this fact actually holds for any linear operator on any vector space, over any field, satisfying this property. In particular it works in infinite dimensions too, since every vector is still a finite linear combination of (Hamel) basis vectors, so the computation above goes through unchanged.
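For instance (an illustrative infinite dimensional example of my own): let $V = \mathbb{R}[x]$, the space of all polynomials, and suppose $T\colon V \to V$ is linear with every nonzero polynomial an eigenvector. Writing $T(1) = \lambda_0 \cdot 1$ and $T(x) = \lambda_1 x$, the polynomial $1 + x$ is also an eigenvector, so $$T(1+x) = \lambda_0 + \lambda_1 x = \mu(1+x)$$ for some $\mu$, and comparing coefficients forces $\lambda_0 = \mu = \lambda_1$, exactly as in the two dimensional argument above.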