There is a well-known fact of group theory that says: $Z(GL_n)=\lbrace\lambda I_n : \lambda\in\mathbb{R},\ \lambda\neq 0\rbrace$, where $Z(GL_n)$ is the center of the group: the matrices that commute with every matrix of $GL_n$. I will sketch the proof:
Using the correspondence between matrices and endomorphisms of a vector space, let $f:\mathbb{R}^n\longrightarrow \mathbb{R}^n$ be the linear map whose matrix (in the standard basis) is $A$, with $A\in Z(GL_n)$.
We now show that $\lbrace x,f(x)\rbrace$ is linearly dependent for every $x$: if it weren't, we could extend it to a basis of $\mathbb{R}^n$, $B=\lbrace v_1=x,\,v_2=f(x),\,v_3,v_4,\dots,v_n\rbrace$, and there is a unique linear map $g$ that swaps the first two basis vectors and fixes the rest, with matrix in the basis $B$:
$$G=\left(
\begin{array}{ccccc}
0 & 1 & 0 & \cdots & 0\\
1 & 0 & 0 & \cdots & 0\\
0 & 0 & 1 & \cdots & 0\\
\vdots & \vdots & \vdots & \ddots & \vdots\\
0 & 0 & 0 & \cdots & 1
\end{array}\right)
$$
Since $g$ is an isomorphism, its matrix is invertible and therefore commutes with $A$, so $fg=gf$. Evaluating at $v_1$: $fg(v_1)=gf(v_1)\Rightarrow f(v_2)=g(v_2)=v_1=x$, which means that $f(f(x))=x.$
Now we consider the linear map $h$ whose matrix in the basis $B$ is:
$$H=\left(
\begin{array}{ccccc}
1 & 0 & 0 & \cdots & 0\\
1 & 1 & 0 & \cdots & 0\\
0 & 0 & 1 & \cdots & 0\\
\vdots & \vdots & \vdots & \ddots & \vdots\\
0 & 0 & 0 & \cdots & 1
\end{array}\right)
$$
This is also an isomorphism, so $AH=HA$, i.e. $fh=hf$:
$$v_2+f(v_2)=f(v_1)+f(v_2)=f(v_1+v_2)=fh(v_1)=hf(v_1)=h(v_2)=v_2\Rightarrow f(v_2)=0$$
We previously had $x=f(f(x))=f(v_2)$, so $x=0$, which is absurd because $x=v_1$ is a basis vector, hence nonzero. Our hypothesis was therefore false, and $x$, $f(x)$ are linearly dependent. This means that $f(x)=\lambda(x)\,x$, where $\lambda(x)$ is a real scalar that, in the most general case, depends on $x$.
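As a quick sanity check of this mechanism (not part of the proof), here is a small symbolic computation for $n=2$, assuming `sympy` is available: imposing commutation with just the two matrices $G$ and $H$ above already forces $A$ to be scalar.

```python
import sympy as sp

a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix([[a, b], [c, d]])
G = sp.Matrix([[0, 1], [1, 0]])  # swaps the two basis vectors
H = sp.Matrix([[1, 0], [1, 1]])  # v1 -> v1 + v2, v2 -> v2

# All entries of AG - GA and AH - HA must vanish.
eqs = list(A*G - G*A) + list(A*H - H*A)
print(sp.solve(eqs, [a, b, c, d], dict=True))
# expected: [{b: 0, c: 0, d: a}], i.e. A = a*I
```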
Now suppose $x$ and $y$ are linearly independent vectors and set $z=x+y$; then:
$$\lambda(z)x+\lambda(z)y=f(z)=f(x)+f(y)=\lambda(x)x+\lambda(y)y$$
This means that
$$(\lambda(z)-\lambda(x))x+(\lambda(z)-\lambda(y))y=0$$
And $x,y$ are linearly independent, so both coefficients vanish:
$$\lambda(z)=\lambda(x)\quad\text{and}\quad\lambda(z)=\lambda(y),$$
hence $\lambda(x)=\lambda(y)$. So on linearly independent vectors, $\lambda$ is not a function of the vector, but a constant.
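To see why independence forces a single $\lambda$, here is a tiny numerical illustration (with `numpy`, as an assumption, not part of the proof): a map with two distinct eigenvalues cannot send $z=x+y$ to a multiple of $z$.

```python
import numpy as np

A = np.diag([1.0, 2.0])  # f(e1) = 1*e1, f(e2) = 2*e2: lambda differs per vector
x, y = np.eye(2)         # e1 and e2, linearly independent
z = x + y
print(A @ z)             # [1. 2.], not a multiple of z = [1. 1.]
```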
If they're dependent (and nonzero), we can do a similar trick: write $y=ax$ for some real $a\neq 0$:
$$\lambda(y)y=f(y)=af(x)=a\lambda(x)x=\lambda(x)y.$$
Since $y\neq 0$, this gives $\lambda(y)=\lambda(x)$, so $\lambda$ is constant here too.
We have proved that $Av=\lambda v$ for every vector $v$ and every $A\in Z(GL_n)$, with $\lambda\neq 0$ because $A$ is invertible, so:
$$A=\lambda\left(
\begin{array}{ccccc}
1 & 0 & 0 & \cdots & 0\\
0 & 1 & 0 & \cdots & 0\\
0 & 0 & 1 & \cdots & 0\\
\vdots & \vdots & \vdots & \ddots & \vdots\\
0 & 0 & 0 & \cdots & 1
\end{array}\right)
$$
The converse is obvious: every matrix of the form $\lambda I$ with $\lambda\neq 0$ belongs to $Z(GL_n)$. We therefore have that a matrix commutes with every invertible matrix if and only if it has the form above, and such matrices have a unique eigenvalue, $\lambda$, which is a root of multiplicity $n$ of the characteristic polynomial.
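Finally, a minimal numerical illustration of both directions (again assuming `numpy`): a scalar matrix commutes with randomly generated, almost surely invertible matrices, while a non-scalar diagonal matrix already fails to commute with an $H$-shaped matrix.

```python
import numpy as np

n, lam = 4, 2.5
A = lam * np.eye(n)  # scalar matrix

rng = np.random.default_rng(0)
for _ in range(100):
    M = rng.standard_normal((n, n))  # almost surely invertible
    assert np.allclose(A @ M, M @ A)

# A non-scalar matrix fails: diag(1, ..., n) vs. an H-shaped matrix.
D = np.diag(np.arange(1.0, n + 1))
H = np.eye(n)
H[1, 0] = 1.0
print(np.allclose(D @ H, H @ D))  # False
```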