Matrices can be viewed as linear maps on vector spaces; in fact, over a nice field such as the real numbers, matrices describe geometric transformations.
For example, the reflection in the line $y=x$ in $\mathbb{R}^2$ is given by multiplication by the matrix:
$\left(\begin{matrix} 0 & 1 \\ 1 & 0 \end{matrix}\right)$
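To see this reflection in action, here is a quick sketch (assuming NumPy is available): multiplying by this matrix swaps the two coordinates, which is exactly reflection in $y=x$.

```python
import numpy as np

# Reflection in the line y = x: multiplication by A swaps the coordinates.
A = np.array([[0, 1],
              [1, 0]])

p = np.array([2, 5])
print(A @ p)  # [5 2] -- the point (2, 5) is reflected to (5, 2)
```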
Now it is clear that geometrically there are certain symmetries here. Any point on the line $y=x$ gets sent to itself, and any point on the line $y=-x$ gets sent to the point directly opposite it through the origin, i.e. to its negative.
This is essentially the information that the eigenvalues and eigenvectors of the above matrix capture. The eigenvectors are the (nonzero) vectors along those two lines, and the eigenvalues are the scalars by which each vector is multiplied when you apply the reflection.
Things on the line $y=x$ got sent to themselves, i.e. $Av = v$, a scalar multiple of $1$.
Things on the line $y=-x$ got sent to the negative of themselves, i.e. $Av = -v$, a scalar multiple of $-1$.
Thus we expect two eigenvalues $\pm 1$ and two "eigenspaces", $V_1, V_{-1}$, consisting of all eigenvectors with eigenvalue $1$ and $-1$ respectively (together with the zero vector).
These spaces are exactly the vectors lying on the lines $y=x$ and $y=-x$ respectively.
Of course, there are ways to work out these things using only the matrix (the eigenvalues are the roots of $\det(A - \lambda I) = 0$), but hopefully you can now see some of their significance. They come in useful in many areas of maths.
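If you want to check the computation numerically, a minimal sketch with NumPy (assuming it is installed) recovers the two eigenvalues $\pm 1$ and eigenvectors along the lines $y = x$ and $y = -x$:

```python
import numpy as np

# The reflection matrix from above.
A = np.array([[0, 1],
              [1, 0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are the corresponding (unit-length) eigenvectors.
vals, vecs = np.linalg.eig(A)
print(vals)  # the eigenvalues 1 and -1 (order may vary)

# Each eigenvector v satisfies A v = lambda v.
for lam, v in zip(vals, vecs.T):
    print(lam, v, A @ v)
```

The eigenvector for eigenvalue $1$ is a unit vector along $y=x$, and the one for $-1$ lies along $y=-x$, matching the geometric picture.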
http://blog.stata.com/2011/03/09/understanding-matrices-intuitively-part-2/
http://mathoverflow.net/questions/31838/intuitions-connections-examples-for-eigen
And there is a really interesting gif on that wiki site, if you are a visual type.
– ante.ceperic Feb 13 '13 at 09:53