Problem Definition
I would like to code an algorithm for decomposing a covariance matrix into its eigensolution (its set of eigenvalues and corresponding eigenvectors). In my specific case I only need to handle $3\times 3$ covariance matrices.
It is my understanding that a symmetric positive semi-definite matrix has only real, non-negative eigenvalues. This means that when I solve the characteristic cubic I can skip the cases of complex roots entirely, and (by the spectral theorem) I can always choose the resulting eigenvectors to be mutually orthogonal.
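For what it's worth, the route I currently have in mind is the standard noniterative (trigonometric) solution of the characteristic cubic, which never hits a complex-root branch for symmetric input. Here is a minimal sketch in Python (the function name and layout are my own; the method itself is the well-known one described in sources like Eberly's write-up):

```python
import math

def sym3x3_eigenvalues(a):
    """Eigenvalues of a symmetric 3x3 matrix `a` (indexable as a[i][j]),
    returned in descending order, via the trigonometric (Vieta) solution
    of the characteristic cubic. For a symmetric matrix the roots are
    always real, so no complex-root branch is needed."""
    p1 = a[0][1]**2 + a[0][2]**2 + a[1][2]**2
    if p1 == 0.0:                      # matrix is already diagonal
        return sorted((a[0][0], a[1][1], a[2][2]), reverse=True)
    q = (a[0][0] + a[1][1] + a[2][2]) / 3.0          # trace / 3
    p2 = (a[0][0]-q)**2 + (a[1][1]-q)**2 + (a[2][2]-q)**2 + 2.0*p1
    p = math.sqrt(p2 / 6.0)
    # b = (a - q*I) / p ; r = det(b)/2 lies in [-1, 1] in exact
    # arithmetic, but floating point can push it slightly outside.
    b = [[(a[i][j] - (q if i == j else 0.0)) / p for j in range(3)]
         for i in range(3)]
    r = (b[0][0]*(b[1][1]*b[2][2] - b[1][2]*b[2][1])
         - b[0][1]*(b[1][0]*b[2][2] - b[1][2]*b[2][0])
         + b[0][2]*(b[1][0]*b[2][1] - b[1][1]*b[2][0])) / 2.0
    r = max(-1.0, min(1.0, r))
    phi = math.acos(r) / 3.0
    eig1 = q + 2.0*p*math.cos(phi)                    # largest root
    eig3 = q + 2.0*p*math.cos(phi + 2.0*math.pi/3.0)  # smallest root
    eig2 = 3.0*q - eig1 - eig3                        # trace identity
    return [eig1, eig2, eig3]
```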
I am trying to understand whether I need to handle the cases where I do not end up with $3$ distinct real roots. I plan to use this algorithm on data sets of points in $3D$ Euclidean space and am unsure whether that setting admits further simplifications in solving for the eigenvalues.
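For completeness, by the covariance matrix of the point set I mean the usual construction: subtract the centroid, then average the outer products of the centered points. A sketch (`covariance_3x3` is my own name; I divide by $n$ rather than $n-1$, which only rescales the eigenvalues and leaves the eigenvectors unchanged):

```python
import numpy as np

def covariance_3x3(points):
    """Covariance of an (n, 3) array of 3D points. The result is
    symmetric positive semi-definite by construction."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    return centered.T @ centered / len(pts)

# Feeding it into the cubic solver sketched above:
pts = [[1.0, 2.0, 0.5], [0.0, 1.0, 1.5],
       [2.0, 0.0, 1.0], [1.0, 1.0, 0.0]]
print(sym3x3_eigenvalues(covariance_3x3(pts)))  # descending, all >= 0
```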
Main Question
What exactly would it mean (geometrically) if I have one, two, or three distinct eigenvalues for a $3D$ point set in Euclidean space? This is where my understanding fails.
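To make the degenerate case concrete: I can construct inputs that I expect to produce repeated eigenvalues, such as an isotropic cloud squashed flat into a plane, but I do not see the general picture. A quick numerical sketch of what I mean (the point set and names are mine; numpy is used only as a check):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# An isotropic cloud confined to the z = 0 plane: the spread is the
# same in every in-plane direction and zero out of plane, so I expect
# eigenvalues near (1, 1, 0) -- only two distinct values.
flat = np.column_stack([rng.normal(size=n),
                        rng.normal(size=n),
                        np.zeros(n)])
print(np.linalg.eigvalsh(np.cov(flat, rowvar=False)))
# -> roughly [0., 1., 1.]
```

Is the right way to read this that a repeated eigenvalue means the points spread equally in every direction of the corresponding eigenplane, and a zero eigenvalue means the points are confined to a plane (or a line)?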
Source of my Confusion
My main reference is the section by Cromwell in Graphics Gems IV, page $193$. There Cromwell describes an optimized method for computing the roots of this particular cubic polynomial, but the text confuses me because it does not seem to address the cases without $3$ distinct roots. The problem statement in Cromwell's section is exactly the same as mine: decomposing the covariance matrix of a $3D$ point set. Other online sources seem to be more general (not specific to $3\times 3$ covariance matrices) and handle all positive-real cases (see Eberly).