I am trying to use least squares to solve a problem of the form
$u = - K v$,
where u and v are vectors of size 3 and K is a $3\times 3$ matrix. I want to estimate K given u and v. I have multiple data points for u and v, collected with the hope that they are sufficiently linearly independent that a unique solution exists.
I have set this up as a standard least-squares problem
$Ax = b$,
where b stacks all the data for the 3 elements of u, A contains the corresponding data for v, and K has been flattened into a vector x = (K11, K12, K13, K21, ..., K33).
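To make the setup concrete, here is a minimal sketch of how I build A and b (the data here is synthetic with a known K, purely for illustration; each sample contributes a $3\times 9$ block to A):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data with a known ground-truth K (for illustration only)
K_true = rng.standard_normal((3, 3))
N = 50
V = rng.standard_normal((N, 3))   # rows are v samples
U = -V @ K_true.T                 # u = -K v for each sample

# One 3x9 block per sample: row j of the block places v in columns
# 3j..3j+2, matching x = K.flatten() (row-major: K11, K12, ..., K33).
# The minus sign from u = -K v is folded into A.
A = np.vstack([-np.kron(np.eye(3), v) for v in V])   # shape (3N, 9)
b = U.ravel()                                        # shape (3N,)

x, *_ = np.linalg.lstsq(A, b, rcond=None)
K_est = x.reshape(3, 3)
```

With noise-free synthetic data this recovers K exactly; my real data is of course noisy.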
For physical reasons I want to add an extra constraint. The constraint is that if I split K into its symmetric and antisymmetric parts,
$K = S + A$
(this A is not the design matrix above), then the eigenvalues of S should be positive, or at least 2 of them should be. S has 3 eigenvalues $(\lambda_1, \lambda_2, \lambda_3)$; in my problem $\lambda_1, \lambda_2 \gg \lambda_3$, and I would like $\lambda_1, \lambda_2 > 0$.
The physical reason is that $\lambda_1, \lambda_2$ are representative of diffusivities, and negative diffusivities are usually unjustifiable on physical grounds.
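For reference, checking whether a candidate K satisfies the constraint is straightforward (a small sketch; `satisfies_constraint` is just a helper name I'm using here):

```python
import numpy as np

def symmetric_eigenvalues(K):
    """Eigenvalues of the symmetric part S = (K + K.T)/2, descending."""
    S = 0.5 * (K + K.T)
    return np.linalg.eigvalsh(S)[::-1]  # eigvalsh returns ascending order

def satisfies_constraint(K):
    """True if the two largest eigenvalues of S are positive."""
    lam = symmetric_eigenvalues(K)
    return lam[0] > 0 and lam[1] > 0
```

What I don't know is how to enforce this condition inside the least-squares fit itself, rather than just checking it after the fact.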
In my current solution, with no constraints applied, the eigenvalues end up positive almost everywhere (>80% of the time). I suspect the negative values occur in regions where the data in u and v (equivalently, in A and b) are not sufficiently linearly independent (any suggestions for a good metric to check this would also be appreciated).
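The only diagnostic I have thought of so far is the condition number of A via its singular values (a sketch; part of my question is whether this is the right metric):

```python
import numpy as np

def conditioning_report(A):
    """Return (condition number, smallest singular value) of A.
    A large ratio s_max/s_min, or a tiny s_min, suggests the data
    rows are close to linearly dependent and K is poorly determined."""
    s = np.linalg.svd(A, compute_uv=False)
    return s[0] / s[-1], s[-1]
```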