I need to implement an algorithm that computes the symmetric matrix $S = A A^t$, where $A^t$ is the transpose of $A$.
I approached the analysis from two perspectives:
The first thing I notice is that no extra memory is needed for $A^t$, because it holds the same entries as $A$; only the orientation changes. So I don't need to store $A^t$ at all if I am "creative" and iterate over the rows/columns of $A$ appropriately.
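In symbols (my own restatement of the observation): every entry of the product is a dot product of two *rows* of $A$, so $A^t$ never has to be materialized:
$$(A A^t)_{ij} \;=\; \sum_{k=1}^{n} a_{ik}\, a_{jk}, \qquad 1 \le i, j \le m.$$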
The second: symmetric matrices have the property that the elements of the upper and lower triangles (excluding the diagonal) are equal. So it suffices to compute the upper triangle and then "copy" each value into the transposed position in the lower triangle, e.g. $S(2,1) = S(1,2)$.
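Putting the two observations together, here is a minimal Python sketch of what I mean (the function name `gram` is mine, and I am assuming $A$ is given as a list of $m$ rows of length $n$):

    def gram(A):
        # A has m rows of length n; returns S = A * A^t, an m x m matrix.
        m, n = len(A), len(A[0])
        S = [[0] * m for _ in range(m)]
        for i in range(m):
            for j in range(i, m):        # upper triangle only, diagonal included
                s = 0
                for k in range(n):       # dot product of rows i and j of A
                    s += A[i][k] * A[j][k]
                S[i][j] = s
                S[j][i] = s              # mirror into the lower triangle
        return S

For instance, `gram([[1, 2, 3], [4, 5, 6]])` gives `[[14, 32], [32, 77]]`, and no transpose is ever built.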
From what little I remember of computational complexity theory, traditional matrix multiplication $AB$, with $A$ of size $m\times n$ and $B$ of size $n\times p$, produces an $m\times p$ matrix and has complexity $O(mnp)$. If $A$ and $B$ are square, everything reduces to $O(m^3)$.
In my scenario $A$ is $m\times n$, and since $S = A A^t$ is square of size $m\times m$, the cost reduces to $O(m^2 n)$. Moreover, by exploiting the symmetry I can save half the calculations: roughly $m^2 n / 2$ multiplications (still $O(m^2 n)$ asymptotically). Up to there I think I'm more or less on track.
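As a sanity check on the "half the work" claim (my own counting, so it may be off): the upper triangle including the diagonal has $m(m+1)/2$ entries, each costing a length-$n$ dot product, so the number of scalar multiplications is
$$T(m,n) \;=\; \sum_{i=1}^{m}\sum_{j=i}^{m} n \;=\; n\,\frac{m(m+1)}{2} \;=\; \frac{m^2 n}{2} + \frac{mn}{2} \;\in\; O(m^2 n).$$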
The point is that I implemented two algorithms and cannot remember how to analyze their cost functions $T()$ in order to see which one would be more efficient. I'm hoping someone can guide me on how to estimate $T()$:
Algorithm 1:
    for i = 1..m do:
        for j = i..m do:
            if i = j
            then for k = 1..n do:
                     S(i,j) = S(i,j) + A(i,k) * A(i,k)
            else begin
                     for k = 1..n do:
                         S(i,j) = S(i,j) + A(i,k) * A(j,k)
                     S(j,i) = S(i,j)
                 end
Algorithm 2:
    for i = 1..m do:
        begin
            for k = 1..n do:
                S(i,i) = S(i,i) + A(i,k) * A(i,k)
            for j = i+1..m do:
                begin
                    for k = 1..n do:
                        S(i,j) = S(i,j) + A(i,k) * A(j,k)
                    S(j,i) = S(i,j)
                end
        end
By the theory, neither algorithm exceeds $O(m^2 n)$, obviously. But when I try to count the iteration cycles contributed by the variable $j$, that is where I get confused.
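To at least check the closed form numerically, I put together this small counting sketch (the counter and the comparison against $n\,m(m+1)/2$ are my own additions, mimicking the loop structure of Algorithm 1):

    def count_inner_iterations(m, n):
        # Counts how many times the innermost statement of Algorithm 1 runs.
        count = 0
        for i in range(m):
            for j in range(i, m):        # the j loop runs m - i times here
                for k in range(n):
                    count += 1           # one multiply-add per innermost pass
        return count

    # The empirical count matches n * m * (m + 1) / 2 exactly:
    for m, n in [(3, 4), (5, 2), (10, 7)]:
        assert count_inner_iterations(m, n) == n * m * (m + 1) // 2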