
I have seen proofs showing that the inverse of $AA^\top$ exists, but what are the conditions on $A$ for $(AA^\top)^{-1}$ to exist?

Additionally, how can we show/prove that the columns of $A^\top A$ are linearly independent if and only if the columns of $A$ are linearly independent?

Theo Bendit
  • Hi, welcome to MSE! In order to make your question conform to the standards of the site, I've edited your question to include MathJax formatting. If this is not what you intended, you can roll back the edit. For future questions, please look at the MathJax Tutorial. You can also press "edit" to play around with the changes I've made. – Theo Bendit Dec 12 '18 at 23:52
  • Also, your first question is confusing. You've seen proofs showing why $(AA^\top)^{-1}$ exists, but you want to know what conditions we need on $A$ in order for this inverse to exist? Have you tried referring to the statement of the theorem you've seen proven? – Theo Bendit Dec 12 '18 at 23:54
  • https://math.stackexchange.com/questions/1151491/under-what-conditions-is-aat-invertible – T. Fo Dec 13 '18 at 00:49

1 Answer


There are different ways to prove the statement; here I use the singular value decomposition (SVD).

Note on notation: I take the matrix $A_{n \times m}$ to have $n \geq m$, to be consistent with most of the literature. Note that this makes $A^{\intercal}A$ the invertible product (not $A A^{\intercal}$), which is the opposite of your question; you may take the matrix $A$ in your question to be the $A^{\intercal}$ here.

Any $n \times m$ matrix $A_{n \times m}$ ($n \geq m$) admits a singular value decomposition,

\begin{equation} A_{n \times m} = U_{n \times m} \Sigma_{m \times m} V_{m \times m}^{\intercal} \end{equation} where $\Sigma$ is a diagonal matrix with non-negative diagonal entries (the singular values), $U$ and $V$ have orthonormal columns, that is $U^{\intercal} U = I_{m \times m}$ and $V^{\intercal} V = I_{m \times m}$, and $I$ is the identity matrix.

Note that this is the reduced singular value decomposition, in which $U$ keeps only the first $m$ columns of the full $n \times n$ orthogonal factor. Zero entries appear on the diagonal of $\Sigma$ exactly when $A$ is rank-deficient, that is, when its columns are linearly dependent. If $A$ has full column rank (rank $m$), then all diagonals of $\Sigma$ (the singular values) are non-zero.
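As a quick numerical sanity check (my own illustration, not part of the argument; the sizes $n = 5$, $m = 3$ are arbitrary), the reduced SVD and these properties can be verified with NumPy:

```python
# Minimal sketch: reduced SVD of an n x m matrix (n >= m) with NumPy.
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3
A = rng.standard_normal((n, m))  # a generic A has full column rank

# full_matrices=False returns the reduced factors: U is n x m, Vt is m x m.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

print(np.allclose(A, U @ np.diag(s) @ Vt))  # A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(m)))      # U^T U = I_{m x m}
print(np.allclose(Vt @ Vt.T, np.eye(m)))    # V^T V = I_{m x m}
print(s)                                    # singular values, all positive here
```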

Construct

\begin{equation} G_{m \times m} = A^{\intercal} A = V_{m \times m} \Sigma_{m \times m}^2 V_{m \times m}^{\intercal} \end{equation} The above is precisely the eigenvalue decomposition of the symmetric positive semi-definite matrix $G$, with eigenvalue matrix \begin{equation} \Lambda_{m \times m} = \Sigma_{m \times m}^2. \end{equation} If the columns of $A$ are linearly independent, the eigenvalues of $G$ (the diagonals of $\Lambda$) are all positive, so $G$ is positive definite, full rank, and hence invertible. Conversely, if the columns of $A$ are linearly dependent, some singular value is zero, so $\Lambda$ has a zero diagonal entry and $G$ is singular. This gives the equivalence you asked about: $A^{\intercal}A$ is invertible if and only if the columns of $A$ are linearly independent.
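Again as a hedged illustration (my own addition, with the same arbitrary sizes): the eigenvalues of $G = A^{\intercal}A$ numerically match the squared singular values of $A$, and forcing a linear dependence among the columns drives an eigenvalue to zero:

```python
# Sketch: eigenvalues of G = A^T A are the squared singular values of A.
import numpy as np

rng = np.random.default_rng(1)
n, m = 5, 3
A = rng.standard_normal((n, m))

G = A.T @ A
eig = np.linalg.eigvalsh(G)                # eigenvalues of G, ascending
s = np.linalg.svd(A, compute_uv=False)     # singular values, descending

print(np.allclose(eig, np.sort(s) ** 2))   # Lambda = Sigma^2

# Make the columns linearly dependent: an eigenvalue of G drops to zero.
A[:, 2] = A[:, 0] + A[:, 1]
print(np.linalg.eigvalsh(A.T @ A)[0])      # ~0, so G is now singular
```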

Now, construct \begin{equation} H_{n \times n} = A A^{\intercal} = U_{n \times n} \begin{bmatrix} \Sigma_{m \times m}^2 & O_{m \times (n-m)} \\ O_{(n-m) \times m} & O_{(n-m) \times (n-m)} \end{bmatrix} U_{n \times n}^{\intercal} \end{equation} where $U_{n \times n}$ is the orthogonal factor of the full SVD, obtained by extending the columns of $U_{n \times m}$ to an orthonormal basis of $\mathbb{R}^n$. This is the eigenvalue decomposition of the symmetric positive semi-definite matrix $H$, with eigenvalues

\begin{equation} \Lambda_{n \times n} = \begin{bmatrix} \Sigma_{m \times m}^2 & O_{m \times (n-m)} \\ O_{(n-m) \times m} & O_{(n-m) \times (n-m)} \end{bmatrix}. \end{equation}

If $n > m$, then at least $n - m$ of the eigenvalues of $H$ (diagonals of $\Lambda_{n \times n}$) are zero, so $H$ is singular and not invertible. Conversely, if $H$ has linearly independent columns, it is full rank, so $\Lambda_{n \times n}$ can have no zero eigenvalues. This requires $n = m$ and all diagonals of $\Sigma$ to be non-zero; therefore $A$ has no zero singular values and is full rank.
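One last numerical sketch (again my own, with the same arbitrary sizes) showing that for $n > m$ the matrix $H = AA^{\intercal}$ has $n - m$ zero eigenvalues (for a generic full-column-rank $A$) and is therefore not invertible:

```python
# Sketch: for n > m, H = A A^T has rank at most m, hence n - m zero eigenvalues.
import numpy as np

rng = np.random.default_rng(2)
n, m = 5, 3
A = rng.standard_normal((n, m))

H = A @ A.T
eig = np.linalg.eigvalsh(H)

print(np.linalg.matrix_rank(H))                  # m = 3, not n = 5
print(np.sum(np.isclose(eig, 0.0, atol=1e-10)))  # n - m = 2 zero eigenvalues
```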

Sia
    It seems a bit silly to say that it's "inconsistent with the literature" to ever have a matrix with more columns than rows. – Misha Lavrov Dec 13 '18 at 01:27