First of all, let's get the definitions straight:
A basis $b_1, \dots, b_n$ of an inner product space is called orthogonal if each pair $b_i, b_j$ with $i \neq j$ is orthogonal, i.e., if $\langle b_i, b_j \rangle = 0$ for $i \neq j$.
A basis $b_1, \dots, b_n$ of an inner product space is called orthonormal if it is orthogonal and every $b_i$ is a unit vector, i.e., if $\langle b_i, b_j \rangle = \delta_{ij}$.
A real square matrix $(A_{ij})_{1 \leq i,j \leq n}$ is called orthogonal if its columns and its rows both form orthonormal bases of $\mathbb{R}^n$ (with the standard inner product).
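For a concrete example in $\mathbb{R}^2$ with the standard inner product, the vectors
$$b_1 = \tfrac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 1 \end{pmatrix}, \qquad b_2 = \tfrac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ -1 \end{pmatrix}$$
form an orthonormal basis, and the matrix having them as columns,
$$A = \tfrac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix},$$
is orthogonal: here the rows happen to be the same two vectors, so they are orthonormal as well.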
Now, if you work out what it means for the columns of $A$ to be orthonormal, that comes out as $A^T A = I_n$. Likewise, the rows of $A$ are orthonormal if and only if $A A^T = I_n$. Each condition is equivalent to $A$ being invertible with $A^{-1} = A^T$.
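Indeed, if $a_1, \dots, a_n$ denote the columns of $A$, then the $(i,j)$ entry of $A^T A$ is exactly the inner product of the $i$-th and $j$-th columns:
$$(A^T A)_{ij} = \sum_{k=1}^n (A^T)_{ik} A_{kj} = \sum_{k=1}^n A_{ki} A_{kj} = \langle a_i, a_j \rangle,$$
so $A^T A = I_n$ says precisely that $\langle a_i, a_j \rangle = \delta_{ij}$, i.e., that the columns are orthonormal. The computation for the rows is the same with the roles of $A$ and $A^T$ swapped.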
So, if a square matrix satisfies $A A^T = I_n$ (i.e., its rows form an orthonormal basis), then $A^T$ is a right inverse of $A$, hence $A$ is invertible with $A^{-1} = A^T$. Therefore also $A^T A = I_n$ (i.e., its columns form an orthonormal basis), and $A$ is an orthogonal matrix.
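Spelled out, once $A$ is known to be invertible, the other product is forced:
$$A^T A = (A^{-1} A)(A^T A) = A^{-1}\,(A A^T)\,A = A^{-1} I_n A = I_n.$$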
(Of course, the argument is now hidden in the fact that a left inverse of a square matrix is automatically a right inverse and vice versa. For that, see for instance If $AB = I$ then $BA = I$.)
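As a quick numerical sanity check, here is a minimal sketch in NumPy (the QR decomposition is just a convenient way to produce a square matrix with orthonormal columns):

```python
import numpy as np

# Produce a square matrix Q with orthonormal columns by taking the
# Q-factor of a QR decomposition of a random 4x4 matrix.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

I = np.eye(4)
# Columns orthonormal: Q^T Q = I ...
assert np.allclose(Q.T @ Q, I)
# ... and, as argued above, the rows are then automatically orthonormal too.
assert np.allclose(Q @ Q.T, I)
print("Q^T Q = I and Q Q^T = I, as expected.")
```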