Denote by $\sigma$ the spectral radius.
Is it true that $\sigma(AB) =\sigma(BA) $?
Edit: I am interested in the general case, i.e. $A$ is $n \times k$ and $B$ is $k \times n$.
If $A$ and $B$ are square matrices of the same size, then the products $AB$ and $BA$ have the same eigenvalues.
As @Algebraic Pavel noted in the comments, the result still holds for rectangular matrices $A$ and $B$ (whenever both products $AB$ and $BA$ are defined): the non-zero eigenvalues of the two products coincide.
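Here is a quick numerical sanity check of the rectangular case using NumPy (the random matrices, sizes, and seed are illustrative choices, not from the original). It uses the determinant identity $\det(tI_n - AB) = t^{n-k}\det(tI_k - BA)$: the characteristic polynomial of $AB$ is that of $BA$ padded with $n-k$ extra zero roots.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2))      # n x k with n = 4, k = 2
B = rng.standard_normal((2, 4))      # k x n

# Characteristic polynomial of AB equals that of BA times t^(n-k),
# i.e. the same coefficients followed by n-k zeros.
p_AB = np.poly(A @ B)                # degree 4
p_BA = np.poly(B @ A)                # degree 2
assert np.allclose(p_AB, np.concatenate([p_BA, np.zeros(2)]))

# In particular the spectral radii agree.
r_AB = max(abs(np.linalg.eigvals(A @ B)))
r_BA = max(abs(np.linalg.eigvals(B @ A)))
assert np.isclose(r_AB, r_BA)
```

Comparing characteristic polynomials rather than sorted eigenvalue arrays avoids ordering ambiguities for complex-conjugate pairs.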
Here $\sigma_p$ denotes the set of eigenvalues (the point spectrum); for matrices it coincides with $\sigma$ (the spectrum). A matrix in $R^{m\times n}$ is a linear operator $R^n\to R^m$, and similarly for $C^{m\times n}$, so the following theorem answers your question.
Theorem. Let $X,Y$ be vector spaces (with the same scalar field $R$ or $C$). Let $A:X\to Y$ and $B:Y\to X$ be linear.
(a) Then $\sigma_p(AB)\setminus\{0\} = \sigma_p(BA)\setminus\{0\}$.
(b) If both $X$ and $Y$ are $n$-dimensional, $n<\infty$, then $\sigma_p(AB) = \sigma_p(BA)$.
Proof: (a) Let $0\ne t\in \sigma_p(AB)$. Then $ABy=ty$ for some $y\in Y\setminus\{0\}$. Apply $B$ to both sides to get $BA(By)=t(By)$. Since $By\ne0$ (otherwise $0=ABy=ty$, contradicting $t\ne0$ and $y\ne0$), it follows that $t\in\sigma_p(BA)$. Therefore $\sigma_p(AB)\setminus\{0\} \subset \sigma_p(BA)\setminus\{0\}$; exchanging $A$ and $B$ gives the reverse inclusion.
(b) Assume $\dim X=n=\dim Y$. If $ABy=0$ for some $y\ne 0$, then $A$ or $B$ is singular (since $\det(AB)=\det(A)\det(B)$); hence so is $BA$. Thus $0\in\sigma_p(AB) \Rightarrow 0\in\sigma_p(BA)$; exchange $A$ and $B$ for the converse. QED.
Remarks. (a) By the above proof, (a) holds for bounded and even for unbounded operators (as long as they are defined on the whole space, as assumed in the theorem) over infinite-dimensional vector spaces.
(b1) Claim (b) is not true for non-square matrices. For example, if $A^T=(1,0)=B$, then $BA=(1)$ and $AB=\begin{pmatrix}1&0\\0&0\end{pmatrix}$, so $0\in\sigma_p(AB)\setminus\sigma_p(BA)$.
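The counterexample in (b1) is small enough to verify directly (a minimal check; tolerances are illustrative):

```python
import numpy as np

A = np.array([[1.0], [0.0]])   # 2x1 column vector, A^T = (1, 0)
B = np.array([[1.0, 0.0]])     # 1x2 row vector

BA = B @ A                     # 1x1 matrix (1)
AB = A @ B                     # 2x2 matrix [[1, 0], [0, 0]]

ev_AB = np.linalg.eigvals(AB)
# 0 is an eigenvalue of AB but not of BA = (1).
assert np.isclose(np.min(np.abs(ev_AB)), 0.0)
assert not np.isclose(BA.item(), 0.0)
```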
(b2) Claim (b) is not true for $n=\infty$. For example, let $L,R$ be the left and right shifts on $R^N$, the space of sequences $(x_1,x_2,\ldots)$: $L(x_1,x_2,\ldots)=(x_2,x_3,\ldots)$ and $R(x_1,x_2,\ldots)=(0,x_1,x_2,\ldots)$. Then $LR=I$, but $RL(1,0,0,\ldots)=R\,0=0$, so $0\in\sigma_p(RL)\setminus\sigma_p(LR)$.
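The shift-operator example in (b2) can be modeled in code by representing a sequence as a function from a (0-based) index to its value; this avoids truncating to finite matrices, where $LR=I$ fails. The representation is an illustrative sketch, not from the original:

```python
# A sequence (x_1, x_2, ...) is a function from 0-based index to value.

def L(x):               # left shift: (x_1, x_2, ...) -> (x_2, x_3, ...)
    return lambda i: x(i + 1)

def R(x):               # right shift: (x_1, x_2, ...) -> (0, x_1, x_2, ...)
    return lambda i: 0 if i == 0 else x(i - 1)

x = lambda i: i + 1                  # the sequence (1, 2, 3, ...)
e1 = lambda i: 1 if i == 0 else 0    # the sequence (1, 0, 0, ...)

# LR = I: shifting right then left returns the original sequence.
assert all(L(R(x))(i) == x(i) for i in range(10))

# But RL(e1) = 0, so 0 is an eigenvalue of RL and not of LR.
assert all(R(L(e1))(i) == 0 for i in range(10))
```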
(c) In the infinite-dimensional case, usually $\sigma_p$ denotes the eigenvalues and $\sigma$ is typically a bigger set (in many books all $t$ for which $t-A$ is not a boundedly invertible bijection), but for (square) matrices $\sigma=\sigma_p$.
Better answers and more information can be found here (e.g., for matrices the eigenvalues are the same even counting multiplicities, which would not require much extra work in the above proof, either): https://math.stackexchange.com/a/124903/426834