Define $n\times n$ matrices $L, M$ as follows: $L_{ij} = 1$ if $i + j \geq n + 1$ and $L_{ij} = 0$ otherwise; $M_{ij} = \min(i,j)$ for $1 \leq i, j \leq n$. It is straightforward to check that $M = L^2$. For example, when $n = 3$ we have
$$\begin{bmatrix}1 & 1 & 1\\1 & 2 & 2\\1 & 2 & 3\end{bmatrix} = \begin{bmatrix}0 & 0 & 1\\0 & 1 & 1\\1 & 1 & 1\end{bmatrix}^2 .$$
[Remarks: $\det(M) = 1$; $M$ is positive definite but $L$ is not; the eigenvalues of $M$ are distinct (to see this, it is easier to work with $M^{-1}$, which is tridiagonal).]
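For readers who want to verify these claims mechanically, here is a minimal sketch in Python with sympy (the helper names `build_L` and `build_M` are mine, not part of the problem):

```python
# Minimal sketch (sympy assumed): check M = L^2 and the remarks for small n.
from sympy import Matrix, Symbol, gcd

def build_L(n):
    # L_ij = 1 if i + j >= n + 1, else 0  (1-based indices)
    return Matrix(n, n, lambda i, j: 1 if (i + 1) + (j + 1) >= n + 1 else 0)

def build_M(n):
    # M_ij = min(i, j)
    return Matrix(n, n, lambda i, j: min(i + 1, j + 1))

x = Symbol('x')
for n in range(2, 8):
    L, M = build_L(n), build_M(n)
    assert L**2 == M                      # M really is the square of L
    assert M.det() == 1                   # det(M) = 1
    p = M.charpoly(x).as_expr()
    assert gcd(p, p.diff(x)) == 1         # char. poly. is squarefree, so the
                                          # eigenvalues of M are distinct
```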
General theory then predicts that $L$ can be written as a polynomial in $M$: since $M$ has distinct eigenvalues, every matrix that commutes with $M$ is a polynomial in $M$, and $L$ certainly commutes with $M = L^2$. A priori, this polynomial might be expected to have algebraic numbers as coefficients, but the examples below suggest that the situation may be nicer than that. First, when $n = 2$, we find by inspection that $L = M - I$. Next, for $n = 3$, a calculation shows that $L = M^2 - 5M + 2I$. And again for $n = 4$, we find $L = 2M^3 - 19M^2 + 21M - 5I$.
In each case (so far) the polynomial turned out to have integral coefficients.
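For experimentation, the coefficients can be computed exactly by linear algebra: since $M$ has $n$ distinct eigenvalues, its minimal polynomial has degree $n$, so $I, M, \dots, M^{n-1}$ are linearly independent and the coefficient vector is unique. A sketch in Python with sympy (the helper name `poly_coeffs` is mine; `build_L`, `build_M` as in the sketch above):

```python
# Sketch: solve L = c_0 I + c_1 M + ... + c_{n-1} M^{n-1} in exact arithmetic.
from sympy import Matrix

def poly_coeffs(n):
    L, M = build_L(n), build_M(n)         # as defined in the sketch above
    # Columns of A are the vectorized powers I, M, M^2, ..., M^{n-1}.
    A = Matrix.hstack(*[(M**k).reshape(n * n, 1) for k in range(n)])
    b = L.reshape(n * n, 1)
    # A has full column rank and the system is consistent, so an exact
    # normal-equations solve recovers the unique coefficient vector.
    return list((A.T * A).solve(A.T * b))

for n in range(2, 8):
    print(n, poly_coeffs(n))
# e.g. n = 3 prints [2, -5, 1] (i.e. L = M^2 - 5M + 2I), and every
# coefficient in this range comes out an integer
```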
Questions: (1) Is it always true, for arbitrary $n$, that $L$ can be expressed as a polynomial in $M$ with integral coefficients?
(2) Presumably, matrices like $M$ above, which have an integral square root, are the exception rather than the rule. Is it easy to show that almost all (in any suitable sense) matrices in $\mathrm{SL}(n,\mathbb{Z})$ do not possess a square root in $\mathrm{GL}(n,\mathbb{Z})$ whenever $n \geq 2$?
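For what it is worth, in the $n = 2$ case the Cayley–Hamilton theorem already yields a finite test; the following sketch is my own illustration, not something asserted above. Any $2\times 2$ matrix $B$ satisfies $B^2 = tB - dI$ with $t = \operatorname{tr}(B)$, $d = \det(B)$; if $B \in \mathrm{GL}(2,\mathbb{Z})$ then $d = \pm 1$, so $A = B^2$ forces $\operatorname{tr}(A) = t^2 - 2d$, and when $t \neq 0$ the root is pinned down as $B = (A + dI)/t$.

```python
# Sketch (my illustration): decide whether A in SL(2, Z) has a square root
# in GL(2, Z), using the trace constraint tr(A) = t^2 - 2d derived above.
from math import isqrt
from sympy import Matrix, eye

def has_square_root_in_GL2Z(A):
    for d in (1, -1):                     # d = det(B) must be +/- 1
        s = A.trace() + 2 * d             # candidate value of t^2
        if s < 0 or isqrt(s) ** 2 != s:
            continue
        for t in {isqrt(s), -isqrt(s)}:   # candidate traces of B
            if t == 0:                    # then B^2 = -d I, so A = -d I
                if A == -d * eye(2):
                    return True
                continue
            B = (A + d * eye(2)) / t
            # B must be integral; det(B) = +/- 1 then follows from det(A) = 1.
            if all(x.is_integer for x in B) and B**2 == A:
                return True
    return False

print(has_square_root_in_GL2Z(Matrix([[1, 1], [1, 2]])))  # True: this is L^2
print(has_square_root_in_GL2Z(Matrix([[3, 1], [5, 2]])))  # False: neither
# 5 + 2 nor 5 - 2 is a perfect square, so no trace t can work
```

This at least makes the "exception rather than the rule" feeling concrete for $n = 2$: the trace of a square in $\mathrm{GL}(2,\mathbb{Z})$ is confined to the sparse set $\{t^2 \mp 2\}$.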