
I'm having a problem multiplying 3 matrices in index notation. I know this should be trivial, but I just can't figure it out. Is there any formula like $$\ A'_{\mu\nu} = M_{\mu}^{\ \rho}(M^{-1})_{\nu}^{\ \theta}A_{\rho\theta }$$ $\rightarrow$ $A'=MAM $ for converting between matrix and index notation? ($M$ is diagonal, if that changes anything, and summation over repeated indices is assumed, i.e., the Einstein summation convention.)

As far as I know, the following relation holds: $\ A'_{\mu \nu} = M_{\mu}^{\ \rho}M_{\nu}^{\ \theta}A_{\rho\theta }$ $\rightarrow$ $A'=MAM^{-1} $, as physicists use it in special relativity. But I can't derive this formula either, because I just can't figure out where the inverse matrix comes into play. I have of course already written down specific examples of the sum on the left-hand side and the matrix multiplication on the right-hand side, but it always seems to work without the inverse.

Any help, tip, or link where the equivalence is shown explicitly would be much appreciated, because I've been stuck on this problem for quite a while.
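For concreteness, here is the kind of numerical check I mean, written as a short NumPy sketch (the matrices are random examples of my own choosing, not anything special):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))   # arbitrary, not orthogonal
A = rng.standard_normal((3, 3))

# Index formula: A'_{mu nu} = M_mu^rho M_nu^theta A_{rho theta},
# summing over the repeated indices rho and theta
A_prime = np.einsum('mr,nt,rt->mn', M, M, A)

# The sum matches M A M^T ...
assert np.allclose(A_prime, M @ A @ M.T)

# ... but not M A M^{-1}, which is where my confusion lies
assert not np.allclose(A_prime, M @ A @ np.linalg.inv(M))
```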

jak
  • It is not clear what you are trying to describe, nor what your notation means (I can figure there is an Einstein summation convention, but $(M_{\nu}^{\ \theta})^{-1}$ is the inverse of a matrix coefficient, not a coefficient of an inverse matrix, and this is positively weird; also your RHS seems to involve an inverse iff your LHS does not). Maybe a concrete example would help clarify. – Marc van Leeuwen Jan 13 '14 at 07:40
  • Hi, thanks for your remarks. I do not mean the inverse of the coefficient but the coefficients of the inverse matrix, so I changed the notation to make this clearer. The thing with the inverse on the RHS when the LHS involves no inverse is exactly what I'm trying to figure out. The second formula cited above is widespread in relativity/tensor calculus as the transformation formula of a second-rank tensor. – jak Jan 13 '14 at 08:03
  • Judging from the placement of the indices, you must be confusing inverse and transpose. Are you sure the second RHS is not $MAM^\top$? – Marc van Leeuwen Jan 13 '14 at 08:22
  • You're correct; in special relativity/tensor calculus we deal mostly with orthogonal transformations, for which $M^T=M^{-1}$ holds, but I was hoping for a similar formula in a broader context. – jak Jan 13 '14 at 08:29
  • Do you have any source or explanation why $MAM^T$ holds? I think this would help me. – jak Jan 13 '14 at 08:35
  • See my answer here: https://math.stackexchange.com/questions/198257/intuition-for-the-product-of-vector-and-matrices-xtax/198280#198280 – kjetil b halvorsen Feb 22 '18 at 11:33

2 Answers


Matrix multiplication with non-raised (i.e., not written as upper or lower) indices, the first index being the row index and the second the column index, is given by the rule $$ (AB)_{i,k}=\sum_jA_{i,j}B_{j,k}\tag1 $$ Now your second rule for transforming $A$ to $A'$ can be written (if you'll forgive me for using non-Greek letters as indices) $$ A'_{i,l}= \sum_{j,k} M_i^jA_{j,k}M_l^k,\tag2 $$ with the correspondence $\mu:=i$, $~\rho:=j$, $~\nu:=l$, $~\theta:=k$. Now if we agree to call the lower index of $M$ the first one and the upper index the second one, then in the right hand side of $(2)$, the second copy of $M$ has its indices switched with respect to what one would get by expanding out $MAM$ using $(1)$. So to get the indices in the right place one must transpose the second copy of $M$ before entering it into the matrix product: the RHS of $(2)$ describes the computation of $MAM^\top$.
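Formula $(2)$ is easy to verify numerically. The following NumPy sketch (variable names are mine) spells out the double sum as explicit loops and compares the result with the matrix product $MAM^\top$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
M = rng.standard_normal((n, n))
A = rng.standard_normal((n, n))

# Formula (2): A'_{i,l} = sum_{j,k} M_i^j A_{j,k} M_l^k,
# written as explicit sums over j and k
A_prime = np.zeros((n, n))
for i in range(n):
    for l in range(n):
        A_prime[i, l] = sum(M[i, j] * A[j, k] * M[l, k]
                            for j in range(n) for k in range(n))

# The second copy of M enters with its indices switched,
# i.e. transposed, so the sum computes M A M^T:
assert np.allclose(A_prime, M @ A @ M.T)
```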


A full answer cannot be given until you clarify your notation, but some tricks for translating between index and matrix notation are still useful.

For instance, it is often helpful to note that premultiplying by a diagonal matrix corresponds to row multiplications: row $i$ is multiplied by diagonal element $i$. Postmultiplying by a diagonal matrix likewise corresponds to column multiplications. I have given more details here: Intuition for the Product of Vector and Matrices: $x^TAx $
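The row/column rule can be illustrated with a small NumPy sketch (the example values here are mine):

```python
import numpy as np

A = np.arange(9.0).reshape(3, 3)
d = np.array([2.0, 3.0, 5.0])
D = np.diag(d)

# Premultiplying by D scales the rows: row i is multiplied by d[i]
assert np.allclose(D @ A, d[:, None] * A)

# Postmultiplying by D scales the columns: column j is multiplied by d[j]
assert np.allclose(A @ D, A * d[None, :])
```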