I'm not entirely sure this is the best answer to the question: the $\text{Row}\cdot\text{Column}$ view has many important implications of its own and is unavoidable once you move on to more advanced uses, so those ideas might be a better way of approaching it.
But what would happen if matrix multiplication were defined a different way? That is a good place to start when getting comfortable with any topic.
In 1857 Arthur Cayley wrote “A Memoir on the Theory of Matrices”, which is believed to contain the earliest printed description of matrix multiplication.
In it he uses an unconventional multiplication method that may look a lot more intuitive to a beginner. He starts by stating that a matrix describes a set of equations such as:
$ax+by+cz = X$
$a'x+b'y+c'z = Y$
$a''x+b''y+c''z=Z$
And then describes that matrix as such:
$\begin{pmatrix} a&b&c\\a'&b'&c'\\a''&b''&c'' \end{pmatrix} \begin{pmatrix} x&y&z \end{pmatrix} = \begin{pmatrix} X&Y&Z \end{pmatrix} = \begin{pmatrix} (a,b,c)(x,y,z)&(a',b',c')(x,y,z)&(a'',b'',c'')(x,y,z) \end{pmatrix}$
This seems quite understandable: you take each row, multiply its entries by the corresponding entries of the vector, and add the products up. In fact, that entry-by-entry pairing is similar to how we define matrix addition.
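As a quick sanity check (my sketch, not Cayley's notation), this row-times-vector rule is just a dot product of each matrix row with the vector:

```python
def apply_matrix(M, v):
    # Cayley's rule: the i-th entry of the result is the dot product
    # of the i-th row of M with the vector v, i.e. multiply
    # corresponding entries and add them up.
    return [sum(m * x for m, x in zip(row, v)) for row in M]

M = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]]
v = [1, 0, 2]                    # plays the role of (x, y, z)
print(apply_matrix(M, v))        # → [7, 16, 25]
```

Each output entry is one of the linear equations above evaluated at the given $(x, y, z)$.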
But then Cayley introduces another vector $\begin{pmatrix} \xi & \eta & \zeta \end{pmatrix}$ and states that
$\begin{pmatrix} x&y&z \end{pmatrix} = \begin{pmatrix} \alpha & \beta & \gamma \\ \alpha ' & \beta ' & \gamma ' \\ \alpha '' & \beta '' & \gamma '' \end{pmatrix} \begin{pmatrix} \xi & \eta & \zeta \end{pmatrix}$
So the original $\begin{pmatrix} X&Y&Z \end{pmatrix}$ can also be seen as,
$\begin{pmatrix} X&Y&Z \end{pmatrix} = \begin{pmatrix} A&B&C\\A'&B'&C'\\A''&B''&C'' \end{pmatrix} \begin{pmatrix} \xi & \eta & \zeta \end{pmatrix}$
Which is great, because if we actually expand both the $abc$ and $\alpha \beta \gamma$ matrices into linear equations and compare the result with the $ABC$ matrix, we can work out what the multiplication must be. In other words,
$\begin{pmatrix} a&b&c\\a'&b'&c'\\a''&b''&c'' \end{pmatrix} \begin{pmatrix} \alpha & \beta & \gamma \\ \alpha ' & \beta ' & \gamma ' \\ \alpha '' & \beta '' & \gamma '' \end{pmatrix} = \begin{pmatrix} A&B&C\\A'&B'&C'\\A''&B''&C'' \end{pmatrix}$
I won’t carry that out here, because with a $3 \times 3$ matrix it would get messy. Instead, let’s try it on a $2 \times 2$ case. Taking,
$\begin{pmatrix} X&Y \end{pmatrix} = \begin{pmatrix} a&b\\a'&b' \end{pmatrix} \begin{pmatrix} x&y \end{pmatrix} = \begin{pmatrix} (a,b)(x,y)&(a',b')(x,y) \end{pmatrix}$
And,
$\begin{pmatrix} x&y \end{pmatrix} = \begin{pmatrix} \alpha & \beta \\ \alpha ' & \beta ' \end{pmatrix} \begin{pmatrix} \xi & \eta \end{pmatrix}$
Which gives you,
$\begin{pmatrix} (a,b)(x,y)&(a',b')(x,y) \end{pmatrix} = \begin{pmatrix} (a,b)(\alpha \xi+ \beta \eta, \alpha ' \xi + \beta ' \eta)&(a',b')(\alpha \xi+ \beta \eta, \alpha ' \xi + \beta ' \eta) \end{pmatrix}$
Or:
$\begin{pmatrix} a(\alpha \xi+ \beta \eta) +b (\alpha ' \xi + \beta ' \eta) & a'(\alpha \xi+ \beta \eta) + b' (\alpha ' \xi + \beta ' \eta) \end{pmatrix} $
$= \begin{pmatrix} (a\alpha + b\alpha', a\beta + b\beta')(\xi, \eta) & (a'\alpha +b'\alpha', a'\beta+b'\beta')(\xi, \eta) \end{pmatrix}$
Which implies:
$\begin{pmatrix} a & b \\ a'&b' \end{pmatrix} \begin{pmatrix} \alpha & \beta \\ \alpha' & \beta' \end{pmatrix} = \begin{pmatrix} a\alpha +b\alpha' & a\beta +b\beta' \\ a'\alpha +b'\alpha' & a'\beta +b'\beta' \end{pmatrix}$
And that’s the matrix multiplication we already use!
This works out exactly the same way in Cayley’s own example, too, as you’ll find in his memoir.[1]
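The $2 \times 2$ derivation above can also be checked numerically (a sketch with arbitrary values, not from the original): substituting step by step, as Cayley does, must agree with applying the product matrix we just derived.

```python
def apply_matrix(M, v):
    # Cayley's rule: dot each row of M with the vector v.
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def multiply(M, N):
    # The product derived above: entry (i, j) is row i of M dotted
    # with column j of N -- the conventional row-by-column rule.
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

A = [[2, 3], [5, 7]]      # plays (a, b; a', b')
B = [[1, 4], [6, 8]]      # plays (alpha, beta; alpha', beta')
u = [9, 10]               # plays (xi, eta)

# First compute (x, y) from (xi, eta), then (X, Y) from (x, y) ...
step_by_step = apply_matrix(A, apply_matrix(B, u))
# ... which must agree with applying the product A*B to (xi, eta) directly.
direct = apply_matrix(multiply(A, B), u)
assert step_by_step == direct
```

If the derived formula had the wrong entries anywhere, the two results would disagree for generic values like these.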
A much more interesting question is whether this carries over to compound matrices. What if we took two compound matrices, defined the second one to be factored into two further compound matrices, and then attempted to see how the original multiplication factors through that last compound matrix? It’s a mouthful, but it’s what we did in this answer, just with more terms involved.
Well, I tried it, and one of the methods does work out the same way and leads you back to this convention. (There are two ways of defining a non-conventional "row $\cdot$ row" multiplication; one of them gives the same result as above, while the other is inconsistent with itself!)
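For the curious, here is one natural way to formalize a "row $\cdot$ row" product and watch it misbehave (my own sketch, not necessarily the two definitions alluded to above): taking entry $(i,j)$ to be row $i$ of the first matrix dotted with row $j$ of the second is the same as $A B^{\mathsf{T}}$ in conventional terms, and it fails even associativity.

```python
def row_row(M, N):
    # Entry (i, j) = (row i of M) . (row j of N).
    # In conventional terms this computes M @ N^T.
    return [[sum(m * n for m, n in zip(M[i], N[j]))
             for j in range(len(N))] for i in range(len(M))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]
C = [[2, 0], [1, 3]]

# Associativity fails: (A ∘ B) ∘ C differs from A ∘ (B ∘ C).
left = row_row(row_row(A, B), C)
right = row_row(A, row_row(B, C))
assert left != right
```

Conventional multiplication, by contrast, is associative precisely because composing the underlying substitutions is associative.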
[1]: Arthur Cayley, “A Memoir on the Theory of Matrices”, http://scgroup.hpclab.ceid.upatras.gr/class/LAA/Cayley.pdf (Royal Society Publishing: 2010)