Despite what some teachers like to pretend, there is no universally observed "order of operations" convention for sequences of multiplication and division operators.
This is particularly true when the $\div$ symbol is used. Since this symbol doesn't generally appear at all in "serious" mathematical writing, there's little chance for people to converge on a definite convention for how to use it in complex situations.
Professional mathematicians generally prefer fraction notation for division; in that case the grouping of the operands is always unambiguous. For space efficiency, mathematicians also use a forward slash, as in $x/y$, when the formula would otherwise be undesirably tall. With a forward slash there's general agreement (or so I thought; see comments for a dissent) that multiplication notated as juxtaposition of factors binds tighter than division by slash, so $x/2y$ is treated as just a typographical variant of $\frac{x}{2y}$. But you definitely cannot rely on such a convention if the division is written with $\div$ instead of $/$. Then all bets on how a random reader will interpret it are off again.
This partial agreement disappears completely if the multiplication is written with an explicit sign, either $\cdot$ or $\times$. You will find reasonable people disagreeing about how they'd interpret $x/2{\cdot}y$, for example.
Programming languages, as well as computer systems designed by programmers, generally tend to use * and / as multiplication and division signs. They will usually interpret a/b*c as $\frac ab\cdot c$ -- not because that is really an established convention in mathematics, but merely because it has to mean something (it is seen as uncharitable just to reject the input a/b*c as nonsense, even though a case could be made that it should have been). At least a/b*c = (a/b)*c is not clearly worse than the opposite interpretation, and it also matches how the similar ambiguity in $a-b+c$ is (by firm mathematical convention) resolved.
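As a concrete illustration (a minimal sketch in Python, though most mainstream languages behave the same way): * and / share one precedence level and associate to the left, so a/b*c is parsed as (a/b)*c, exactly parallel to how a-b+c is parsed as (a-b)+c.

```python
# Left-to-right evaluation of same-precedence operators.
# Example values are arbitrary, chosen to make the results distinct.
a, b, c = 12, 4, 3

# '/' and '*' have equal precedence and associate left,
# so a / b * c means (a / b) * c, not a / (b * c).
assert a / b * c == (a / b) * c   # (12 / 4) * 3 == 9.0
assert a / b * c != a / (b * c)   # 12 / (4 * 3) == 1.0

# The same firm convention applies to '-' and '+':
assert a - b + c == (a - b) + c   # (12 - 4) + 3 == 11

print(a / b * c)   # 9.0
```

The left-associative reading is a language-design choice, not a mathematical necessity; it simply mirrors the convention already firmly established for subtraction and addition.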
Note well that this is not really a dispute about the underlying mathematics -- it is purely about how we write down our mental idea of a particular calculation with ink on paper, such that a reader can reconstruct the same idea in their head, if they know the notations and conventions being used.
When you write formulas yourself, do your reader a favor and put in explicit parentheses to make clear which calculation you mean, whenever you're using a notation where the grouping is not backed by a strong convention.