Usually definability refers to a fixed language (often first-order, but not necessarily) together with a fixed interpretation of that language (also called a structure). The meaning is that there is a formula (which is by definition a finite string of symbols) that is satisfied by exactly one element of the structure.
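For a concrete illustration (this particular structure and formula are just one standard example, not the only choice), consider the ordered field of real numbers $(\mathbb{R},+,\cdot,0,1,<)$ and the formula $$\varphi(x)\;:=\;x\cdot x=1+1\;\land\;0<x.$$ Exactly one element of the structure satisfies $\varphi$, namely $\sqrt{2}$, so $\sqrt{2}$ is definable there, even though its decimal expansion is infinite.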
Just writing digits, multiplication, "etc." doesn't quite tell us what the language is. Moreover, if you allow infinite definitions, you can easily write each real number as an integer part followed by an infinite decimal expansion; you don't even have to appeal to $\sqrt{}$ or $\pi$ or the "etc." part. Just note that a decimal expansion fully determines the real number's value. And although some real numbers have several decimal expansions (e.g. $1=1.0=1.000=0.\overline{999}$), remember that $2=1+1=3-1=4-2=5-3$ and so on, so having several definitions is not a big deal.
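To see why such duplicate expansions arise (a standard geometric-series computation, sketched here for completeness): $$0.\overline{999}=\sum_{n=1}^{\infty}\frac{9}{10^{n}}=9\cdot\frac{1/10}{1-1/10}=1.$$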
Finally, an algorithm is a finite sequence of instructions drawn from a finite list of possible operations. So there are only countably many algorithms to begin with, and therefore the majority of real numbers are such that no algorithm computes them (or their decimal expansions, if you prefer to think about it that way). Of course, you can talk about infinitely many possible operations, or infinitely long computations; the answer may then change, depending on the exact definitions and set-theoretic intricacies.
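To make the counting explicit (a standard sketch, assuming algorithms are coded as finite strings over some fixed finite alphabet $\Sigma$): $$|\{\text{algorithms}\}|\;\le\;|\Sigma^{*}|\;=\;\Bigl|\,\bigcup_{n\in\mathbb{N}}\Sigma^{n}\Bigr|\;=\;\aleph_{0}\;<\;2^{\aleph_{0}}\;=\;|\mathbb{R}|,$$ since a countable union of finite sets is countable. So the computable reals form a countable set, and its complement in $\mathbb{R}$ still has cardinality $2^{\aleph_{0}}$.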