To my mind, multiplication is a binary operation, so it can only be applied to a finite sequence of numbers. But $1^{\infty}$ would require applying multiplication infinitely many times, which is not defined, precisely because multiplication is a binary operation.
Is that a good reason? If not, what is the reason?
If my reason is correct, then by the same argument, is $5^{\infty}$ also indeterminate?
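To make the finite case explicit (this is just my own way of formalizing it, not taken from any particular source): repeated multiplication is defined by recursion on the binary product,

$$a^{1} = a, \qquad a^{n+1} = a^{n} \cdot a \quad (n \in \mathbb{N}),$$

so $a^{n}$ makes sense for every finite exponent $n$, but this recursion never assigns a meaning to the symbol $a^{\infty}$.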
Added:
I noticed that all the answers are in the context of "limits". Algebraically, multiplication is a binary operation, so it can only be used to define the product of a finite sequence of numbers, not an infinite one. So, algebraically, what does $1^{\infty}$ even mean?
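For concreteness, here is the limit reading I am referring to (my own illustration, not quoted from any particular answer): in that context $1^{\infty}$ is shorthand for limits $\lim_{n\to\infty} a_n^{\,b_n}$ with $a_n \to 1$ and $b_n \to \infty$, and such limits can take different values, e.g.

$$\lim_{n\to\infty}\left(1+\frac{1}{n}\right)^{n} = e \qquad \text{while} \qquad \lim_{n\to\infty} 1^{n} = 1,$$

whereas any $a_n^{\,b_n}$ with $a_n \to 5$ and $b_n \to \infty$ diverges to $+\infty$, so $5^{\infty}$ is not called indeterminate there. But that is a statement about limits, not about an algebraic object $1^{\infty}$.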
Similarly, $B = 5^{\infty},\ \ln B = \infty \cdot \ln 5 = \infty$
– lab bhattacharjee Oct 05 '14 at 09:16