
For example, $$\frac{1}{1}=1\quad \frac{1}{2}=0.5\quad \frac{1}{3}=0.\overline3\quad \frac{1}{10}=0.1$$ so the larger the denominator is, the smaller the number is.

Would this mean that $\frac{1}{\infty}=0$, or what else could it be?
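If I try to phrase this with limits, I believe the precise version of my observation is $$\lim_{n\to\infty}\frac{1}{n}=0,$$ i.e. $\frac{1}{n}$ gets arbitrarily close to $0$ as $n$ grows, even though $\frac{1}{\infty}$ is not an ordinary division.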

Also, $$\frac{1}{0.5}=2\quad \frac{1}{0.\overline3}=3\quad \frac{1}{0.1}=10\quad \frac{1}{0.001}=1000$$ and so on. As the denominator gets smaller, the value of the fraction gets larger.

This leads me to the conclusion that $\frac{1}{0}=\infty$. Would this be correct?
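Phrasing this one with limits as well, I get $$\lim_{x\to 0^+}\frac{1}{x}=+\infty,$$ though I notice that approaching $0$ from the negative side gives $\lim_{x\to 0^-}\frac{1}{x}=-\infty$, so the two sides do not agree on a single value.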

Dr. Mathva
  • Short answer: no. Any value you try will destroy some important algebraic identity, usually the distributive law. There are many questions on this site asking the same thing. https://math.stackexchange.com/questions/260876/what-exactly-is-infinity – Ethan Bolker Feb 12 '19 at 16:24
  • Your examples sort of answer the question: infinity itself is a concept of limiting behavior. There is no value to give to infinity, because by definition it is the characterization of "what happens when you get closer and closer to something". – NazimJ Feb 12 '19 at 16:29
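A quick sketch of the algebraic obstruction mentioned in the first comment (my own paraphrase of the standard argument): if $\frac{1}{0}$ were some value $y$ with $0\cdot y=1$, then the distributive law would give $$1=0\cdot y=(0+0)\cdot y=0\cdot y+0\cdot y=1+1=2,$$ a contradiction, so no choice of $y$ is consistent with the usual rules of arithmetic.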