I think your friend has gotten it into their head that a "real number" is DEFINED to be anything that can be written as a decimal.
This is simply wrong. That is not the definition. It is backwards. Eventually (but not now) it will be proven that all real numbers can be written as decimals if you allow infinite decimals. DON'T worry about teaching your friend that step yet. It will just confuse them.
Instead, decimals were invented (not discovered, invented) as a way to express numbers so that we can compare the sizes of numbers that are not whole numbers.
So what IS a real number? It's any value. Period. That's all.
And $\frac 13$ is a value. You can divide things evenly into three parts. HOW you divide things evenly is another issue, but that isn't relevant here. $\frac 13$ is a very obvious value.
Now it is interesting that we can express $\frac 13$ as a fraction but not as a finite decimal. But that is not relevant. Decimals do not define numbers. Not being able to express $\frac 13$ as a finite decimal didn't make $\frac 13$ disappear from the universe. We can still talk about $\frac 13$. What is $\frac 13$ then, if it isn't a number? A rabbit? A non-number? Ah, I know! A "fraction": a magical beast, like a chimera or a gryphon, that doesn't actually exist.
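(In case you ever want to back that claim up, here is a quick sketch of why no finite decimal equals $\frac 13$; this is my aside, not something your friend needs yet. A finite decimal with $k$ digits after the point is just a fraction with denominator $10^k$:
$$0.d_1 d_2 \ldots d_k \;=\; \frac{n}{10^k} \quad \text{for some integer } n.$$
If $\frac{n}{10^k} = \frac 13$, then $3n = 10^k$. But $10^k = 2^k 5^k$ is not divisible by $3$, so no such integer $n$ exists.)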
Sorry. It is a real value. Hence it is a "number".
Decimals do not define numbers. They are ONE way to (inefficiently and incompletely[*]) describe numbers, but they don't magically turn some values into numbers and other values into chimeras.
All values are real numbers. What else could they be?
=====
[*] Unless you allow infinite decimals. But that's a lesson for later.
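For the record, the computation behind that footnote is just a geometric series (my sketch, for whenever your friend is ready for it):
$$0.333\ldots \;=\; \sum_{k=1}^{\infty} \frac{3}{10^k} \;=\; \frac{3/10}{1 - 1/10} \;=\; \frac 13.$$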