Well, contrariwise, is there any reason why $\pi$ should be a rational number? A rational number just means one that is expressible as a ratio of two whole numbers. Is there any reason that $\pi$ (or any other arbitrary value) needs to be expressible as a ratio of two whole numbers?
Well, I can think of two intuitive (but wrong) reasons. But I want to express that, in a way, irrational numbers intuitively make more sense than rational numbers.
Consider distance and space. Surely it is a continuum: every possible distance exists, and there are infinitely many of them. There aren't any holes or jumps between points. So let's say you put up a sign here saying "This is here!" and, one mile away, you put up a sign saying "This is one mile," and you put up signposts splitting the mile exactly into 6 parts, so you have a post every 1/6 of a mile.
Now take a pea-shooter and shoot a pea anywhere at random along the mile line. Where does the pea land? Well, if it hit exactly one of the signposts, that'd be kind of unlikely. "So," says someone (there's always someone), "we can split the distance into smaller pieces: 60ths of a mile, 120ths of a mile, thousandths of a mile, 573rds of a mile." But is there any reason why the pea should match up perfectly with any of those? In fact, doesn't it seem unlikely?
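We can even play the pea-shooter game on a computer (a hypothetical sketch, with the signposts and the count of 10,000 peas being my own choices, not anything from the thought experiment itself). The trick is that a Python float is an exact dyadic rational, so `Fraction` lets us test "landed exactly on a post" with no rounding error:

```python
import random
from fractions import Fraction

random.seed(0)  # deterministic, for reproducibility

# Hypothetical setup: signposts at every j/k of the mile, for every k up
# to 1000.  A pea lands exactly on a post iff its position, in lowest
# terms, has denominator at most 1000.
trials = 10_000
hits = 0
for _ in range(trials):
    pea = Fraction(random.random())  # exact value of a uniform random float
    if pea.denominator <= 1000:      # pea == j/k for some whole j, k <= 1000
        hits += 1

print(f"{hits} of {trials} peas landed exactly on a signpost")
```

Since a random float's denominator in lowest terms is almost always on the order of $2^{53}$, essentially no pea ever lands on a post, even though every float is itself rational: the "perfectly on a mark" positions are just that sparse.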
Okay, here comes Pythagoras, just walking down the street (what's he doing here? Don't ask), and he says, "Everything is whole numbers; there must be some precise number that cuts this mile at precisely that point." And you say to him, "Why? Where did you get that idea? Why should that be?" And he basically says it'd make life perfect and nice, and it'd be religiously beautiful if it were so.
"Well, okay, but what about $\sqrt 2$?" you say. He gives you a dirty look, and suddenly you are very, very glad you are not on a boat.
Okay, that was a fantasy. I think, in a naive and simplistic way, it's intuitive to think that because we can split an apple into a whole number of equal parts, it should work the other way: we should be able to take any value and find the parts that broke it away from the whole. But we shouldn't assume naive ideas, and if we view numbers as a continuum of distances, instead of discrete apples and distinct apple slices, it shouldn't be intuitive anymore.
Consider decimals. To get from 3.7 to 3.8 you have to pass through 3.75 first, and to get to 3.75 you must go through 3.726, and at each level deeper there are infinitely many degrees of precision. As we swim past them, we are going from point to point continually. The Pythagorases of the world would have us jumping from discrete whole-number slice to whole-number slice. There isn't any reason to think that this is the correct way to view things. And, it turns out, it isn't.
What is reality? Is it swimming through a continuum, or is it jumping from precise knife cut to knife cut?
So numbers are a continuum, and there exist infinitely many numbers that simply aren't a discrete amount $j/k$: some whole number $j$ of slices, each exactly $1/k$ of the whole. And why shouldn't there be?
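There's even a way to make "why shouldn't there be?" precise. This is the standard measure-zero argument, not something the fantasy above spells out, but it matches the pea-shooter intuition exactly. The rational marks are countable, so they can all be listed and covered by intervals of arbitrarily small total length:

$$\text{List the rationals in } [0,1] \text{ as } q_1, q_2, q_3, \dots \text{ and cover } q_n \text{ by an interval of length } \frac{\varepsilon}{2^n}; \quad \text{total length} \le \sum_{n=1}^{\infty} \frac{\varepsilon}{2^n} = \varepsilon.$$

Since $\varepsilon$ can be as small as you like, a pea shot uniformly at random along the mile lands on a rational mark with probability $0$. The posts, however finely you place them, take up no room at all.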