An integer is either a perfect square or its square root is irrational. Said a different way, when you compute the square root of an integer, either there are no figures to the right of the decimal or there are infinitely many figures to the right of the decimal and they don't repeat. There's no middle ground. You can't hope, for example, that the decimal expansion might stop or repeat after a hundred or so terms.
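The dichotomy above is easy to check numerically for square roots. Here's a small sketch in Python (the function name `sqrt_type` is my own, not from the post) that classifies the square root of a nonnegative integer using `math.isqrt`, which returns the exact integer floor of the square root:

```python
import math

def sqrt_type(n: int) -> str:
    """Classify sqrt(n) for a nonnegative integer n.

    By the theorem, sqrt(n) is either an integer (n is a perfect
    square) or irrational. math.isqrt(n) is the floor of the exact
    square root, so n is a perfect square iff isqrt(n)**2 == n.
    """
    r = math.isqrt(n)
    return "integer" if r * r == n else "irrational"

print(sqrt_type(49))  # 49 = 7^2, so "integer"
print(sqrt_type(50))  # not a perfect square, so "irrational"
```

Note that using `math.isqrt` rather than `math.sqrt` avoids floating-point rounding, so the test is exact for arbitrarily large integers.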
This theorem came up recently when I was talking to one of my kids about her math class, so I decided to look up the proof. It’s easier than I expected, not much harder than the familiar proof that the square root of 2 is irrational. Here goes.
Suppose a/b is a fraction in lowest terms, i.e. a and b are relatively prime integers with b ≠ 0, and a/b is a solution to x^n = c, where n > 0 is an integer and c is an integer. Then
(a/b)^n = a^n / b^n = c

a^n / b = c b^(n−1).
Now the right side of the equation above is an integer, so the left side must be an integer as well. But b is relatively prime to a, and so b is relatively prime to a^n. The only way a^n / b could be an integer is for b to equal 1 or −1. And so a/b must be an integer.
This proof shows that what we said about square roots extends to cube roots and in fact to all integer roots. For example, the fifth root of an integer is either an integer or an irrational number.
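The general statement can be tested the same way as the square-root case. Here's a sketch (the name `nth_root_type` is mine) that finds the integer floor of the nth root by binary search, then checks whether the root is exact:

```python
def nth_root_type(c: int, n: int) -> str:
    """Classify the nth root of a nonnegative integer c, for n >= 1.

    By the theorem, c**(1/n) is either an integer or irrational.
    Binary search finds r = floor(c**(1/n)) exactly, with no
    floating-point error, so the root is an integer iff r**n == c.
    """
    lo, hi = 0, max(1, c)
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if mid ** n <= c:
            lo = mid
        else:
            hi = mid - 1
    return "integer" if lo ** n == c else "irrational"

print(nth_root_type(32, 5))  # 32 = 2^5, so "integer"
print(nth_root_type(33, 5))  # "irrational"
```

Binary search is used here instead of `c ** (1/n)` so the test stays exact for large integers, where floating-point arithmetic would misclassify near-perfect powers.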
Update: Note that there’s another proof in the comments, one that I believe is easier to follow.