I heard a great quote by John von Neumann recently: "In mathematics, you don't understand things. You just get used to them."

This was right after trying to explain to someone, for the third or fourth time, in what sense there are more decimal numbers than fractions. And that there are the same number of fractions as counting numbers. I'm completely at a loss, having no way to visualize the issue but simply being used to it by now.

One type of infinity isn't bigger than the other type because one actually gets higher up on the number line. And it isn't bigger because it's "higher resolution" on the number line either; no matter how far you zoom in on the number line, you'll find more rational numbers and more irrational numbers. In fact this number line thing isn't the right way to think about sizes of infinity anyway: in any little segment of the number line, you can fit both kinds of infinity.

So my friend says to me, well I understand what you're saying, I just think about it as higher resolution. But, I insist, that's wrong! There's lots of mathematical intuition that people use to understand more complicated concepts that is wrong, and you might think it's useful for a few minutes, but it'll invariably lead you astray. Logic is a very precise thing. If you think you understand something, make sure your explanation doesn't also prove something false.

Anyway, back to different sizes of infinity. The correct sense with which to compare sizes of infinities is a one-to-one mapping. If you have two sets of numbers and you can match them up in pairs, so every number in each set is associated with exactly one number in the other set, those sets are the same size.
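To see the pairing definition in action, here's a classic warm-up (not part of the fractions argument, just an illustration of the definition): the counting numbers and the even numbers match up in pairs, even though one set looks like half of the other. A minimal sketch, with names of my own choosing:

```python
def partner(n):
    """Match counting number n with the even number 2n."""
    return 2 * n

def partner_back(m):
    """Match even number m back with its counting number, m // 2."""
    return m // 2

# Every counting number gets exactly one even partner, every even number
# gets exactly one counting-number partner, and the two matchings undo
# each other -- so by the pairing definition, the two sets are the same size.
```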

So there are the same number of fractions as counting numbers, because you can list the fractions in an order that eventually covers all of them. Then "1" matches with the first fraction in the list, "2" with the second, and so on. One way to build the list is to give each fraction a "weight": for a whole number, its absolute value; for any other fraction, in reduced form, the absolute value of the numerator plus the denominator. First list everything with weight 0, then weight 1, then 2, and so on. Each weight covers only finitely many fractions, and every fraction has some finite weight, so every fraction eventually shows up at a definite position. Like this:

[Fraction #1] 0

[Fractions #2 and #3] 1, -1

[Fractions #4 and #5] 2, -2

[Fractions #6-#9] 3, -3, 1/2, -1/2

[Fractions #10-#13] 4, -4, 1/3, -1/3

[Fractions #14-#21] 5, -5, 1/4, -1/4, 2/3, -2/3, 3/2, -3/2

...and so on.
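The listing above can be sketched in code. This is my own sketch, not anything from the original: it groups fractions the same way (a whole number counts its absolute value; any other reduced fraction p/q counts |p| + q), and reproduces the groups listed above.

```python
from fractions import Fraction
from itertools import islice
from math import gcd

def fractions_of_weight(h):
    """All rationals of 'weight' h: the whole numbers h and -h, plus
    reduced fractions p/q (q >= 2) with |p| + q == h."""
    if h == 0:
        return [Fraction(0)]
    out = [Fraction(h), Fraction(-h)]
    for q in range(h - 1, 1, -1):  # denominators, largest first, to match the list above
        p = h - q
        if gcd(p, q) == 1:  # keep only fractions already in reduced form
            out += [Fraction(p, q), Fraction(-p, q)]
    return out

def enumerate_rationals():
    """Yield every rational exactly once: weight 0, then 1, then 2, ..."""
    h = 0
    while True:
        yield from fractions_of_weight(h)
        h += 1

# First 13 entries -- these line up with fractions #1 through #13 above
first_13 = list(islice(enumerate_rationals(), 13))
```

Since each weight group is finite and every rational lands in exactly one group, position-in-the-list is the one-to-one matching with the counting numbers.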

But there are other numbers that can't be represented with fractions. For those you have to use decimal representation, padded with zeros on both sides as needed (so 1 becomes ...000001.00000...). And you can't list these in an order that pairs them with the counting numbers: any such list will be missing some decimal, so no such list exists. (You can prove that by supposing you have such a list, then finding a decimal that's not on it. Build the new number by letting all the digits to the left of the decimal point be 0, the first digit to the right of the point be the first digit of the first number in the list plus 1 (wrapping 9 around to 0), the second digit be the second digit of the second number plus 1, and so on. Then your new number differs from every number on the list in at least one decimal place, so it can't appear anywhere on the list.)
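Here's a toy version of that diagonal construction, a sketch of my own: each entry of the purported list is represented as just a string of digits after the decimal point.

```python
def diagonal_missing(decimals):
    """Given the first n entries of a purported complete list of decimals
    (each entry a string of digits after the decimal point), build a
    number that differs from the k-th entry in its k-th decimal place."""
    new_digits = []
    for k, entry in enumerate(decimals):
        bumped = (int(entry[k]) + 1) % 10  # add 1, wrapping 9 around to 0
        new_digits.append(str(bumped))
    return "0." + "".join(new_digits)

# No matter what list you hand it, the result disagrees with entry k
# in decimal place k, so it can't appear anywhere on the list.
print(diagonal_missing(["1415", "7182", "0000", "9999"]))  # 0.2210
```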

So, there are infinitely many fractions and infinitely many decimals, but the latter infinity is much, much bigger than the former.

How do you explain that in visual terms??

(More mind-boggling is that there are actually many more sizes of infinity. Infinitely many more. But I forget how to construct those. But they're *not* just new dimensions tacked onto your number line. Yes that's right, you can draw a squiggle on an infinitely big piece of paper that will cover the entire paper, no matter how thin the tip of your pencil is. Math is cool.)