Monday, September 14, 2020

How real are real numbers?

There is always one more counting number.

That is, no matter how high you count, you can always count one higher.  Or at least in principle.  In practice you'll eventually get tired and give up.  If you build a machine to do the counting for you, eventually the machine will break down or it will run out of capacity to say what number it's currently on.  And so forth.  Nevertheless, there is nothing inherent in the idea of "counting number" to stop you from counting higher.

In a brief sentence (one that, after untold work by mathematicians over the centuries, we can now state completely rigorously in several ways), we've described something that exceeds the capacity of the entire observable universe as measured in the smallest units we believe to be measurable.  The counting numbers (more formally, the natural numbers) are infinite, but they can be defined not only by finite means, but fairly concisely.

There are levels of infinity beyond the natural numbers.  Infinitely many, in fact.  Again, there are several ways to define these larger infinities, but one way to define the most prominent of them, based on the real numbers, involves the concept of continuity or, more precisely, completeness in the sense that the real numbers contain any number that you can get arbitrarily close to.

For example, you can list fractions that get arbitrarily close to the square root of two: 1.4 (14/10) is fairly close, 1.41 (141/100) is even closer, 1.414 (1414/1000) is closer still, and if I asked for a fraction within one one-millionth, or trillionth, or within 1/googol, that is, one divided by ten to the hundredth power, no problem.  Any number of libraries you can download off the web can do that for you.
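
For instance, here's a minimal sketch in Python of how you might produce such approximations yourself (the function name is mine, not from any particular library):

```python
# A minimal sketch (sqrt2_within is my own name, not a library function).
# math.isqrt(2 * 10**(2*d)) is the largest integer whose square is at
# most 2 * 10**(2*d), i.e. the first d decimal digits of the square
# root of two, scaled up to an integer.
from fractions import Fraction
from math import isqrt

def sqrt2_within(digits):
    """Return a fraction within 10**-digits of the square root of two."""
    scale = 10 ** digits
    return Fraction(isqrt(2 * scale * scale), scale)

for d in (1, 2, 3, 100):
    print(sqrt2_within(d))
# 7/5, 141/100, 707/500, ... (Fraction reduces to lowest terms)
```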

Nonetheless, the square root of two is not itself the ratio of two natural numbers, that is, it is not a rational number (more or less what most people would call a fraction, but with a little more math in the definition).  The earliest widely-recognized recorded proof of this goes back to the Pythagoreans.  It's not clear exactly who else figured it out, or when, but the idea is certainly ancient.  No matter how closely you approach the square root of two with fractions, you'll never find a fraction whose square is exactly two.
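
For the record, the classic argument, essentially the one attributed to the Pythagoreans, fits in a few lines:

```latex
% Suppose, for contradiction, that \sqrt{2} = p/q with p and q natural
% numbers and p/q in lowest terms.  Squaring both sides,
p^2 = 2q^2
% so p^2 is even, which forces p itself to be even, say p = 2r.  Then
(2r)^2 = 2q^2 \quad\Longrightarrow\quad q^2 = 2r^2
% so q is even as well, contradicting the assumption that p/q was in
% lowest terms.
```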

OK, but why shouldn't the square root of two be a number?  If you draw a right triangle with legs one meter long, the hypotenuse certainly has some length, and by the Pythagorean theorem, that length squared is two.  Surely that length is a number?

Over time, there were some attempts to sweep the matter under the rug by asserting that, no, only rational numbers are really numbers and there just isn't a number that squares to two.  That triangle? Dunno, maybe its legs weren't exactly one meter long, or it's not quite a right triangle?

This is not necessarily as misguided as it might sound.  In real life, there is always uncertainty, and we only know the angles and the lengths of the sides approximately.  We can slice fractions as finely as we like, so is it really so bad to say that all numbers are rational, and therefore you can't ever actually construct a right triangle with both legs exactly the same length, even if you can get as close as you like?

Be that as it may, modern mathematics takes the view that there are more numbers than just the rationals and that if you can get arbitrarily close to some quantity, well, that's a number too.  Modern mathematics also says there's a number that squares to negative one, which has its own interesting consequences, but that's for some imaginary other post (yep, sorry, couldn't help myself).

The result of adding all these numbers-you-can-get-arbitrarily-close-to to the original rational numbers (every rational number is already arbitrarily close to itself) is called the real numbers.  It turns out (math-speak for "I'm not going to tell you why", but see the post on counting for an outline) that defining the real numbers brings in not only infinitely many more numbers, but so many more that the original rational numbers "form a set of measure zero", meaning that the chances of any particular real number being rational are zero (as usual, the actual machinery that lets you apply probabilities here is a bit more involved).

To recap, we started with the infinitely many rational numbers -- countably infinite since it turns out that you can match them up one-for-one with the natural numbers* -- and now we have an uncountably infinite set of numbers, infinitely too big to match up with the naturals.

But again we did this with a finite amount of machinery.  We started with the rule "There is always one more counting number", snuck in some rules about fractions and division, and then added "if you can get arbitrarily close to something with rational numbers, then that something is a number, too".  More concisely, limits always exist (with a few stipulations, since this is math).
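
To make "limits always exist" a little more concrete, here is one standard formulation (Cauchy completeness), out of several equivalent ones:

```latex
% A sequence (x_n) of numbers is Cauchy if its terms eventually stay
% arbitrarily close to one another:
\forall \varepsilon > 0 \;\; \exists N \;\; \forall m, n > N : \; |x_m - x_n| < \varepsilon
% Completeness says every Cauchy sequence of reals converges to a real:
\exists x \in \mathbb{R} : \lim_{n \to \infty} x_n = x
% The rationals fail this test: the sequence 1.4, 1.41, 1.414, ... is
% Cauchy but has no rational limit.
```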

One might ask at this point how real any of this is.  In the real world we can only measure uncertainly, and as a result we can generally get by with only a small portion of even the rational numbers, say just those with a hundred decimal digits or fewer, and for most purposes probably those with just a few digits (a while ago I discussed just how tiny a set like this is).  By definition anything we, or all of the civilizations in the observable universe, can do is literally as nothing compared to infinity, so are we really dealing with an infinity of numbers, or just a finite set of rules for talking about them?


One possible reply comes from the world of quantum mechanics, a bit ironic since the whole point of quantum mechanics is that the world, or at least important aspects of it, is quantized, meaning that a given system can only take on a specific set of discrete states (though, to be fair, there are generally a countable infinity of such states, most of them vanishingly unlikely).  An atom is made of a discrete set of particles, each with an electric charge that's either 1, 0 or -1 times the charge of the electron, the particles of an atom can only have a discrete set of energies, and so forth (not everything is necessarily quantized, but that's a discussion well beyond my depth).

All of this stems from the Schrödinger equation.  The discrete nature of quantum systems comes from there being only a discrete set of solutions to that equation for a particular set of boundary conditions.  This is actually a fairly common phenomenon.  It's the same reason that you can only get a certain set of tones by blowing over the opening of a bottle (at least in theory).
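
The standard textbook illustration is a particle confined to a one-dimensional box: requiring the wave function to vanish at the walls (the boundary conditions) leaves only a discrete set of solutions, just as the bottle can only sound certain tones.  Sketched in the usual notation:

```latex
% Time-independent Schrödinger equation for a particle in a box of
% width L, with the wave function pinned to zero at the walls:
-\frac{\hbar^2}{2m} \frac{d^2 \psi}{dx^2} = E\psi,
\qquad \psi(0) = \psi(L) = 0
% Solutions exist only for a discrete set of energies, one per
% whole number n, like the discrete tones of the bottle:
\psi_n(x) = \sqrt{\frac{2}{L}} \sin\!\left(\frac{n\pi x}{L}\right),
\qquad E_n = \frac{n^2\pi^2\hbar^2}{2mL^2},
\qquad n = 1, 2, 3, \ldots
```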

The equation itself is a partial differential equation defined over the complex numbers, which have the same completeness property as the real numbers (in fact, a complex number can be expressed as a pair of real numbers).  This is not an incidental feature, but a fundamental part of the definition in at least two ways: Differential equations, including the Schrödinger equation, are defined in terms of limits, and this only works for numbers like the reals or the complex numbers where the limits in question are guaranteed to exist.  Also, it includes π, which is not just irrational but transcendental (it isn't the root of any polynomial with rational coefficients), which more or less means it can only be defined as a limit of an infinite sequence.
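
For reference, here is the equation in its one-dimensional, time-dependent form.  The i out front is where the complex numbers come in, and ħ, defined as h divided by 2π, is where π comes in:

```latex
i\hbar \frac{\partial}{\partial t}\Psi(x,t)
  = -\frac{\hbar^2}{2m}\frac{\partial^2}{\partial x^2}\Psi(x,t)
  + V(x,t)\,\Psi(x,t)
```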

In other words, the discrete world of quantum mechanics, our best attempt so far at describing the behavior of the world under most conditions, depends critically on the kind of continuous mathematics in which infinities, both countable and uncountable, are a fundamental part of the landscape.  If you can't describe the real world without such infinities, then they must, in some sense, be real.


Of course, it's not actually that simple.

When I said "differential equations are defined in terms of limits", I should have said "differential equations can be defined in terms of limits."  One facet of modern mathematics is the tendency to find multiple ways of expressing the same concept.  There are, for example, several different but equivalent ways of expressing the completeness of the real numbers, and several different ways of defining differential equations.

One common technique in modern mathematics (a technique is a trick you use more than once) is to start with one way of defining a concept, find some interesting properties, and then switch perspective and say that those interesting properties are the actual definition.

For example, if you start with the usual definition of the natural numbers: zero and an "add one" operation to give you the next number, you can define addition in terms of adding one repeatedly -- adding three is the same as adding one three times, because three is the result of adding one to zero three times.  You can then prove that addition gives the same result no matter what order you add numbers in (the commutative property).  You can also prove that adding two numbers and then adding a third one is the same as adding the first number to the sum of the other two (the associative property).
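
Here's a toy sketch of that construction in Python (the encoding is my own; any representation of "zero" and "add one" would do):

```python
# A natural number is just "how many times succ has been applied to
# ZERO"; add replays b's successors on top of a.
ZERO = None

def succ(n):
    """The 'add one' operation: the next number after n."""
    return (n,)

def add(a, b):
    # a + 0 = a; otherwise peel one succ off b and recurse
    return a if b is ZERO else succ(add(a, b[0]))

def to_int(n):  # only for display
    return 0 if n is ZERO else 1 + to_int(n[0])

two = succ(succ(ZERO))
three = succ(two)
print(to_int(add(two, three)), to_int(add(three, two)))  # 5 5
```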

Then you can turn around and say "Addition is an operation that's commutative and associative, with a special number 0 such that adding 0 to a number always gives you that number back."  Suddenly you have a more powerful definition of addition that can apply not just to natural numbers, but to the reals, the complex numbers, the finite set of numbers on a clock face, rotations of a two-dimensional object and all sorts of other things (re-orderings of a (finite or infinite) list of items come close, too, though for those you have to give up commutativity).  The original objects that were used to define addition -- the natural numbers 0, 1, 2 ... -- are no longer needed.  The new definition works for them, too, of course, but they're no longer essential to the definition.
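
To see the abstract definition standing on its own, here's a quick brute-force check (a toy example of my own) that clock-face addition satisfies all three properties, with no counting numbers in sight:

```python
# "Clock face" addition is just arithmetic mod 12.
def clock_add(a, b):
    return (a + b) % 12

hours = range(12)
assert all(clock_add(x, 0) == x for x in hours)              # identity
assert all(clock_add(x, y) == clock_add(y, x)
           for x in hours for y in hours)                    # commutative
assert all(clock_add(clock_add(x, y), z) == clock_add(x, clock_add(y, z))
           for x in hours for y in hours for z in hours)     # associative
print("clock addition is an 'addition' in the abstract sense")
```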

You can do the same thing with a system like quantum mechanics.  Instead of saying that the behavior of particles is defined by the Schrödinger equation, you can say that quantum particles behave according to such-and-such rules, which are compatible with the Schrödinger equation the same way the more abstract definition of addition in terms of properties is compatible with the natural numbers.

This has been done, or at least attempted, in a few different ways (of course).  The catch is that these more abstract systems depend on the notion of a Hilbert space, which has even more and hairier infinities in it than the real numbers as described above.


How did we get from "there is always one more number" to "more and hairier infinities"?

The question that got us here was "Are we really dealing with an infinity of numbers, or just a finite set of rules for talking about them?"  In some sense, it has to be the latter -- as finite beings, we can only deal with a finite set of rules and try to figure out their consequences.  But that doesn't tell us anything one way or another about what the world is "really" like.

So then the question becomes something more like "Is the behavior of the real world best described by rules that imply things like infinities and limits?"  The best guess right now is "yes", but maybe the jury is still out.  Maybe we can define a more abstract version of quantum physics that doesn't require infinities, in the same way that defining addition doesn't require defining the natural numbers.  Then the question is whether that version is in some way "better" than the usual definition.

It's also possible that, as well-tested as quantum field theory is, there's some discrepancy between it and the real world that's best explained by assuming that the world isn't continuous and therefore the equations to describe it should be based on a discrete number system.  I haven't the foggiest idea how that could happen, but I don't see any fundamental logical reason to rule it out.

For now, however, it looks like the world is best described by differential equations like the Schrödinger equation, which is built on the complex numbers, which in turn are derived from the reals, with all their limits and infinities.  The (provisional) verdict then: the real numbers are real.


* One crude way to see that the rational numbers are countable is to note that there are no more rational numbers than there are pairs of numerator and denominator, each a natural number.  If you can count the pairs of natural numbers, you can count the rational numbers, by leaving out the pairs that have zero as the denominator and the pairs that aren't in lowest terms.  There will still be infinitely many rational numbers, even though you're leaving out an infinite number of (numerator, denominator) pairs, which is just a fun fact of dealing in infinities.  One way to count the pairs of natural numbers is to put them in a grid and count along the diagonals: (0,0), (1,0), (0,1), (2,0), (1,1), (0,2), (3,0), (2,1), (1,2), (0,3) ... This gets every pair exactly once.

All of this is ignoring negative rational numbers like -5/42 or whatever, but if you like you can weave all those into the list by inserting a pair with a negative numerator after any pair with a non-zero numerator: (0,0), (1,0), (-1,0), (0,1), (2,0), (-2,0), (1,1), (-1,1), (0,2), (3,0), (-3,0), (2,1), (-2,1), (1,2), (-1,2), (0,3) ... Putting it all together, leaving out the zero denominators and the pairs not in lowest terms, you get (0,1), (1,1), (-1,1), (2,1), (-2,1), (1,2), (-1,2), (3,1), (-3,1), (1,3), (-1,3) ...
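
Here's the whole footnote as a pair of Python generators (the names are mine); running it reproduces the lists above:

```python
# pairs() walks the grid of natural-number pairs diagonal by diagonal;
# rationals() skips zero denominators and pairs not in lowest terms,
# weaving in a negative copy after each pair with a non-zero numerator.
from fractions import Fraction
from itertools import count, islice
from math import gcd

def pairs():
    for total in count(0):          # one diagonal per total
        for den in range(total + 1):
            yield (total - den, den)

def rationals():
    for num, den in pairs():
        if den == 0 or gcd(num, den) != 1:
            continue
        yield Fraction(num, den)
        if num != 0:
            yield Fraction(-num, den)

print(list(islice(pairs(), 10)))
# [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2), (3, 0), ...]
print([str(q) for q in islice(rationals(), 11)])
# ['0', '1', '-1', '2', '-2', '1/2', '-1/2', '3', '-3', '1/3', '-1/3']
```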

Another, much more interesting way of counting the rational numbers is via the Farey Sequence.

1 comment:

  1. Nice. An example of "either X, or not X" where neither is comprehensible (e.g., either time goes on forever, or it doesn't).

    This proves that reality has little to do with what we can comprehend.
