Mark Chu-Carroll has written this post discussing this silliness about “spooky” patterns in the digits of π, and some other number derived from π. I’d recommend reading Mark CC’s piece, including the comments, which contain a discussion of what sounds like a non-silly use of this sort of pattern spotting that (it is claimed) has historically been used in mystical traditions – namely, seeing what the patterns your brain pulls out tell you about your brain. The guy being discussed doesn’t do that. He thinks that the patterns were left by a (“Pythagorean”) god to tell us stuff. As one commenter pointed out, this is the sort of thinking that leads to murdering John Lennon.

Anyway, I thought it might be a good time to talk about the mathematics of patterns showing up everywhere, and how it is way cooler than any supernatural pseudo-explanation. One kind of “patterns are inevitable” result is Ramsey theory. To take one of the simplest examples: draw six points on a piece of paper (arranged as a regular hexagon, say), and then take a red pen and a blue pen. Now draw lines (each with one of the two pens, changing pen whenever you wish) joining every pair of dots. Amongst your lines there must be a triangle all of one colour. Furthermore, for *any* picture we want to find amongst coloured lines joining dots (with any finite number of colours), we only need to insist that there be more than some given number of dots, and we can be sure of finding it. There are many similar results absolutely guaranteeing structure in a large enough finite set.
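If you don’t feel like drawing all 32768 possible pictures by hand, the six-dot claim is small enough to check exhaustively by computer. Here is a quick brute-force sketch in Python (an illustration of the claim, not part of its proof):

```python
from itertools import combinations, product

# The 15 edges of the complete graph K6 on the six dots 0..5.
edges = list(combinations(range(6), 2))

def has_mono_triangle(colouring):
    """True if some triangle has all three edges the same colour."""
    colour = dict(zip(edges, colouring))
    return any(
        colour[(a, b)] == colour[(a, c)] == colour[(b, c)]
        for a, b, c in combinations(range(6), 3)
    )

# Try every one of the 2^15 = 32768 red/blue colourings.
assert all(has_mono_triangle(col) for col in product("RB", repeat=15))
print("Every 2-colouring of K6 contains a monochromatic triangle.")
```

With five dots the claim fails (colour the pentagon red and the pentagram blue), so six is the smallest number that works.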

OK, well that’s combinatorics, something I don’t actually know too much about. Also, it’s not obviously related to patterns in π. So let’s talk about measure theory and, since it’s more intuitive, let’s disguise it as probability. Consider the uniform distribution on an interval. We claim the following.

**1. Fix a finite string of digits (in a fixed base n) and choose a number at random from our interval. The probability that the number we choose contains our string (in its base n expansion) is 1.**
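You can watch this probability climb towards 1 by simulation. The base-n digits of a uniform random number are independent and uniformly distributed, so we can just generate random digit strings directly. A quick Monte Carlo sketch (the target string “271” and base 10 are arbitrary choices):

```python
import random

random.seed(0)
target = "271"  # any fixed finite digit string will do

def hits(num_digits, trials=1000):
    """Fraction of uniform random numbers whose first num_digits
    base-10 digits contain the target string."""
    found = 0
    for _ in range(trials):
        s = "".join(random.choices("0123456789", k=num_digits))
        if target in s:
            found += 1
    return found / trials

# The fraction climbs towards 1 as we look at more digits.
for n in (100, 1000, 10000):
    print(n, hits(n))
```

Of course, no finite simulation proves a probability is exactly 1; the point is that the chance of *missing* the string decays geometrically with the number of digits examined.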

We can “bootstrap” this result up to the following infinitely awesome fact.

**2. Choose a number at random from an interval. With probability 1, its expansion in *every* base will contain *every* finite string of characters in that base *infinitely many times*.**

There are sketches of the proofs of these at the bottom of the page together with a note about what “probability 1” implies.

The stronger property of normality (roughly, that every string of a given length appears with the same asymptotic frequency) also holds with probability 1.
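For single digits in base 10, normality says each of 0–9 should appear with frequency 1/10. Again using the fact that a uniform random number has i.i.d. uniform digits, a quick sanity check (an illustration, not a proof) looks like this:

```python
import random
from collections import Counter

random.seed(1)

# Digits of a uniform random number in [0,1) are i.i.d. uniform,
# so sample them directly rather than via floating point.
N = 100_000
freq = Counter(random.choices(range(10), k=N))

for d in range(10):
    print(d, freq[d] / N)  # each frequency hovers around 0.1
```

The same idea applies to strings of length k in base n, where the limiting frequency of each string is n^(-k).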

**Why this is unimaginably cool**