A great deal of time has been spent working out the facts about pi. At one point, the "facts" were wrong. And a mathematician caught the error by using something that, well, might also be wrong.

Calculating the value of pi has proved a headache for many a mathematician. Today we leave it up to computers, as well we should: it's a detailed, laborious process that can take, and has taken, years to work through. That isn't to say no one tried. For hundreds of years, people worked pi out to greater and greater accuracy. Up until 1853, the record stood at 440 decimal places. William Shanks, with a great deal of work, shattered it, taking pi out to 707 decimal places.

Or did he? Another mathematician, Augustus De Morgan, took a look at Shanks' work and found it suspect. De Morgan hadn't recalculated pi. He hadn't even tried. He had simply tallied how many times each numeral appeared in the value for pi that Shanks had published. If the digits appeared at random, evenly distributed throughout pi's decimal places, the number 7 should have turned up about 61 times. It appeared a mere 44 times. It was underrepresented.
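De Morgan's tally is easy to reproduce in miniature. Here's a minimal Python sketch of the same frequency check, using a short hardcoded run of pi's digits rather than Shanks' full 707 (the digit string below is the first 60 decimals of pi; the principle is identical at any length):

```python
from collections import Counter

# First 60 decimal digits of pi, as a stand-in for Shanks' 707 places.
PI_DIGITS = "141592653589793238462643383279502884197169399375105820974944"

counts = Counter(PI_DIGITS)
expected = len(PI_DIGITS) / 10  # each digit should appear ~N/10 times

for d in "0123456789":
    print(f"digit {d}: {counts[d]:2d} times (expected ~{expected:.0f})")
```

In a sample this short the counts swing wildly, which is exactly why "7 appeared 44 times instead of 61" was suggestive rather than conclusive: a deviation like that can be bad luck, or a symptom of a miscalculation.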

The math world could have turned and said, "So what?" Some mathematicians did. Nothing had been proved to show that all the digits, 0 through 9, should appear equally often in any calculation of pi. De Morgan, however, couldn't shake the idea that Shanks was wrong. The effort to calculate pi was so herculean that neither man lived to see their dispute resolved. In 1946, a computer worked away at a calculation of pi and found that Shanks had gone wrong at the 528th decimal place. So he still beat the record, but by far less than he thought he had. What makes this interesting is that the idea that all the numerals should appear equally often in pi's decimal expansion still has not been proved. De Morgan used something that, for all we know, is wrong to cast correct doubt on something that, for all anyone knew, was right.

**[Via *Magnificent Mistakes in Mathematics*, *The Penguin Dictionary of Curious and Interesting Numbers*]**

## DISCUSSION

Isn't that the same logical fallacy we recently discussed here? Random is random; isn't there no such thing as random numbers being "equally represented"?

Disclaimer: I can barely add single-digit numbers, so I'll be bowing out of this conversation early.