Flipping Wyoming

State border signs for the dozen states from MA to CA on I-80. (There is no "Welcome to Nebraska" sign, so I made do with "Welcome to Omaha.")

Last week I spent five days in the driver’s seat, crossing the country from east to west, mostly on Interstate 80. I’ve made the trip before, though never on this route. In particular, the 900-mile stretch from Lincoln, Nebraska, across the southern tier of Wyoming, and down to Salt Lake City was new to me.

Driving is a task that engages only a part of one’s neural network, so the rest of the mind is free to wander. On this occasion my thoughts took a political turn. After all, I was boring through the bright red heart of America. Especially in Wyoming.

Based on the party affiliations of registered voters, Wyoming is far and away the most Republican state in the union, with the party claiming the allegiance of two-thirds of the electorate. The Democrats have 18 percent. A 2013 Gallup poll identified Wyoming as the most “conservative” state, with just over half those surveyed preferring that label to “moderate” or “liberal.”

The other singular distinction of Wyoming is that it has the smallest population of all the states, estimated at 579,000. The entire state has fewer people than many U.S. cities, including Albuquerque, Milwaukee, and Baltimore. The population density is a little under six people per square mile.

I looked up these numbers while staying the night in Laramie, the state’s college town, and I was mulling them over as I continued west the next morning, climbing through miles of rolling grassland and sagebrush with scarcely any sign of human habitation. A mischievous thought came upon me. What would it take to flip Wyoming? If we could somehow induce 125,000 liberal voters to take up legal residence here, the state would change sides. We’d have two more Democrats in the Senate, and one more in the House. Berkeley, California, my destination on this road trip, has a population of about 120,000. Maybe we could persuade everyone in Berkeley to give up Chez Panisse and Moe’s Books, and build a new People’s Republic somewhere on Wyoming’s Medicine Bow River.

Let me quickly interject: This is a daydream, or maybe a nightmare, and not a serious proposal. Colonizing Wyoming for political purposes would not be a happy experience for either the immigrants or the natives. The scheme belongs in the same category as a plan announced by a former Mormon bishop to build a new city of a million people in Vermont. (Vermont has a population of about 624,000, the second smallest among U.S. states.)

Rather than trying to flip Wyoming, maybe one should try to fix it. Why is it the least populated state, and the most Republican? Why is so much of the landscape vacant? Why aren’t entrepreneurs with dreams of cryptocurrency fortunes flocking to Cheyenne or Casper with their plans for startup companies?

The experience of driving through the state on I-80 suggests some answers to these questions. I found myself wondering how even the existing population of a few hundred thousand manages to sustain itself. Wikipedia says there’s some agriculture in the state (beef, hay, sugar beets), but I saw little evidence of it. There’s tourism, but that’s mostly in the northwest corner, focused on Yellowstone and Grand Teton national parks and the cowboy-chic enclave of Jackson Hole. The only conspicuous economic activity along the I-80 corridor is connected with the mining and energy industries. My very first experience of Wyoming was olfactory: Coming downhill from Pine Bluffs, Nebraska, I caught a whiff of the Frontier oil refinery in Cheyenne; as I got closer to town, I watched the sun set behind a low-hanging purple haze that might also be refinery-related. The next day, halfway across the state, the Sinclair refinery announced itself in a similar way.

Sinclair refinery in Sinclair, Wyoming

Still farther west, coal takes over where oil leaves off. The Jim Bridger power plant, whose stacks and cooling-tower plumes are visible from the highway, burns locally mined coal and exports the electricity.

Jim Bridger power plant 5582

As the author of a book celebrating industrial artifacts, I’m hardly the one to gripe about the presence of such infrastructure. On the other hand, oil and coal are not much of a foundation for a modern economy. Even with all the wells, the pipelines, the refineries, the mines, and the power plants, Wyoming employment in the “extractive” sector is only about 24,000 (or 7 percent of the state’s workforce), down sharply from a peak of 39,000 in 2008. If this is the industry that will build the state’s future, then the future looks bleak.

Economists going all the way back to Adam Smith have puzzled over the question: Why do some places prosper while others languish? Why, for example, are Denver and Boulder so much livelier than Cheyenne and Laramie? The Colorado cities and the Wyoming ones are only about 100 miles apart, and they share similar histories and physical environments. But Denver is booming, with a diverse and growing economy and a population approaching 700,000—greater than the entire state of Wyoming. Cheyenne remains a tenth the size of Denver, and in Cheyenne you don’t have to fight off hordes of hipsters to book a table for dinner. What makes the difference? I suspect the answer lies in a Yogi Berra phenomenon. Everybody wants to go to Denver because everyone is there already. Nobody wants to be in Cheyenne because it’s so lonely. If this guess is correct, maybe we’d be doing Wyoming a favor by bringing in that invasion of 125,000 sandal-and-hoodie–clad bicoastals.

sign at the continental divide, elevation 7000

One more Wyoming story. At the midpoint of my journey across the state, near milepost 205 on I-80, I passed the sign shown at left. I am an aficionado of continental divide crossings, and so I took particular note. Then, 50 miles farther along, I passed another sign, shown at right.

continental divide sign at elevation 6930

On seeing this second crossing, I put myself on high alert for a third such sign. This is a matter of simple topology, or so I thought. If a line—perhaps a very wiggly one—divides an area into two regions, then if you start in one region and end up in the other, you must have crossed the line an odd number of times. Shown below are some possible configurations.

three possible ways of crossing a wiggly continental divide

In each case the red line is the path of the continental divide, and the dashed blue line is the road’s trajectory across it. At far left the situation is simple: The road intersects the divide in a single point. The middle diagram shows three crossings; it’s easy to see how further elaboration of the meandering path could yield five or seven or any odd number of crossings. An arrangement that might seem to generate just two crossings is shown at right. One of the “crossings” is not a crossing at all but a point of tangency. Depending on your taste in such matters, the tangent intersection could be counted as crossing the divide twice or not at all; in either case, the total number of crossings remains odd.
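The parity claim is easy to check numerically. In this little sketch of my own (not from any of the diagrams above), the curve y = sin(x) stands in for a wiggly divide, a road is a sequence of sample points, and crossings are counted as sign changes from one side of the divide to the other:

```python
import math

def side(x, y):
    """Which side of the 'divide' y = sin(x) a point lies on: +1 above, -1 below."""
    return 1 if y > math.sin(x) else -1

def crossings(path):
    """Count sign changes along a sampled road. (This counts true crossings
    only if each segment crosses the divide at most once, as here.)"""
    signs = [side(x, y) for x, y in path]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

# A straight road from below the divide to above it: one crossing.
straight = [(0.0, -2.0), (0.0, 2.0)]

# A meandering road that dips back across the divide before ending above it.
meander = [(0.0, -2.0), (1.0, 2.0), (2.0, -2.0), (3.0, 2.0)]

for road in (straight, meander):
    # Parity is odd whenever the endpoints lie on opposite sides.
    print(crossings(road), "crossing(s)")
```

A point of tangency produces no sign change at all, which matches the argument above: however the road wiggles, endpoints on opposite sides force an odd count.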

In the remainder of my trip I never saw a sign marking a third crossing of the divide. The explanation has nothing to do with points of tangency. I should have known that, because I’ve actually written about this peculiarity of Wyoming topography before. Can you guess what’s happening? Wikipedia tells all.

Posted in mathematics, modern life, social science, technology | Leave a comment

Sir Roger Penrose’s Toilet Paper

Penrose Tiling Rhombi Wikimedia
Twenty years ago, Kimberly-Clark, the Kleenex company, introduced a line of toilet paper embossed with the kite-and-dart aperiodic tiling discovered by Roger Penrose. When I first heard about this, I thought: How clever. Because the pattern never repeats, the creases in successive layers of a roll would never line up over any extended region, and so the sheets would be less likely to stick together.

Sir Roger Penrose had a different response. Apparently he believes the pattern is subject to copyright protection, and he also managed to get a patent issued in 1979, although that would have expired about the time of the toilet paper scandal. Penrose assigned his rights to a British company called Pentaplex Ltd. An article in the Times of London quoted a representative of Pentaplex:

So often we read of very large companies riding roughshod over small businesses or individuals, but when it comes to the population of Great Britain being invited by a multinational [company] to wipe their bottoms on what appears to be the work of a knight of the realm without his permission, then a last stand must be made.

Sir Roger sued. I haven’t been able to find a documented account of how the legal action was resolved, but it seems Kimberly-Clark quickly withdrew the product.

Some years ago I was given a small sample of the infamous Penrose toilet paper. It came to me from Phil and Phylis Morrison; a note from Phylis indicates that they acquired it from Marion Walter. Now I would like to pass this treasure on to a new custodian. The specimen is unused though not pristine, roughly a foot long, and accompanied by a photocopy of the abovementioned Times news item. In the photograph below I have boosted the contrast to make the raised ridges more visible; in real life the pattern is subtle.

Penrose tiled toilet paper  enhanced

Are you interested in artifacts with unusual symmetries? Would you like to add this object to your collection? Send a note with a U.S. mailing address to brian@bit-player.org. If I get multiple requests, I’ll figure out some Solomonic procedure for choosing the recipient(s). If there are no takers, I guess I’ll use it for its intended purpose.

I must also note that my hypothesis about the special non-nesting property of the embossed paper is totally bogus. In the first place, a roll of toilet paper is an Archimedean spiral, so that the circumference increases from one layer to the next; even a perfectly regular pattern will come into coincidence with itself only when the circumference equals an integer multiple of the pattern period. Second, the texture imprinted on the toilet paper is surely not a real aperiodic tiling. The manufacturing process would have involved passing the sheet between a pair of steel crimping cylinders bearing the incised network of kites and darts. Those cylinders are necessarily of finite diameter, and so the pattern must in fact repeat. If Kimberly-Clark had contested the lawsuit, they might have used that point in their defense.
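The spiral point is easy to quantify. In the sketch below the roll dimensions and pattern period are invented for illustration; each wrap's circumference grows by 2πt, where t is the sheet thickness, so even a strictly periodic pattern drifts out of register by a fixed increment per layer:

```python
import math

# Rough roll dimensions, assumed purely for illustration.
core_radius = 20.0     # mm
thickness = 0.1        # mm, sheet thickness
pattern_period = 10.0  # mm, hypothetical repeat length of an embossed pattern

# Treating each wrap as a circle of slightly larger radius approximates the
# Archimedean spiral: circumference grows by 2*pi*thickness per layer.
for layer in range(5):
    c = 2 * math.pi * (core_radius + layer * thickness)
    misfit = c % pattern_period  # offset by which the pattern fails to realign
    print(f"layer {layer}: circumference {c:.3f} mm, pattern offset {misfit:.3f} mm")
```

With these numbers the circumference grows by about 0.63 mm per wrap, so successive layers of even a repeating pattern coincide only at the rare layers where the circumference happens to hit an integer multiple of the period.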

Posted in mathematics, off-topic, uncategorized | 1 Comment

The Threats to the Net

My first glimpse of the World Wide Web came in 1993 on a visit to Fermilab, the physics playground near Chicago. Tom Nash, head of the computing division, showed me a screenful of text with a few highlighted phrases. When he selected one of the phrases, the screen went blank for a moment, and then another page of text appeared. We had just followed a hyperlink. I asked Tom what the system was good for, and he said it was great for sharing software documentation. I was so unimpressed I failed even to mention this new tool in the article I was writing about scientific computing at Fermilab.

A year later, after the Mosaic browser came on the scene, my eyes were opened. I wrote a gushing article on the marvels of the WWW.

There have long been protocols for transferring various kinds of information over the Internet, but the Web offers the first seamless interface to the entire network . . . The Web promotes the illusion that all resources are at your fingertips; the universe of information is inside the little box that sits on your desk.

I was still missing half the story. Yes, the web (which has since lost its capital W) opened up an amazing portal onto humanity’s accumulated storehouse of knowledge. But it did something else as well: It empowered all of us to put our own stories and ideas before the public. Economic and technological barriers were swept away; we could all become creators as well as consumers. Perhaps for the first time since Gutenberg, public communication became a reasonably symmetrical, two-way social process.

The miracle of the web is not just that the technology exists, but that it’s accessible to much of the world’s population. The entire software infrastructure is freely available, including the HTTP protocol that started it all, the languages for markup, styling, and scripting (HTML, CSS, JavaScript), server software (Apache, Nginx), content-management systems such as WordPress, and also editors, debuggers, and other development tools. Thanks to this community effort, I get to have my own little broadcasting station, my personal media empire.

But can it last?

In the U.S., the immediate threat to the web is the repeal of net-neutrality regulations. Under the new rules (or non-rules), Internet service providers will be allowed to set up toll booths and roadblocks, fast lanes and slow lanes. They will be able to expedite content from favored sources (perhaps their own affiliates) and impede or block other kinds of traffic. They could charge consumers extra fees for access to some sites, or collect back-channel payments from publishers who want preferential treatment. For a glimpse of what might be in store, a New York Times article looks at some recent developments in Europe. (The European Union has its own net-neutrality law, but apparently it’s not being consistently enforced.)

The loss of net neutrality has elicited much wringing of hands and gnashing of teeth. I’m as annoyed as the next netizen. But I also think it’s important to keep in mind that the web (along with the internet more generally) has always lived at the edge of the precipice. Losing net neutrality will further erode the foundations, but it is not the only threat, and probably not the worst one.

Need I point out that the internet lost its innocence a long time ago? In the early years, when the network was entirely funded by the federal government, most commercial activity was forbidden. That began to change circa 1990, when crosslinks with private-enterprise networks were put in place, and the general public found ways to get online through dial-up links. The broadening of access did not please everyone. Internet insiders recoiled at the onslaught of clueless newbies (like me); commercial network operators such as CompuServe and America Online feared that their customers would be lured away by a heavily subsidized competitor. Both sides were right about the outcome.

As late as 1994, hucksterism on the internet was still a social transgression if not a legal one. Advertising, in particular, was punished by vigorous and vocal vigilante action. But the cause was already lost. The insular, nerdy community of internet adepts was soon overwhelmed by the dot-com boom. Advertising, of course, is now the engine that drives most of the largest websites.

Commerce also intruded at a deeper level in the stack of internet technologies. When the internet first became inter—a network of networks—bits moved freely from one system to another through an arrangement called peering, in which no money changed hands. By the late 1990s, however, peering was reserved for true peers—for networks of roughly the same size. Smaller carriers, such as local ISPs, had to pay to connect to the network backbone. These pay-to-play arrangements were never affected by network neutrality rules.

Summer Street patch panel 3904

A patch panel in a “meet me room” allows independent network carriers to exchange streams of bits. Some of the data transfers are peering arrangements, made without payment, but others are cash transactions. The meet-me room is at the Summer Street internet switching center in Boston.

Express lanes and tolls are also not a novelty on the internet. Netflix, for example, pays to place disk farms full of videos at strategic internet nodes around the world, reducing both transit time and network congestion. And Google has built its own private data highways, laying thousands of miles of fiber optic cable to bypass the major backbone carriers. If you’re not Netflix or Google, and you can’t quite afford to build your own global distribution system, you can hire a content delivery network (CDN) such as Akamai or Cloudflare to do it for you. What you get for your money: speedier delivery, caching of static content near the destination, and some protection against malicious traffic. Again the network neutrality rules do not apply to CDNs, even when they are owned and run by companies that also act as telecommunications carriers and ISPs, such as AT&T.

In pointing out that there’s already a lot of money grubbing in the temple of the internet, I don’t mean to suggest that the repeal of net neutrality doesn’t matter or won’t make a difference. It’s a stupid decision. As a consumer, I dread the prospect of buying internet service the way one buys bundles of cable TV channels. As a creator of websites, I fear losing affordable access to readers. As a citizen, I denounce the reckless endangerment of a valuable civic asset. This is nothing but muddy boots trampling a cultural treasure.

Still and all, it could be worse. Most likely it will be. Here are three developments that make me uneasy about the future of the web.

Dominance. In round numbers, the web has something like a billion sites and four billion users—an extraordinarily close match of producers to consumers. For any other modern medium—television stations and their viewers, newspapers and their readers—the ratio is surely orders of magnitude larger. Yet the ratio for the web is also misleading. Three fourths of those billion web sites have no content and no audience (they are “parked” domain names), and almost all the rest are tiny. Meanwhile, Facebook gets the attention of roughly half of the four billion web users. Google and Facebook together, along with their subsidiaries such as YouTube, account for 70 percent of all internet traffic. The wealth distribution of the web is even more skewed than that of the world economy.

It’s not just the scale of the few large sites that I find intimidating. Facebook in particular seems eager not just to dominate the web but to supplant it. They make an offer to the consumer: We’ll give you a better internet, a curated experience; we’ll show you what you want to see and filter out the crap. And they make an offer to the publisher and advertiser: This is where the people are. If you want to reach them, buy a ticket and join the party.

If everyone follows the same trail to the same few destinations, net neutrality is meaningless.

Fragmentation. The web is built on open standards and a philosophy of sharing and cooperation. If I put up a public website, anyone can visit without asking my permission; they can use whatever software they please when they read my pages; they can publish links to what I’ve written, which any other web user can then follow. This crosslinked body of literature is now being shattered by the rise of apps. Facebook and Twitter and Google and other large internet properties would really prefer that you visit them not on the open web but via their own proprietary software. And no wonder: They can hold you captive in an environment where you can’t wander away to other sites; they can prevent you from blocking advertising or otherwise fiddling with what they feed you; and they can gather more information about you than they could from a generic web browser. The trouble is, when every website requires its own app, there’s no longer a web, just a sheaf of disconnected threads.

This battle seems to be lost already on mobile platforms.

Suppression. All of the challenges to the future of the web that I have mentioned so far are driven by the mere pursuit of money. Far scarier are forms of manipulation and discrimination based on noneconomic motives.

Governments have ultimate control over virtually all communications media—radio and TV, newspapers, books, movies, the telephone system, the postal service, and certainly the internet. Nations that we like to think of as enlightened have not hesitated to use that power to shape public discourse or to suppress unpopular or inconvenient opinions, particularly in times of stress. With internet technology, surveillance and censorship are far easier and more efficient than they ever were with earlier media. A number of countries (most notoriously China) have taken full advantage of those capabilities. Others could follow their example. Controls might be introduced overtly through legislation or imposed surreptitiously through hacking or by coercing service providers.

Still another avenue of suppression is inciting popular sentiment—burning down websites with tiki torches. I can’t say I’m sorry to see the Nazi site Daily Stormer hounded from the web by public outcry; no one, it seems, will register their domain name or host their content. Historically, however, this kind of intimidation has weighed most heavily on the other end of the political spectrum. It is the labor movement, racial and ethnic and religious minorities, socialists and communists and anarchists, feminists, and the LGBT community who have most often had their speech suppressed. Considering who wields power in Washington just now, a crackdown on “fake news” on the internet is hardly an outlandish possibility.

In spite of all these forebodings, I remain strangely optimistic about the web’s prospects for survival. The internet is a resilient structure, not just in its technological underpinnings but also in its social organization. Over the past 20 years, for many of us, the net has wormed its way into every aspect of daily life. It’s too big to fail now. Even if some basement command center in the White House had a big red switch that shuts down the whole network, no one would dare to throw it.

Posted in computing, modern life | 3 Comments

Sudden Deaf

My erstwhile employer, mentor, and dearest friend was Dennis Flanagan, who edited Scientific American for 37 years. He is the larger of the two aquatic specimens in the photograph below.

Dennis Flanagan in a wet suit, lying on the lawn next to the striped bass he just speared in Great South Bay.

One of the quirks of life with Dennis was that he didn’t hear well, as a result of childhood ear infections. In an unpublished memoir he lists his deafness as a major influence on his path through life. It was a hardship in school, because he missed much of what his teachers were saying. On the other hand, it kept him out of the military in World War II.

Later in life, hearing aids helped considerably, but only on one side. When we went to lunch, I learned to sit to his right, so that I could speak to the better ear. When we took someone out to lunch, the guest got the favored chair. In our monthly editorial meetings, however, he turned his deaf ear to Gerard Piel, the magazine’s co-founder and publisher. (They didn’t always get along.) In Dennis’s last years, after both of us had left the magazine, we would take long walks through Lower Manhattan, with stops in coffee shops and sojourns on park benches, and again I made sure I was the right-hand man. Dennis died in 2005. I miss him all the time.

Although I was always aware of Dennis’s hearing impairment, I never had an inkling of what his asymmetric sensory experience might feel like from inside his head. Now I have a chance to find out. A few days ago I had a sudden failure of hearing in my left ear. At the time I had no idea what was happening, so I can’t reconstruct an exact chronology, but I think the ear went from normal function to zilch in a matter of seconds or minutes. It was like somebody pulled the plug.

I have since learned that this is a rare phenomenon (5 to 20 cases per 100,000 population) but well-known to the medical community. It has a name: Sudden Sensorineural Hearing Loss. It is a malfunction of the cochlea, the inner-ear transducer between mechanical vibration and neural activity. An audiological exam confirmed that my eardrum and the delicate linkage of tiny bones in the middle ear are functioning normally, but the signal is not getting through to the brain. In most cases of SSNHL, the cause is never identified. I’m under treatment, and there’s a decent chance that at least some level of hearing will be restored.

I don’t often write about matters this personal, and I’m not doing so now to whine about my fate or to elicit sympathy. I want to record what I’m going through because I find it fascinating as well as distressing. A great deal of what we know about the human brain comes from accidents and malfunctions, and now I’m learning some interesting lessons at first hand.

The obvious first-order effect of losing an ear is cutting in half the amplitude of the received acoustic signal. This is perhaps the least disruptive aspect of the impairment, and the easiest to mitigate.

The second major effect is more disturbing: trouble locating the source of a sound. Binaural hearing is key to localization. For low-pitched sounds, with wavelengths greater than the diameter of the head, the brain detects the phase difference between waves reaching the two ears. The phase measurement can yield an angular resolution of just a few degrees. At higher frequencies and shorter wavelengths, the head effectively blocks sound, and so there is a large intensity difference between the two ears, which provides another localizing cue. This mechanism is somewhat less accurate, but you can home in on a source by turning your head to null the intensity difference.
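The magnitude of the phase (or time-difference) cue can be estimated with a toy model. The numbers below are round-figure assumptions of mine—an ear-to-ear distance of 18 cm, a speed of sound of 343 m/s, and a straight sound path with no diffraction around the head:

```python
import math

HEAD_WIDTH = 0.18       # m, assumed ear-to-ear distance
SPEED_OF_SOUND = 343.0  # m/s, at roughly room temperature

def itd(azimuth_deg):
    """Interaural time difference for a distant source, straight-path model:
    the far ear hears the wavefront later by d*sin(theta)/c seconds."""
    theta = math.radians(azimuth_deg)
    return HEAD_WIDTH * math.sin(theta) / SPEED_OF_SOUND

# A source dead ahead produces no delay; one 90 degrees to the side,
# about half a millisecond.
print(f"{itd(0) * 1e6:.0f} us at 0 degrees")
print(f"{itd(90) * 1e6:.0f} us at 90 degrees")
print(f"{(itd(3) - itd(0)) * 1e6:.0f} us shift for a 3-degree source movement")
```

Moving a frontal source by just three degrees changes the delay by a few tens of microseconds—comparable to the microsecond-scale timing differences listeners are reportedly able to detect, which is consistent with the few-degree angular resolution mentioned above.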

With just one ear, both kinds of directional guidance are lacking. This did not come as a surprise to me, but I had never thought about what it would be like to perceive nonlocalized sounds. You might imagine it would be like switching the audio system from stereophonic to monaural. In that case, you lose the illusion that the strings are on the left side of the stage and the brasses on the right; the whole orchestra is all mixed up in front of you. Nevertheless, in your head you are still localizing the sounds; they are all coming from the speakers across the room. Having one ear is not like that; it’s not just life in mono.

In my present state I can’t identify the sources of many sounds, but they don’t come from nowhere. Some of them come from everywhere. The drone of the refrigerator surrounds me; I hear it radiating from all four walls and the floor and ceiling; it’s as if I’m somehow inside the sound. And one night there was a repetitive thrub-a-dub that puzzled me so much I had to get out of bed and go searching for the cause. The search was essentially a random one: I determined it was not the heating system, and nothing in the kitchen or bathroom. Finally I discovered that the noise was rain pouring off the roof into the gutters and downspouts.

The failures of localization are most disturbing when the apparent source is not vague or unknown but rather quite definite—and wrong! My phone rings, and I reach out to my right to pick it up, but in fact it’s in my shirt pocket. While driving the other day, I heard the whoosh of a car that seemed to be passing me on the right, along the shoulder of the road. I almost veered left to make room. If I had done so, I would have run into the overtaking vehicle, which was of course actually on my left. (Urgent priority: Learn to ignore deceptive directional cues.)

In the first hour or so after this whole episode began, I did not recognize it as a loss of hearing; what I noticed instead was a distracting barrage of echoes. I was chatting with three other people in a room that has always seemed acoustically normal, but words were coming at me from all directions like high-velocity ping-pong balls. The echoes have faded a little in the days since, but I still hear double in some situations. And, interestingly, the echo often seems to be coming from the nonfunctioning ear. I have a hypothesis about what’s going on. Echoes are real, after all; sounds really do bounce off walls, so that the ears receive multiple instances of a sound separated by millisecond delays. Normally, we don’t perceive those echoes. The ears must be sensing them, but some circuitry in the brain is suppressing the perception. (Telephone systems have such circuitry too.) Based on my experience, I suspect that the suppression mechanism depends on the presence of signals from both ears.

Similar to echo suppression is noise suppression. I find I have lost the benefit of the “cocktail party effect,” whereby we select a single voice to attend to and filter out the background chatter. The truth is, I was never very good at that trick, but I’m notably worse now. A possibly related development is that I have the illusion of enhanced hearing acuity for some kinds of noise. The sound of water running from a faucet carries all through the house now. And the sound of my own chewing can be thunderous. In the past, perhaps the binaural screening process was turning down the gain on such commonplace distractions.

Even though no sounds of the outside world are reaching me from the left side of my head, that doesn’t mean the ear is silent. It seems to emit a steady hiss, which I’m told is common in this condition. Occasionally, in a very quiet room, I also hear faint chimes of pure sine tones. Do any of these signals actually originate in the affected cochlea, or are they phantoms that the brain merely attributes to that source?

The most curious interior noise is one that I’ve taken to calling the motor. In the still of the night, if I turn my head a certain way, I hear a putt-putt-putt with the rhythm of a sputtering lawn-mower engine, though very faint and voiceless. The intriguing thing is, the sound is altered by my breathing. If I hold my breath for a few seconds, the putt-putting slows and sometimes stops entirely. Then when I take a breath, the motor revs up again. Could this response indicate sensitivity to oxygen levels in the blood reaching my head? I like to imagine that the source of the noise is a single lonely neuron in the cochlea, bravely tapping out its spike train—the last little drummer boy in my left ear. But I wouldn’t be surprised to learn it comes from somewhere higher up in the auditory pathway.

One of the first manuscripts I edited at Scientific American (published in October 1973) was an article by the polymath Gerald Oster.

Oster title 1280x501

Ordinary beat tones are elementary physics: Whenever two waves combine and interfere, they create a new wave whose frequency is equal to the difference between the two original frequencies. In the case of sound waves at frequencies at few hertz apart, we perceive the beat tone as a throbbing modulation of the sound intensity. Oster asked what happens when the waves are not allowed to combine and interfere but instead are presented separately to the two ears. In certain frequency ranges it turns out that most people still hear the beats; evidently they are generated by some interference process within the auditory networks of the brain. Oster suggested that a likely site is the superior olivary nucleus. There are two of these bodies arrayed symmetrically just to the left and right of the midline in the back of the brain. They both receive signals from both ears.

Whatever the mechanism generating the binaural beats, it has to be happening somewhere inside the head. It’s a dramatic reminder that perception is not a passive process. We don’t really see and hear the world; we fabricate a model of it based on the sensations we receive—or fail to receive.

I’m hopeful that this little experiment of nature going on inside my cranium will soon end, but if it turns out to be a permanent condition, I’ll cope. As it happens, my listening skills will be put to the test over the next several months, as I’m going to be spending a lot of time in lecture halls. There’s the annual Joint Mathematics Meeting coming up in early January, then I’m spending the rest of the spring semester at the Simons Institute for the Theory of Computing in Berkeley. Lots of talks to attend. You’ll find me in the front of the room, to the left of the speaker.

My years with Dennis Flanagan offer much comfort when I consider the prospect of being half-deaf. His deficit was more severe than mine, and he put up with it from childhood. It never held him back—not from creating one of the world’s great magazines, not from leading several organizations, not from traveling the world, not from spearing a 40-pound bass while free diving in Great South Bay.

One worry I face is music—will I ever be able to enjoy it again?—but Dennis’s example again offers encouragement. We shared a great fondness for Schubert. I can’t know exactly what Dennis was hearing when we listened to a performance of the Trout Quintet together, but he got as much pleasure out of it as I did. And in his sixties he went beyond appreciation to performance. He had wanted to learn the cello, but a musician friend advised him to take up the brass instrument of the same register. He did so, and promptly learned to play a Bach suite for unaccompanied cello on the slide trombone.

Posted in biology, off-topic | 4 Comments

Approximately Yours

Today, I’m told, is Rational Approximation Day. It’s 22/7 (for those who write dates in little-endian format), which differs from π by about 0.04 percent. (The big-endians among us are welcome to approximate 1/π.)

Given the present state of life in America, what we really need is an Approximation to Rationality Day, but that may have to wait for 20/1/21. In the meantime, let us merrily fiddle with numbers, searching for ratios of integers that brazenly invade the personal space of famous irrationals.

When I was a teenager, somebody told me about the number 355/113, which is an exceptionally good approximation to π. The exact value is

\[\frac{355}{113} = 3.1415929203\ldots,\]
correct through the first six digits after the decimal point. In other words, it differs from the true value by less than one-millionth. I was intrigued, and so I set out to find an even better approximation. My search was necessarily a pencil-and-paper affair, since I had no access to any electronic or even mechanical aids to computation. The spiral-bound notebook in which I made my calculations has not survived, and I remember nothing about the outcome of the effort.
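Both figures are easy to confirm with a quick Python check (my addition, not part of the original calculation):

```python
import math

# 22/7 differs from pi by about 0.04 percent ...
rel_err_22_7 = abs(math.pi - 22 / 7) / math.pi
print(f"22/7: relative error {rel_err_22_7:.2%}")        # about 0.04%

# ... while 355/113 is within one-millionth of pi.
abs_err_355_113 = abs(math.pi - 355 / 113)
print(f"355/113: absolute error {abs_err_355_113:.1e}")  # under 1e-6
```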

A dozen years later I acquired some computing machinery: a Hewlett-Packard programmable calculator, called the HP-41C. Here is the main loop of an HP-41C program that searches for good rational approximations. Note the date at the top of the printout (written in middle-endian format). Apparently I was finishing up this program just before Approximation Day in 1981.

[Image: main loop of the HP-41C program, from the 1981 printout]

What’s that you say? You’re not fluent in the 30-year-old Hewlett-Packard dialect of reverse Polish notation? All right, here’s a program that does roughly the same thing, written in an oh-so-modern language, Julia.

function approximate(T, dmax)
    d = 1
    leastError = T
    while d <= dmax && leastError > 0
        n = Int(round(d * T))
        err = abs(T - n/d) / T
        merit = 1 / ((n + d)^2 * err)
        if err < leastError
            println("$n/$d = $(n/d)  error = $err  merit = $merit")
            leastError = err
        end
        d += 1
    end
end

The algorithm is a naive, sequential search for fractions \(n/d\) that approximate the target number \(T\). For each value of \(d\), you need to consider only one value of \(n\), namely the integer nearest to \(d \times T\). (What happens if \(d \times T\) falls halfway between two integers? That can’t happen if \(T\) is irrational.) Thus you can begin with \(d = 1\) and continue up to a specified largest denominator \(d = dmax\). The accuracy of the approximation is measured by the error term \(|T - n/d| / T\). Whenever a value of \(n/d\) yields a new minimum error, the program prints a line of results. (This version of the algorithm works correctly only for \(T \gt 1\), but it can readily be adapted to \(T \lt 1\).)
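For readers who would rather not install Julia, here is my own Python rendering of the same naive search (a translation for convenience, not code from the original post):

```python
import math

def approximate(T, dmax):
    """Naive search described above: for each denominator d, the only
    candidate numerator is the integer nearest d*T; record every
    fraction that sets a new record for relative error."""
    best = []
    least_error = math.inf
    for d in range(1, dmax + 1):
        n = round(d * T)
        err = abs(T - n / d) / T
        if err < least_error:
            least_error = err
            best.append((n, d, err))
    return best

for n, d, err in approximate(math.pi, 7):
    print(f"{n}/{d} = {n/d:.6f}  error = {err:.3e}")
```

With dmax = 7 it reproduces the first five record-setters: 3/1, 13/4, 16/5, 19/6, 22/7.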

The HP-41C has a numerical precision of 10 decimal digits, and so the closest possible approximation to π is 3.141592654. Back in 1981 I ran the program until it found a fraction equal to this value—a perfect approximation, from the program’s point of view. According to a note on the printout, that took 13 hours. The Julia program above, running on a laptop, completes the same computation in about three milliseconds. You’re welcome to take a scroll through the results, below. (The numbers are not digit-for-digit identical to those generated by the HP-41C because Julia calculates with higher precision, about 16 decimal digits.)

     3/1     = 3.0                 error = 0.045070341573315915    merit =  1.3867212410256813
    13/4     = 3.25                error = 0.03450712996224109     merit =  0.10027514940370374
    16/5     = 3.2                 error = 0.018591635655129744    merit =  0.12196741256165179
    19/6     = 3.1666666666666665  error = 0.007981306117055373    merit =  0.20046844169789904
    22/7     = 3.142857142857143   error = 0.0004024993041452083   merit =  2.9541930379680195
   179/57    = 3.1403508771929824  error = 0.00039526983405584675  merit =  0.04542368072920613
   201/64    = 3.140625            error = 0.0003080138345651019   merit =  0.04623150469956595
   223/71    = 3.140845070422535   error = 0.00023796324342470652  merit =  0.04861781754719378
   245/78    = 3.141025641025641   error = 0.0001804858353094197   merit =  0.053107007660473673
   267/85    = 3.1411764705882352  error = 0.00013247529441315622  merit =  0.060922789404334425
   289/92    = 3.141304347826087   error = 9.177070539240495e-5    merit =  0.07506646742266793
   311/99    = 3.1414141414141414  error = 5.6822320879624425e-5   merit =  0.10469195703580983
   333/106   = 3.141509433962264   error = 2.6489760736525772e-5   merit =  0.19588127575835135
   355/113   = 3.1415929203539825  error = 8.478310581938076e-8    merit = 53.85164473263654
 52518/16717 = 3.1415923909792425  error = 8.37221074104896e-8     merit =  0.00249177288308447
 52873/16830 = 3.141592394533571   error = 8.259072954625822e-8    merit =  0.0024921016732136797
 53228/16943 = 3.1415923980404887  error = 8.147444291923546e-8    merit =  0.0024926612882136163
 53583/17056 = 3.141592401500938   error = 8.03729477091334e-8     merit =  0.0024934520351304946
 53938/17169 = 3.1415924049158366  error = 7.928595172899531e-8    merit =  0.0024944743578840687
 54293/17282 = 3.141592408286078   error = 7.821317056655376e-8    merit =  0.0024957288257085445
 54648/17395 = 3.141592411612532   error = 7.715432730151448e-8    merit =  0.002497216134767719
 55003/17508 = 3.1415924148960475  error = 7.610915194012454e-8    merit =  0.0024989371196291283
 55358/17621 = 3.1415924181374497  error = 7.507738155653036e-8    merit =  0.0025008927426067996
 55713/17734 = 3.1415924213375437  error = 7.405876001006156e-8    merit =  0.0025030840968725283
 56068/17847 = 3.1415924244971145  error = 7.305303737979925e-8    merit =  0.002505512419906649
 56423/17960 = 3.1415924276169265  error = 7.20599703886498e-8     merit =  0.002508179074048983
 56778/18073 = 3.141592430697726   error = 7.107932141383905e-8    merit =  0.0025110855755419263
 57133/18186 = 3.14159243374024    error = 7.01108591937022e-8     merit =  0.002514233565685482
 57488/18299 = 3.1415924367451775  error = 6.915435783817789e-8    merit =  0.0025176248413626597
 57843/18412 = 3.1415924397132304  error = 6.820959725288218e-8    merit =  0.0025212613363967255
 58198/18525 = 3.141592442645074   error = 6.727636243231866e-8    merit =  0.002525145143834103
 58553/18638 = 3.141592445541367   error = 6.635444374259433e-8    merit =  0.0025292785028112976
 58908/18751 = 3.141592448402752   error = 6.544363663870371e-8    merit =  0.0025336638062423296
 59263/18864 = 3.141592451229856   error = 6.454374152317083e-8    merit =  0.002538303603848205
 59618/18977 = 3.1415924540232916  error = 6.365456332197522e-8    merit =  0.002543200616913158
 59973/19090 = 3.1415924567836564  error = 6.277591190862598e-8    merit =  0.002548357720152209
 60328/19203 = 3.1415924595115348  error = 6.190760125601375e-8    merit =  0.0025537779743748956
 60683/19316 = 3.1415924622074964  error = 6.10494500018427e-8     merit =  0.0025594646031786867
 61038/19429 = 3.1415924648720983  error = 6.020128088319864e-8    merit =  0.002565421015548036
 61393/19542 = 3.141592467505885   error = 5.936292059519092e-8    merit =  0.0025716508123781218
 61748/19655 = 3.141592470109387   error = 5.853420007366852e-8    merit =  0.0025781577749599853
 62103/19768 = 3.1415924726831244  error = 5.771495407114599e-8    merit =  0.002584945883912429
 62458/19881 = 3.141592475227604   error = 5.690502101544554e-8    merit =  0.002592019327133724
 62813/19994 = 3.141592477743323   error = 5.6104242868339024e-8   merit =  0.0025993825084809985
 63168/20107 = 3.1415924802307655  error = 5.531246526690591e-8    merit =  0.0026070400439016164
 63523/20220 = 3.1415924826904056  error = 5.4529537523533324e-8   merit =  0.0026149967637792084
 63878/20333 = 3.141592485122707   error = 5.375531191912607e-8    merit =  0.002623257749852838
 64233/20446 = 3.141592487528123   error = 5.2989644268538606e-8   merit =  0.0026318283126966317
 64588/20559 = 3.141592489907097   error = 5.22323933551431e-8     merit =  0.0026407140236596287
 64943/20672 = 3.1415924922600618  error = 5.148342135490336e-8    merit =  0.002649920699086574
 65298/20785 = 3.1415924945874427  error = 5.0742592988226976e-8   merit =  0.002659454449139831
 65653/20898 = 3.1415924968896545  error = 5.0009776226755164e-8   merit =  0.0026693216486930156
 66008/21011 = 3.141592499167103   error = 4.928484186928889e-8    merit =  0.002679528965991537
 66363/21124 = 3.1415925014201855  error = 4.8567663400430846e-8   merit =  0.0026900833784673454
 66718/21237 = 3.1415925036492913  error = 4.7858116990585446e-8   merit =  0.0027009921818650063
 67073/21350 = 3.141592505854801   error = 4.715608149595883e-8    merit =  0.0027122629998437182
 67428/21463 = 3.1415925080370872  error = 4.6461438175842924e-8   merit =  0.002723903810648984
 67783/21576 = 3.1415925101965145  error = 4.577407111668933e-8    merit =  0.002735922933992634
 68138/21689 = 3.1415925123334407  error = 4.5093866383961494e-8   merit =  0.0027483290931549346
 68493/21802 = 3.1415925144482157  error = 4.442071258756658e-8    merit =  0.002761131395876878
 68848/21915 = 3.141592516541182   error = 4.375450074049751e-8    merit =  0.002774339356802981
 69203/22028 = 3.1415925186126747  error = 4.309512411747499e-8    merit =  0.0027879629217230834
 69558/22141 = 3.1415925206630235  error = 4.244247783087354e-8    merit =  0.002802012512429091
 69913/22254 = 3.14159252269255    error = 4.179645953751142e-8    merit =  0.0028164989998024
 70268/22367 = 3.1415925247015695  error = 4.115696873186072e-8    merit =  0.0028314337694556623
 70623/22480 = 3.1415925266903915  error = 4.0523907028763286e-8   merit =  0.002846828724926181
 70978/22593 = 3.141592528659319   error = 3.989717788071482e-8    merit =  0.00286269633032941
 71333/22706 = 3.1415925306086496  error = 3.9276686719222797e-8   merit =  0.0028790496258831624
 71688/22819 = 3.1415925325386738  error = 3.86623409548065e-8     merit =  0.0028959022542887716
 72043/22932 = 3.141592534449677   error = 3.805404969428105e-8    merit =  0.0029132685103826087
 72398/23045 = 3.1415925363419395  error = 3.7451723882115376e-8   merit =  0.0029311633622333107
 72753/23158 = 3.1415925382157353  error = 3.685527615907423e-8    merit =  0.002949602495467867
 73108/23271 = 3.1415925400713336  error = 3.626462086221821e-8    merit =  0.002968602349703417
 73463/23384 = 3.1415925419089974  error = 3.567967430761971e-8    merit =  0.002988180133716996
 73818/23497 = 3.141592543728987   error = 3.510035365949903e-8    merit =  0.003008353961046636
 74173/23610 = 3.1415925455315543  error = 3.452657862652023e-8    merit =  0.003029142753805288
 74528/23723 = 3.1415925473169497  error = 3.395826962413729e-8    merit =  0.0030505664465106676
 74883/23836 = 3.141592549085417   error = 3.339534904681598e-8    merit =  0.0030726459300795604
 75238/23949 = 3.1415925508371956  error = 3.283774056124397e-8    merit =  0.003095403169820992
 75593/24062 = 3.141592552572521   error = 3.228536938904675e-8    merit =  0.0031188612412389144
 75948/24175 = 3.1415925542916234  error = 3.173816202407169e-8    merit =  0.0031430444223940115
 76303/24288 = 3.14159255599473    error = 3.1196046373746034e-8   merit =  0.0031679782521683033
 76658/24401 = 3.141592557682062   error = 3.065895190043484e-8    merit =  0.0031936895918127546
 77013/24514 = 3.1415925593538385  error = 3.01268089146511e-8     merit =  0.0032202067806171002
 77368/24627 = 3.141592561010273   error = 2.9599549423203633e-8   merit =  0.003247559639023363
 77723/24740 = 3.1415925626515766  error = 2.9077106281049175e-8   merit =  0.0032757796556622983
 78078/24853 = 3.1415925642779543  error = 2.8559414180798277e-8   merit =  0.0033048999843237645
 78433/24966 = 3.14159256588961    error = 2.804640823913544e-8    merit =  0.003334955716987436
 78788/25079 = 3.1415925674867418  error = 2.753802541039899e-8    merit =  0.0033659838476231357
 79143/25192 = 3.141592569069546   error = 2.703420321435919e-8    merit =  0.003398023556100075
 79498/25305 = 3.1415925706382137  error = 2.6534880725724155e-8   merit =  0.0034311162371627422
 79853/25418 = 3.141592572192934   error = 2.6039997867349902e-8   merit =  0.0034653057466538235
 80208/25531 = 3.141592573733892   error = 2.554949569295635e-8    merit =  0.0035006385417218717
 80563/25644 = 3.14159257526127    error = 2.5063316245769302e-8   merit =  0.0035371638899188347
 80918/25757 = 3.1415925767752455  error = 2.4581402841236452e-8   merit =  0.0035749340371894456
 81273/25870 = 3.1415925782759953  error = 2.410369936023742e-8    merit =  0.003614004535709633
 81628/25983 = 3.1415925797636914  error = 2.3630150955873712e-8   merit =  0.00365443439340209
 81983/26096 = 3.141592581238504   error = 2.3160703488036753e-8   merit =  0.00369628643041249
 82338/26209 = 3.141592582700599   error = 2.2695304230197833e-8   merit =  0.003739627468693587
 82693/26322 = 3.1415925841501404  error = 2.2233900879902193e-8   merit =  0.0037845288174018898
 83048/26435 = 3.1415925855872895  error = 2.1776442124200985e-8   merit =  0.0038310665494126084
 83403/26548 = 3.1415925870122043  error = 2.1322877639651253e-8   merit =  0.00387932189896066
 83758/26661 = 3.1415925884250404  error = 2.087315795095796e-8    merit =  0.003929381726572982
 84113/26774 = 3.1415925898259505  error = 2.0427234430973973e-8   merit =  0.003981339007706688
 84468/26887 = 3.1415925912150855  error = 1.9985059017984126e-8   merit =  0.004035293430477111
 84823/27000 = 3.1415925925925925  error = 1.9546584922495102e-8   merit =  0.004091351857390988
 85178/27113 = 3.1415925939586176  error = 1.9111765637729565e-8   merit =  0.004149629190123568
 85533/27226 = 3.1415925953133033  error = 1.868055578777407e-8    merit =  0.004210248941258058
 85888/27339 = 3.141592596656791   error = 1.825291042078912e-8    merit =  0.004273344214343279
 86243/27452 = 3.1415925979892174  error = 1.78287858571571e-8     merit =  0.004339058439193095
 86598/27565 = 3.1415925993107203  error = 1.7408138417260385e-8   merit =  0.004407546707464268
 86953/27678 = 3.1415926006214323  error = 1.6990925835061217e-8   merit =  0.004478976601684539
 87308/27791 = 3.1415926019214853  error = 1.6577106127237806e-8   merit =  0.004553529781140699
 87663/27904 = 3.1415926032110093  error = 1.6166637875900305e-8   merit =  0.004631403402447433
 88018/28017 = 3.141592604490131   error = 1.5759480794022753e-8   merit =  0.0047128116308472546
 88373/28130 = 3.141592605758976   error = 1.5355594877295166e-8   merit =  0.004797987771392931
 88728/28243 = 3.1415926070176683  error = 1.4954940686839493e-8   merit =  0.0048871863549194705
 89083/28356 = 3.141592608266328   error = 1.4557479914641577e-8   merit =  0.004980685405908598
 89438/28469 = 3.141592609505076   error = 1.4163174252687263e-8   merit =  0.005078789613658918
 89793/28582 = 3.1415926107340284  error = 1.3771986523826276e-8   merit =  0.005181833172630217
 90148/28695 = 3.141592611953302   error = 1.338387969226633e-8    merit =  0.005290183824183623
 90503/28808 = 3.1415926131630103  error = 1.2998817570363058e-8   merit =  0.005404246870669908
 90858/28921 = 3.1415926143632653  error = 1.2616764535904027e-8   merit =  0.005524470210563737
 91213/29034 = 3.141592615554178   error = 1.2237685249392783e-8   merit =  0.005651350205744754
 91568/29147 = 3.1415926167358563  error = 1.1861545360838771e-8   merit =  0.005785438063205309
 91923/29260 = 3.1415926179084073  error = 1.1488310802967408e-8   merit =  0.005927347979056494
 92278/29373 = 3.141592619071937   error = 1.111794779122008e-8    merit =  0.006077766389438445
 92633/29486 = 3.141592620226548   error = 1.0750423671902066e-8   merit =  0.006237462409303776
 92988/29599 = 3.1415926213723435  error = 1.0385705649960649e-8   merit =  0.006407301439430316
 93343/29712 = 3.1415926225094237  error = 1.0023761778491034e-8   merit =  0.00658826005755035
 93698/29825 = 3.1415926236378877  error = 9.664560534662385e-9    merit =  0.006781444748602359
 94053/29938 = 3.1415926247578327  error = 9.308070961075804e-9    merit =  0.006988114128701429
 94408/30051 = 3.1415926258693556  error = 8.954262241690382e-9    merit =  0.007209706348604964
 94763/30164 = 3.14159262697255    error = 8.603104549971112e-9    merit =  0.007447871540046976
 95118/30277 = 3.14159262806751    error = 8.254567918024995e-9    merit =  0.007704513406469473
 95473/30390 = 3.141592629154327   error = 7.90862336746494e-9     merit =  0.007981838717667477
 95828/30503 = 3.1415926302330917  error = 7.565241919903853e-9    merit =  0.008282421184374838
 96183/30616 = 3.1415926313038933  error = 7.224395162386583e-9    merit =  0.008609280341750632
 96538/30729 = 3.1415926323668195  error = 6.8860552473899216e-9   merit =  0.008965982432171553
 96893/30842 = 3.1415926334219573  error = 6.550194468748648e-9    merit =  0.009356770586561815
 97248/30955 = 3.141592634469391   error = 6.216785968445456e-9    merit =  0.009786732283709331
 97603/31068 = 3.1415926355092054  error = 5.885802747105052e-9    merit =  0.010262022067809991
 97958/31181 = 3.1415926365414837  error = 5.557218370784088e-9    merit =  0.010790155391967196
 98313/31294 = 3.1415926375663066  error = 5.231007112329143e-9    merit =  0.011380406450991833
 98668/31407 = 3.1415926385837554  error = 4.907143103228812e-9    merit =  0.012044356667029002
 99023/31520 = 3.1415926395939087  error = 4.585601323119603e-9    merit =  0.01279665696194468
 99378/31633 = 3.141592640596845   error = 4.266356751638026e-9    merit =  0.013656119502875172
 99733/31746 = 3.1415926415926414  error = 3.9493849338525334e-9   merit =  0.014647305857352692
100088/31859 = 3.141592642581374   error = 3.6346615561895634e-9   merit =  0.015802906908552822
100443/31972 = 3.1415926435631176  error = 3.322162870507497e-9    merit =  0.017167407267272748
100798/32085 = 3.1415926445379463  error = 3.0118652700227016e-9   merit =  0.018802933529623964
101153/32198 = 3.141592645505932   error = 2.703745854741474e-9    merit =  0.020798958087527405
101508/32311 = 3.1415926464671475  error = 2.397781441954139e-9    merit =  0.02328921472604781
101863/32424 = 3.141592647421663   error = 2.0939496970989362e-9   merit =  0.026482916558483883
102218/32537 = 3.1415926483695484  error = 1.7922284269720909e-9   merit =  0.03072676661447583
102573/32650 = 3.141592649310873   error = 1.492595579727815e-9    merit =  0.03664010548445531
102928/32763 = 3.141592650245704   error = 1.195029527594277e-9    merit =  0.04544847105306477
103283/32876 = 3.1415926511741086  error = 8.995092082315892e-10   merit =  0.05996553050516452
103638/32989 = 3.1415926520961532  error = 6.060132765838922e-10   merit =  0.0883984797913258
103993/33102 = 3.1415926530119025  error = 3.1452123574324146e-10  merit =  0.16916355170353897
104348/33215 = 3.141592653921421   error = 2.5012447443706518e-11  merit =  2.1127131430431656

The error values in the middle column of the table above shrink steadily as you read from the top of the list to the bottom. Each successive approximation is more accurate than all those above it. Does that also mean each successive approximation is better than those above it? I would say no. Any reasonable notion of “better” in this context has to take into account the size of the numerator and the denominator.

If you want an approximation of \(\pi\) accurate to seven digits, I can give you one off the top of my head: \(3141593/1000000\). But the numbers making up that ratio are themselves seven digits long. What makes \(355/113\) impressive is that it achieves seven-digit accuracy with only three digits in the numerator and the denominator. Accordingly, I would argue that a “better” approximation is one that minimizes both error and size. The rightmost column of the table, filled with numbers labeled “merit,” is meant to quantify this intuition.

When I wrote that program in 1981, I chose a strange formula for merit, one that now baffles me:

\[\frac{1}{(n + d)^2 \cdot err}.\]

Adding the numerator and denominator and then squaring the sum is an operation that makes no sense, although the formula as a whole does have the correct qualitative behavior, favoring both smaller errors and smaller values of \(n\) and \(d\). In trying to reconstruct what I had in mind back in 1981, my best guess is that I was trying to capture a geometric insight, and I flubbed it when translating math into code. On this assumption, the correct figure of merit would be:

\[\frac{1}{\sqrt{n^2 + d^2} \cdot err}.\]

To see where this formula comes from, consider a two-dimensional lattice of integers, with a ray of slope \(\pi\) drawn from the origin and going on to infinite distance.

[Image: integer lattice with a ray of slope π drawn from the origin]

Because the line’s slope is irrational, it will never pass through any point of the integer lattice, but it will have many near misses. The near-miss points, with coordinates interpreted as numerator and denominator, are the accurate approximations to \(\pi\). The diagram suggests a measure of the merit based on distances. An approximation gets better when we minimize the distance of the lattice point from the origin as well as the vertical distance from the point to the \(\pi\) line. That’s the meaning of the formula with \(\sqrt{n^2 + d^2}\) in the denominator.

Another approach to defining merit simply counts digits. The merit is the ratio of the number of correctly predicted digits in the irrational target \(T\) to the number of digits in the denominator. A problem with this scheme is that it’s rather coarse. For example, \(13/4\) and \(16/5\) both have single-digit denominators and they each get one digit of \(\pi\) correct, but \(16/5\) actually has a smaller error.

To smooth out the digit-counting criterion, and distinguish between values that differ in magnitude but have the same number of digits, we can take logarithms of the numbers. Let merit equal \(-\log(err) / \log(d)\). (The \(\log(err)\) term is negated because the error is always less than \(1\) and so its logarithm is negative.)
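As a spot check, all three measures can be recomputed for \(22/7\) in a few lines of Python (my own restatement of the formulas above):

```python
import math

n, d = 22, 7
err = abs(math.pi - n / d) / math.pi        # relative error, as in the program

merit_1981 = 1 / ((n + d) ** 2 * err)       # the puzzling 1981 formula
merit_dist = 1 / (math.hypot(n, d) * err)   # distance-based formula
merit_log = -math.log(err) / math.log(d)    # logarithmic (digit-counting) formula

print(merit_1981, merit_dist, merit_log)    # roughly 2.95, 107.6, 4.02
```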

Here’s a comparison of the three merit criteria for some selected approximations to \(\pi\):

     n/d           1981 merit                  distance merit              log merit

     3/1        1.3867212448620723            7.016316181613145         --
    13/4        0.10027514901117529           2.1306165422053285        2.4284808488226544
    16/5        0.12196741168912356           3.208700907602539         2.4760467349663537
    19/6        0.20046843839209055           6.288264070960828         2.6960388788612515
    22/7        2.954192079226498           107.61458138965322          4.017563128080901
   179/57       0.04542369572848121          13.467303354323912         1.9381258641568968
   201/64       0.04623152429195394          15.390920494844842         1.9441196398907357
   223/71       0.04861784421796857          17.956388291625093         1.9573120958787444
   245/78       0.05310704607396699          21.548988850935377         1.9785253787278367
   267/85       0.06092284944437125          26.93965209372642          2.0098618723780515
   289/92       0.07506657421887829          35.92841360228601          2.055872071177696
   311/99       0.10469219759604646          53.921550739835986         2.1273838230139175
   333/106      0.1958822412726219          108.02438852795403          2.259868093766371
   355/113     53.76883630752973          31610.90993685001             3.444107245852723
 52163/16604    0.002495514149618044        215.57611105028013          1.6757260012234105
      •                  •                         •                            •
      •                  •                         •                            •
      •                  •                         •                            •
103993/33102    0.2892417579456485        49813.04849576935             2.1538978293241056
104348/33215    0.5006051667655171        86508.24042805366             2.2065386096084607
208341/66317    0.3403602724772912       117433.39822796892             2.1589243556399245
312689/99532    0.6343809166515098       328504.0552596196              2.207421489352196

All three measures agree that \(22/7\) and \(355/113\) are quite special. In other respects they give quite different views of the data. My weird 1981 formula compares \((n + d)^{-2}\) with \(err^{-1}\); the asymmetry in the exponents suggests the merit will tend to zero as \(n\) and \(d\) increase, at least in the average case. The maximum of the distance-based measure, on the other hand, appears to grow without bound. And the logarithmic merit function seems to be settling on a value near 2.0. This implies that we shouldn’t expect to see many \(n/d \) approximations where the number of correct digits is greater than twice the number of digits in \(d\). The late Tom Apostol and Mamikon A. Mnatsakanian proved a closely related proposition (“Surprisingly accurate rational approximations,” Mathematics Magazine, Vol. 75, No. 4 (Oct. 2002), pp. 307-310).

The final joke on my 1981 self is that all this searching for better approximants can be neatly sidestepped by a bit of algorithmic sophistication. The magic phrase is “continued fractions.” The continued fraction for \(\pi\) begins:

\[ \pi = 3+\cfrac{1}{7+\cfrac{1}{15+\cfrac{1}{1+\cfrac{1}{292+\cfrac{1}{1 + \cdots}}}}}\]

Evaluating the successive levels of this expression yields a sequence of “convergents” that should look familiar:

\[3/1, 22/7, 333/106, 355/113, 103993/33102, 104348/33215.\]
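These convergents can be computed directly from the partial quotients of the continued fraction. A minimal Python sketch using the standard recurrence (double-precision \(\pi\) limits it to the first dozen or so terms):

```python
import math
from fractions import Fraction

def cf_terms(x, k):
    # First k partial quotients of the continued fraction of x.
    # Floating-point input, so only the early terms are trustworthy.
    terms = []
    for _ in range(k):
        a = math.floor(x)
        terms.append(a)
        frac = x - a
        if frac == 0:
            break
        x = 1 / frac
    return terms

def convergents(terms):
    # Standard recurrence: h_i = a_i*h_{i-1} + h_{i-2}, likewise for k_i.
    h_prev, k_prev, h, k = 1, 0, terms[0], 1
    out = [Fraction(h, k)]
    for a in terms[1:]:
        h_prev, k_prev, h, k = h, k, a * h + h_prev, a * k + k_prev
        out.append(Fraction(h, k))
    return out

print(convergents(cf_terms(math.pi, 6)))
```

The six partial quotients \(3, 7, 15, 1, 292, 1\) yield exactly the six convergents listed above.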

It is a series of “best” approximations to \(\pi\), generated without bothering with all the intervening non-“best” values. I produced this list in CoCalc (a.k.a. SageMathCloud), following the excellent tutorial in William Stein’s Elementary Number Theory. Even much larger approximants gush forth from the algorithm in milliseconds. Here’s the 100th element of the series:

[Image: the 100th convergent]
A question remains: In what sense are these approximations “best”? It’s guaranteed that every element of the series is more accurate than all those that came before, but it’s not clear to me that they also satisfy any sort of compactness criterion. But that’s a question to be taken up another day. Perhaps on Continued Fraction Day.

Posted in computing, mathematics | 10 Comments