If you want to tile a bathroom floor, square tiles are the simplest option — they fit together without any gaps in a grid pattern that can continue indefinitely. That square grid has a property shared by many other tilings: Shift the whole grid over by a fixed amount, and the resulting pattern is indistinguishable from the original. But to many mathematicians, such “periodic” tilings are boring. If you’ve seen one small patch, you’ve seen it all.
In the 1960s, mathematicians began to study “aperiodic” tile sets with far richer behavior. Perhaps the most famous is a pair of diamond-shaped tiles discovered in the 1970s by the polymathic physicist and future Nobel laureate Roger Penrose. Copies of these two tiles can form infinitely many different patterns that go on forever, called Penrose tilings. Yet no matter how you arrange the tiles, you’ll never get a pattern that repeats periodically.
“These are tilings that shouldn’t really exist,” said Nikolas Breuckmann, a physicist at the University of Bristol.
For over half a century, aperiodic tilings have fascinated mathematicians, hobbyists and researchers in many other fields. Now, two physicists have discovered a connection between aperiodic tilings and a seemingly unrelated branch of computer science: the study of how future quantum computers can encode information to shield it from errors. In a paper posted to the preprint server arxiv.org in November, the researchers showed how to transform Penrose tilings into an entirely new type of quantum error-correcting code. They also constructed similar codes based on two other kinds of aperiodic tiling.
At the heart of the correspondence is a simple observation: In both aperiodic tilings and quantum error-correcting codes, learning about a small part of a large system reveals nothing about the system as a whole.
“It’s one of those beautiful things that seems obvious in retrospect,” said Toby Cubitt, a quantum information researcher at University College London. “You’re like, ‘Why didn’t I think of that?’”
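To see that shared property in miniature, consider a toy example, sketched below in Python with numpy. (This is a generic illustration of local indistinguishability, not the Penrose-tiling code from the paper.) The two three-qubit states (|000⟩ + |111⟩)/√2 and (|000⟩ − |111⟩)/√2 are perfectly distinguishable as whole states, yet any single qubit, or even any pair of qubits, carries no information about which of the two you hold.

```python
# Toy illustration of "local indistinguishability" (not the Penrose-tiling code
# itself): two globally orthogonal three-qubit states whose reduced states on
# any one or two qubits are identical, so no local peek reveals which one you have.
import numpy as np

def ket(bits):
    """Computational-basis state vector for a bit string such as '000'."""
    v = np.array([1.0 + 0j])
    for b in bits:
        v = np.kron(v, np.array([1, 0], dtype=complex) if b == "0"
                       else np.array([0, 1], dtype=complex))
    return v

def reduced_state(state, keep):
    """Density matrix of the qubits listed in `keep`, tracing out the rest."""
    n = int(np.log2(state.size))
    rho = np.outer(state, state.conj()).reshape([2] * (2 * n))
    # Trace out unwanted qubits, highest index first so earlier labels stay valid.
    for q in sorted(set(range(n)) - set(keep), reverse=True):
        rho = np.trace(rho, axis1=q, axis2=q + rho.ndim // 2)
    return rho.reshape(2 ** len(keep), 2 ** len(keep))

plus = (ket("000") + ket("111")) / np.sqrt(2)    # one global state
minus = (ket("000") - ket("111")) / np.sqrt(2)   # an orthogonal one

print(abs(np.vdot(plus, minus)))                 # 0.0 -- globally distinct
for keep in ([0], [1], [0, 1], [1, 2]):
    same = np.allclose(reduced_state(plus, keep), reduced_state(minus, keep))
    print(f"qubits {keep} look identical in both states: {same}")   # True
```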
Forbidden Knowledge
Ordinary computers represent information using bits with two distinct states, labeled 0 and 1. Quantum bits, or qubits, likewise have two states, but they can also be coaxed into so-called superpositions in which their 0 and 1 states coexist. By harnessing more elaborate superpositions involving many qubits, quantum computers can perform certain computations much faster than any conventional machine.
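As a rough illustration (a numpy sketch of the amplitudes, not any particular machine's programming interface), a qubit's state can be written as two complex numbers, and a superposition simply gives both the 0 and 1 outcomes nonzero weight:

```python
# A rough numpy sketch of a qubit (an illustration, not real quantum hardware):
# the state is a pair of complex amplitudes, and the squared magnitudes give
# the probabilities of reading out 0 or 1.
import numpy as np

zero = np.array([1, 0], dtype=complex)          # the state labeled 0
one = np.array([0, 1], dtype=complex)           # the state labeled 1

superposition = (zero + one) / np.sqrt(2)       # 0 and 1 "coexist"
print(np.abs(superposition) ** 2)               # [0.5 0.5]

# Many qubits combine via the tensor product, so n qubits span a
# 2**n-dimensional space of amplitudes -- the room quantum algorithms exploit.
three_qubits = np.kron(np.kron(superposition, superposition), superposition)
print(three_qubits.size)                        # 8
```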
Yet quantum superpositions are skittish creatures. Measure a qubit in a superposition state and it will collapse to either 0 or 1, wiping out any computation in progress. To make matters worse, errors stemming from feeble interactions between qubits and their environment can mimic the destructive effects of measurement. Anything that rubs a qubit the wrong way, whether it’s a nosy researcher or a stray photon, can spoil the computation.
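Continuing the toy sketch above (again just an illustration, with a bit flip standing in for a generic environmental error), measurement collapses a superposition to a single outcome, and an unwanted interaction can corrupt the stored state without anyone measuring it at all:

```python
# Continuing the toy sketch: measuring a superposed qubit collapses it to 0 or 1,
# and a stray interaction -- modeled crudely here as a bit flip -- corrupts the
# state even though no one measured it.
import numpy as np

rng = np.random.default_rng(seed=7)
state = np.array([1, 1], dtype=complex) / np.sqrt(2)   # equal superposition

# Measurement: pick an outcome with the Born-rule probabilities, then replace
# the state with the matching basis vector. The superposition is gone for good.
outcome = rng.choice(2, p=np.abs(state) ** 2)
state = np.eye(2, dtype=complex)[outcome]
print("measured", outcome, "-> state is now", state)

# A stray photon's kick, caricatured as a Pauli X (bit-flip) error:
flip = np.array([[0, 1], [1, 0]], dtype=complex)
state = flip @ state
print("after the error the qubit silently holds", state)
```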