AI Needs Enormous Computing Power. Could Light-Based Chips Help?

Moore’s law is already pretty fast. It holds that computer chips pack in twice as many transistors every two years or so, producing major jumps in speed and efficiency. But the computing demands of the deep learning era are growing even faster than that — at a pace that is likely not sustainable. The International Energy Agency predicts that artificial intelligence will consume 10 times as much power in 2026 as it did in 2023, and that by then data centers will use as much energy as Japan does. “The amount of [computing power] that AI needs doubles every three months,” said Nick Harris, founder and CEO of the computing-hardware company Lightmatter — far faster than Moore’s law predicts. “It’s going to break companies and economies.”

One of the most promising ways forward involves processing information not with trusty electrons, which have dominated computing for over 50 years, but with the flow of photons, minuscule packets of light. Recent results suggest that, for certain computational tasks fundamental to modern artificial intelligence, light-based “optical computers” may offer an advantage.

The development of optical computing is “paving the way for breakthroughs in fields that demand high-speed and high-efficiency processing, such as artificial intelligence,” said the University of Cambridge physicist Natalia Berloff.

Optimal Optical

In theory, light offers tantalizing benefits. For one, optical signals can carry more information than electrical ones: they have more bandwidth. Optical frequencies are also much higher than electrical ones; the infrared light used in optical communication oscillates at hundreds of terahertz, tens of thousands of times the gigahertz clock rates of electronic chips, so optical systems can run more computing steps in less time and with less latency.

And then there’s the efficiency problem. Electronic chips are relatively wasteful: beyond their environmental and economic costs, they run so hot that only a tiny fraction of the transistors (the tiny switches at the heart of all computers) can be active at any moment. Optical computers could, in theory, run more operations simultaneously, churning through more data while using less energy. “If we could harness” these advantages, said Gordon Wetzstein, an electrical engineer at Stanford University, “this would open a lot of new possibilities.”
