Intel: Chips Will Have to Sacrifice Speed Gains for Energy Savings

Move over, silicon.

Intel, the world’s largest chipmaker, is preparing to embrace alternatives to the technology that has sustained computing for more than 50 years. William Holt, who leads the company’s technology and manufacturing group, said this week that for chips to keep improving, Intel will soon have to start using fundamentally new technologies.

Holt said Intel doesn’t yet know which new chip technology it will adopt, even though it will have to come into service in four or five years. He did point to two possible candidates: devices known as tunneling transistors and a technology called spintronics. Both would require big changes in how chips are designed and manufactured, and would likely be used alongside silicon transistors.

However, the new technologies Holt cited would not offer speed benefits over silicon transistors, suggesting that chips may stop getting faster at the pace the technology industry has come to expect. They would, however, improve the energy efficiency of chips, which matters for many leading uses of computing today, such as cloud computing, mobile devices, and robotics.

“We’re going to see major transitions,” said Holt, speaking at the International Solid State Circuits Conference in San Francisco. “The new technology will be fundamentally different.”

An Intel processor.

The chip industry has for decades been ruled by Moore’s Law, formulated by Intel cofounder Gordon Moore in 1965, which has become shorthand for continual, rapid progress in the capabilities of computers. Moore proposed that companies should double the number of transistors on a given area of a chip every two years to keep making better-performing chips without out-of-control costs. Intel and others have produced processors with ever greater numbers of ever smaller and cheaper silicon transistors to keep that prediction alive. At the same time, transistors have become much more energy-efficient. Together those trends have enabled the development of supercomputers and laptops, smartphones and self-driving cars.
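The compounding effect of that two-year doubling can be illustrated with a quick back-of-the-envelope calculation (the starting transistor count below is hypothetical, chosen only to show the scale of the growth):

```python
# Moore's Law as described above: transistor count doubles every two years.
def transistor_count(initial: int, years: int) -> int:
    """Projected transistor count after `years`, assuming one doubling per two years."""
    return initial * 2 ** (years // 2)

# Over a decade, five doublings multiply the count 32-fold.
print(transistor_count(1_000_000, 10))  # 32000000
```

Sustained over five decades, that doubling cadence is what turned room-sized machines into pocket-sized ones.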

Holt said that those trends will continue to hold for two more chip generations, just four or five years, by which time silicon transistors will have shrunk to only seven nanometers.

One of the two technologies Holt mentioned that might fill that gap, tunneling transistors, appears far from commercialization, although DARPA and the industry consortium Semiconductor Research Corporation are funding research on the devices. They take advantage of quantum mechanical behavior of electrons that harms the performance of conventional transistors and that has become more problematic as transistors have gotten smaller.

The circuits of an Intel processor.

Spintronic devices are closer to commercial production, and may even hit the market next year. They represent digital bits by switching between two states of spin, a quantum mechanical property of particles such as electrons. Kang Wang, an electrical engineer at the University of California, Los Angeles, who works on spintronics, says Holt’s comments fit with his own expectation that spintronics will appear in some low-power memory chips in the next year or so, perhaps in high-powered graphics cards.

For example, Toshiba announced last year that it had developed an experimental spintronic memory array that consumed 80 percent less power than SRAM, a type of high-speed memory.

However, tunneling transistors and spintronics both have downsides beyond the fact that they would require wholesale reëngineering of Intel’s manufacturing processes. Shrinking silicon transistors to keep Moore’s Law alive has made successive generations of chips both more powerful and less power-hungry. But the two new technologies can’t work on data as fast as silicon transistors. “The best pure technology improvements we can make will bring improvements in power consumption but will reduce speed,” said Holt.

That suggests that Moore’s Law as we’ve known it may come to an end. But Holt claimed that continued gains in energy efficiency, not raw computing power, are most important for the things asked of computers today.

“Particularly as we look at the Internet of things, the focus will move from speed improvements to dramatic reductions in power,” Holt said. Power is a problem across the computing spectrum. The carbon footprint of data centers operated by Google, Amazon, Facebook, and other companies is growing at an alarming rate. And the chips needed to connect many more household, commercial, and industrial objects, from toasters to cars, to the Internet will need to draw as little power as possible to be viable.
