December 16, 2007
By JORDAN ROBERTSON ~ The Associated Press
Computer History Museum senior curator Dag Spicer looked at a replica of the first transistor Wednesday at the museum in Mountain View, Calif.

SAN JOSE, Calif. -- Sixty years after transistors were invented and nearly five decades since they were first integrated into silicon chips, the tiny on-off switches dubbed the "nerve cells" of the information age are starting to show their age.

The devices -- whose miniaturization over time set in motion the race for faster, smaller and cheaper electronics -- have been shrunk so much that the day is approaching when it will be physically impossible to make them even tinier.

Once chip makers can't squeeze any more into the same-sized slice of silicon, the dramatic performance gains and cost reductions in computing over the years could suddenly slow. And the engine that's driven the digital revolution -- and modern economy -- could grind to a halt.

Even Gordon Moore, the Intel Corp. co-founder who famously predicted in 1965 that the number of transistors on a chip should double every two years, sees that the end is fast approaching -- an outcome the chip industry is scrambling to avoid.

"I can see [it lasting] another decade or so," he said of the axiom now known as Moore's Law. "Beyond that, things look tough. But that's been the case many times in the past."

Intel Corp. chief executive Paul Otellini showed off a wafer of new chips with billions of transistors Sept. 18 at the Intel Developers Forum in San Francisco. (Paul Sakuma ~ Associated Press)

Preparing for the day they can't add more transistors, chip companies are pouring billions of dollars into plotting new ways to use the existing transistors, instructing them to behave in different and more powerful ways.

Intel, the world's largest semiconductor company, predicts that a number of "highly speculative" alternative technologies, such as quantum computing, optical switches and other methods, will be needed to continue Moore's Law beyond 2020.

"Things are changing much faster now, in this current period, than they did for many decades," said Intel chief technology officer Justin Rattner. "The pace of change is accelerating because we're approaching a number of different physical limits at the same time. We're really working overtime to make sure we can continue to follow Moore's Law."

Transistors work something like light switches, flipping on and off inside a chip to generate the ones and zeros that store and process information inside a computer.
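
As a rough illustration (an editor's sketch, not part of the original report), treating each transistor as an ideal on-off switch shows how those ones and zeros become useful logic. The short Python example below builds a few basic gates out of nothing but boolean switching; the function names and the NAND-only construction are illustrative choices, not anything described in the story.

```
# Illustrative sketch only: modeling transistors as ideal on/off switches.
# Real transistors are analog devices; this just shows how binary logic
# can be built from simple switching behavior.

def nand(a: bool, b: bool) -> bool:
    """NAND gate: output is off only when both inputs are on."""
    return not (a and b)

# Every other logic operation can be composed from NAND gates alone.
def not_gate(a: bool) -> bool:
    return nand(a, a)

def and_gate(a: bool, b: bool) -> bool:
    return not_gate(nand(a, b))

def or_gate(a: bool, b: bool) -> bool:
    return nand(not_gate(a), not_gate(b))

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            print(f"{int(a)} AND {int(b)} = {int(and_gate(a, b))}, "
                  f"{int(a)} OR {int(b)} = {int(or_gate(a, b))}")
```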

The transistor was invented at Bell Labs by scientists William Shockley, John Bardeen and Walter Brattain, who were seeking a better way to amplify voices in telephones -- work for which the three later shared the Nobel Prize in physics.

Transistors' ever-decreasing size and low power consumption made them ideal candidates to replace the bulky vacuum tubes then used to amplify electrical signals and switch electrical currents. AT&T saw them as a replacement for clattering telephone switches.

Since the invention of the integrated circuit in the late 1950s -- separately by Texas Instruments Inc.'s Jack Kilby and future Intel co-founder Robert Noyce -- the pace of innovation has been scorching.

The number of transistors on microprocessors -- the brains of computers -- has leaped from just several thousand in the 1970s to nearly a billion today, a staggering feat that has unleashed previously unimagined computing power.
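
A quick back-of-the-envelope check (an editor's illustration, not part of the original report) shows how that leap lines up with the doubling Moore described. The starting figure of roughly 2,300 transistors on a 1971 microprocessor and the two-year doubling period below are assumptions chosen for the example.

```
# Rough sanity check of the doubling trend against the article's figures.
# The 1971 starting point (~2,300 transistors, the first commercial
# microprocessor) and the two-year doubling period are assumptions.
start_year, start_count = 1971, 2_300
doubling_period = 2  # years

for year in (1980, 1990, 2000, 2007):
    doublings = (year - start_year) / doubling_period
    projected = start_count * 2 ** doublings
    print(f"{year}: roughly {projected:,.0f} transistors projected")
```

By 2007 the projection lands around 600 million -- the same order of magnitude as the "nearly a billion" figure in the story.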
