The power of Moore’s Law
Source: Robert J. Samuelson


An Advanced Micro Devices Inc. AMD-A10-4600M Series APU computer chip in 2012. (Ashley Pon/BLOOMBERG)

Fifty years ago, in mid-April 1965, the trade magazine Electronics published an article by an obscure scientist making a seemingly preposterous prediction: The number of electronic components (transistors, resistors, capacitors) that could be squeezed onto integrated circuits ― what we now call computer chips ― would double every year for a decade. The author was Gordon Moore, and his forecast, subsequently slightly modified, is now known as Moore’s Law. We have been living with the consequences ever since.

Moore’s Law is a quiet rebuke to those who think we control our destiny. The historical reality is that technological, commercial and intellectual upheavals ― often unforeseen ― set in motion forces that create new opportunities and threats. We struggle to master what we haven’t anticipated and often don’t understand. The explosion of computing power imagined by Moore is one of those spontaneous transformations that define and dominate our era.

When Moore’s article appeared, less than a decade had passed since the integrated circuit was invented in the late 1950s, more or less simultaneously, by Jack Kilby of Texas Instruments and Robert Noyce, a colleague of Moore’s at the startup Fairchild Semiconductor. In 1965, Fairchild was preparing to deliver chips containing 64 separate components. Moore’s prediction of a doubling every year meant that by 1975, the number would total 65,000.
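The arithmetic behind that forecast is worth spelling out, using the article’s own figures. Ten annual doublings of Fairchild’s 64 components give

64 × 2^10 = 65,536, or roughly 65,000 components per chip by 1975.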

The figure was staggering. It also presupposed that integrated circuits would play a pivotal role in electronic innovation. This claim “seems self-evident today, but at the time it was controversial,” writes Chris Mack in the magazine IEEE Spectrum (the IEEE was once called the Institute of Electrical and Electronics Engineers). Costs were high, and “many people doubted that the integrated circuit would ever fill anything more than a niche role.”

Moore’s article aimed to rebut this skepticism. His prediction was not based on groundbreaking research. It merely extended existing advances in chip-making technology.

“At the time I wrote the article,” he told the IEEE Spectrum in an interview, “I thought I was just showing a local trend.” The number of components per chip had doubled annually, “and I just did a wild extrapolation.” In fact, the prediction was slightly optimistic. In 1975, Intel ― the now-famous firm that Moore and Noyce founded ― was designing chips with 32,000 components, reports Mack.
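The same simple arithmetic shows how close the “wild extrapolation” came: 32,000 components is almost exactly 64 × 2^9 = 32,768, which is to say nine doublings in ten years instead of the ten Moore had projected. The industry fell just one doubling short.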

To make the same point differently: Moore’s Law is not a scientific truth in the sense that a given set of conditions always produces the same result. Rather, it is a loose and uncertain relationship based on simple observation. In a later article, Moore revised his doubling forecast from every year to every two years.

But something significant and peculiar happened, according to many observers. The faith in Moore’s Law became self-fulfilling. It inspired advances in miniaturization and design that kept multiplying chips’ computing power. Companies and engineers “saw the benefits of Moore’s Law and did their best to keep it going, or else risk falling behind the competition,” writes Mack.

The resulting explosion in computing power is almost unfathomable. A single chip today can contain 10 billion transistors. In 2014, global chip production amounted to 8 trillion transistors every second, according to Dan Hutcheson of VLSI Research. Prices have collapsed: a single transistor now costs a billionth of a penny. Even Moore has been surprised at the durability of Moore’s Law. Engineers and scientists have repeatedly overcome formidable technical obstacles to expand chip capacity.
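Those numbers are roughly what the law itself would predict, at least as a back-of-the-envelope check. Twenty-five two-year doublings between 1965 and 2015 turn 64 components into 64 × 2^25, about 2 billion; start instead from the 32,000 components of 1975, and 20 more two-year doublings give roughly 34 billion. Today’s 10-billion-transistor chips sit squarely between those two rough bounds.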

Of course, the economic, social and political implications are enormous. Information and communications technologies, led by the Internet, are driving widespread change. Here is economist Timothy Taylor, on his blog “Conversable Economist” last week, summarizing the impact of Moore’s Law:

“Many other technological changes ― like the smartphone, medical imaging technologies, decoding the human gene, or various developments in nanotechnology ― are only possible based on a high volume of cheap computing power. Information technology is part of what has made the financial sector larger, as the technologies have been used for managing (and mismanaging) risks and returns in ways barely dreamed of before. The trends toward globalization and outsourcing have gotten a large boost because information technology made it easier.”

That’s only half the story. To Moore’s Law’s virtues must be added the increasingly visible vices: rising cybercrime (your credit card may be electronically stolen), the threat of cyberwar (rogue groups and other nations may hack into crucial financial and infrastructure networks) and invasions of privacy. There’s more. As Taylor notes, information technologies have abetted economic inequality by destroying middle-income jobs.

Someday the multiplication of computing power will slow or stop, with what consequences it’s hard to say. That’s the point. History is not especially predictable or pliable. Moore’s Law is a parable for its mystifying ways.

