Computers, from Mac and PC to Android and iPhone and everything in between, keep getting faster. Unfortunately, the measurement we’ve used for the past few decades to mark such gains is dead. It’s Moore’s Law. It never was what you thought it was, it was not representative of a computer’s capability for users, and it’s really been dead for a while.
Moore’s Law is the observation that the number of transistors in a dense integrated circuit doubles approximately every two years. The observation is named after Gordon Moore, the co-founder of Fairchild Semiconductor and Intel, whose 1965 paper described a doubling every year in the number of components per integrated circuit, and projected this rate of growth would continue for at least another decade. In 1975, looking forward to the next decade, he revised the forecast to doubling every two years.
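That doubling rate is just compound growth. A minimal sketch of the arithmetic (the starting count and time spans here are illustrative assumptions, not real die figures):

```python
# Moore's Law as simple compound growth: the count doubles once every `period` years.
# The starting count below is an illustrative assumption, not real chip data.
def projected_transistors(start_count: int, years: float, period: float = 2.0) -> int:
    """Project a transistor count forward by `years` at one doubling per `period` years."""
    return int(start_count * 2 ** (years / period))

# Ten years at a doubling every two years is five doublings, a 32x increase.
print(projected_transistors(1_000, 10))  # 32000
```

Moore's original 1965 projection, a doubling every year, would instead be `period=1.0`, which over the same ten years yields a 1024x increase.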
That worked for a while, but as most of us now know, it doesn’t mean much for computer users these days.
Humans have this interesting need to categorize, compare, and contrast, and nothing says improvement or change like simply doubling something every two years. Double whatever must be better. Regarding the chips that run today’s computers and mobile devices, that doubling just doesn’t matter anymore, if it ever did.
Generally speaking, today’s CPUs, the complex central processing units inside each device, are faster, smaller, thinner, lighter, more capable, and use less energy than their predecessors. Apple’s A11 Bionic CPU used in the new line of iPhones benchmarks on a par with Intel’s 7th-generation CPUs in entry-level MacBook Pro models. Does that mean the iPhone 8 or iPhone X is as fast as a Mac notebook?
Yes. And no.
Apple’s own CPUs benchmark well, but those chips are designed for mobile devices, not desktop- or notebook-class applications. It’s not, as they say, Apple to apples. It’s just a benchmark. So it is, or rather was, with Moore’s Law. Doubling transistors every few years in an ever smaller footprint meant more power, but those gains did not always translate into, well, more power.
What we use our computers and devices for these days hasn’t changed as much as we might think. Photos and videos have higher quality and need more power from our devices to edit and play back, but even that hasn’t changed much in a couple of decades. Audio files provide higher quality, but they sound much the same. What really has happened is more capability (my iPhone has a few hundred applications vs. a few dozen on my MacBook Pro) in an ever smaller package.
In other words, all that power doesn’t mean we can type faster or write better or do much that is different from two years ago, or four years ago, or more, except that now we can carry that power and capability in our pockets.
That’s a first.
Moore’s Law was merely an indicator of how many transistors could be placed into an integrated circuit, and not an indicator of how we would use that extra power. What we’ve seen through the years, though, is a steady increase in power and capability in ever smaller devices.
Apple Watch today has more storage, more RAM, and a faster CPU than the original iMac from 1998. What does that say?