Computer hardware is one of the few product categories where prices have fallen over time, even as many other goods have become significantly more expensive. Technological advances have steadily driven down the cost of computing, but experts say that run may be coming to an end.
Getty Images/Emily Bogle/NPR
The NPR series Cost of Living: The Price We Pay explores what drives rising prices and how people are coping with years of stubborn inflation. How have rising prices changed the way you live? Fill out this form to share your story with NPR.
What is this item?
A MacBook Pro laptop
How has the price changed since the pandemic began?
It fell by $200. Today's entry-level MacBook Pro starts at $1,599 and has a 14-inch screen, 16 gigabytes of memory and a 512-gigabyte solid-state drive. A comparable MacBook Pro from five years ago with the same memory and storage (but only a 13-inch screen) cost $1,799.
Why did the price drop?
Pricing is an art form, and price tags can be influenced by a wide range of factors beyond the cost of labor and materials—market positioning, competition, corporate culture, consumer psychology, and so on. Apple and other companies often maintain stable prices on key products as a strategic choice. (Fun fact: Apple also tends to set prices ending in the number 9: $999 for MacBook Air, $6,999 for Mac Pro, $549 for AirPods Max, etc.)
But there's a technical reason why computers in general have become cheaper over time: it's called Moore's Law.
Gordon Moore, a chip pioneer and co-founder of Intel, predicted that the number of transistors on a microchip would double roughly every two years as miniaturization technology advanced. Transistors are tiny switches that do the digital processing: they control the flow of electricity, the ones and zeros of computation.
As transistors have shrunk, the price per transistor, and with it the cost of computing, has dropped dramatically. The ability to reliably double the number of transistors that could fit on a chip let computers become smaller and more powerful without getting more expensive. It has given us computing power that was once unavailable, even unimaginable.
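To see how that math plays out, here is a minimal back-of-the-envelope sketch in Python. The starting transistor count, the flat chip price and the clean two-year doubling are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope illustration of Moore's Law.
# The starting values and the flat $100 chip price are hypothetical.

def transistors_after(years, start=2_300, doubling_period=2):
    """Transistor count after `years`, assuming a clean doubling every period."""
    return start * 2 ** (years / doubling_period)

def cost_per_transistor(chip_price, years, start=2_300, doubling_period=2):
    """If the chip's price stays flat, the cost per transistor halves every period."""
    return chip_price / transistors_after(years, start, doubling_period)

if __name__ == "__main__":
    for years in (0, 10, 20, 30, 40, 50):
        count = transistors_after(years)
        cost = cost_per_transistor(chip_price=100.0, years=years)
        print(f"after {years:2d} years: {count:>18,.0f} transistors, "
              f"${cost:.10f} per transistor")
```

Under those assumptions, the same chip budget buys about a million times more transistors after 40 years, which is the economic engine the article describes.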
That is a major reason today's mass-market smartwatches are more powerful than the computers that flew the Apollo 11 lunar mission. And it's why machines that were once such expensive behemoths that only businesses and universities could afford them now fit on a desktop or in a pocket.
At the Computer History Museum in Mountain View, Calif., docent Scott Stouter demonstrates an IBM 1401, a mainframe computer from the early 1960s. It fills a room the size of a classroom and runs on punched cards and reel-to-reel tapes. It once cost hundreds of thousands of dollars. And it had, at most, the equivalent of 16 kilobytes of memory.
“At home, my laptop has 16 gigabytes of memory. That's 16 billion bytes,” says Stouter. “That's a million times more than the maximum the 1401 could have.”
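Stouter's "million times" figure is simple arithmetic. A quick sketch, using the same rough decimal rounding (a gigabyte as a billion bytes, a kilobyte as a thousand):

```python
# Rough check of the docent's comparison, using round decimal units.
mainframe_memory_bytes = 16_000        # ~16 kilobytes on a maxed-out IBM 1401
laptop_memory_bytes = 16_000_000_000   # 16 gigabytes on a modern laptop

ratio = laptop_memory_bytes / mainframe_memory_bytes
print(f"The laptop holds {ratio:,.0f} times more memory")  # prints 1,000,000
```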
Computer buyers now get more bang for their buck, even over a span of just a few years, as the MacBook Pro shows. And because chips are now used in nearly everything, other kinds of electronics have become cheaper over time too.
Take 55-inch OLED flat-screen TVs. The first one hit the market in 2013 for more than $10,000. Today you can buy one for less than $1,000. Smartphones are another example: Samsung's newest model started at $999.99 in 2020, while this year's newest version costs $799.
Moore, who died in 2023, knew that his law had as much to do with economics as it did with physics. "I was just trying to convey the idea that integrated circuits would be the path to low-cost electronics, which was unclear at the time," Moore said in a 2008 oral history interview in the Computer History Museum's archives.
What do people do about it?
They're used to it.
“Miniaturization happened very regularly, and people could count on it,” says Neil Thompson, an innovation specialist at MIT's Computer Science and Artificial Intelligence Laboratory and the university's Digital Economy Initiative.
Moore's Law allowed generations to believe that computers would always get better and to buy more of them. People can now own multiple computers—in the form of laptops, tablets, or smartwatches—as well as other devices with computers built into them, from cars to refrigerators.
But Moore's Law may be reaching its limit. Transistors are getting so small—tens of billions can now fit on a chip—that experts say the laws of physics are slowing the reliable pace of progress.
“In the heyday of Moore’s Law, miniaturization allowed us to get chips with more transistors, which also meant that each transistor used less power,” says Thompson. “Today, miniaturization gives us much less power reduction, so trying to cram too many transistors generates a lot of heat and can melt the chip.”
He says the predictability provided by Moore's Law will weaken over the next decade and that other technological breakthroughs will be needed to bring about new efficiency gains and lower prices.
One example is software. Thompson says the steady progress delivered by Moore's Law has meant that developers of computer systems could get away with code that is sometimes inefficient. Significant computing gains, he says, can be achieved by improving that software.
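As a toy illustration of the kind of inefficiency Thompson is describing, and not code from the article, the same lookup task can run orders of magnitude faster depending on the data structure the programmer chooses:

```python
# Toy illustration of software efficiency: the same membership-test task two ways.
import time

items = list(range(100_000))
lookups = list(range(0, 100_000, 100))

# Inefficient: scanning a list is O(n) per lookup, but "gets the job done."
start = time.perf_counter()
hits_slow = sum(1 for x in lookups if x in items)
slow = time.perf_counter() - start

# Efficient: a set gives constant-time lookups on average.
item_set = set(items)
start = time.perf_counter()
hits_fast = sum(1 for x in lookups if x in item_set)
fast = time.perf_counter() - start

print(f"list scan: {slow:.3f}s, set lookup: {fast:.5f}s, "
      f"same result: {hits_slow == hits_fast}")
```

Hardware kept getting faster, so code like the slow version rarely hurt. If that free ride ends, Thompson argues, these kinds of software improvements become a bigger source of gains.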
Chip designers and makers say packaging chips is another way to squeeze more out of the technology. Packaging refers to the ways in which individual chips are connected to others to form powerful stacks.
Apple is a financial supporter of NPR.