Tale of chip war: Part 14
Intel’s 'innovator’s dilemma'
History was about to be made at the Macworld Conference in San Francisco in January 2006. One after another, Apple co-founder Steve Jobs and Intel CEO Paul Otellini stepped onto the stage. That day, Jobs made a historic announcement: from then on, all Mac computers would run on Intel processors.
It was a watershed moment in Silicon Valley’s history because, until then, Apple had been the only major computer company not using Intel’s x86 chips. With that declaration, Intel’s dominance in the PC world reached its zenith.
Steve Jobs was already a living legend—an artist who could turn technology into beauty. Just a few years earlier, in 2001, his iPod had revolutionized the world of music. Paul Otellini, on the other hand, was a very different kind of leader—cool, analytical, and managerial. While Intel’s founders—Bob Noyce, Gordon Moore, and Andy Grove—were scientists and engineers, Otellini was an economist with an MBA degree.
When Otellini took over as CEO, Intel was a fortress of power and profit. His primary goal was to defend that stronghold and preserve its margins. Intel’s rise had been a mix of brilliant strategy and luck: in the 1980s, IBM had chosen Intel chips for its first PCs, and Microsoft built Windows to run on them.
Around the same time, at the University of California, Berkeley, a new idea in chip design was born—Reduced Instruction Set Computing (RISC). In many ways, it was more efficient than the aging x86 architecture. Intel’s legendary leader Andy Grove once considered moving the company toward RISC, but ultimately backed off, fearing it could undermine Intel’s lucrative dominance. So Intel stuck with the old x86 architecture.
Even so, during the 1990s and 2000s, the rise of the Internet and cloud computing only strengthened Intel’s empire. Not just personal computers—servers powering Amazon, Google, and Microsoft all ran on Intel chips. By the mid-2000s, Intel and AMD together controlled nearly 100% of the global data-center market. In short, the entire cloud was running on x86.
Meanwhile, across the Atlantic in Cambridge, England, a quiet revolution was underway. In 1990, Apple and two partners founded a small company called Arm, built around the RISC concept. Arm’s first CEO, Robin Saxby, realized that manufacturing chips was prohibitively expensive. So he proposed a radical new model: Arm would not build chips itself but license its designs to others. This licensing approach, paired with the emerging fabless model, transformed the entire industry: companies could take Arm’s designs, customize them, and have them manufactured by foundries like TSMC.
Arm couldn’t compete with the Intel–Microsoft alliance in the PC world. But its energy-efficient design triumphed in small, battery-powered devices. From Nintendo’s handheld consoles to mobile phones and tablets, Arm began spreading rapidly. Intel, focused on the highly profitable PC market, dismissed this “tiny” segment as trivial.
Some inside Intel saw the warning signs. In the 1990s, one manager held up a Palm Pilot at a meeting and said, “These mobile devices will one day replace PCs.” Senior executives laughed it off—how could a toy like that compete with billion-dollar PC sales? Intel’s two fortresses—PC and server processors—were so profitable that no one dared look beyond them.
Around that time, Harvard professor Clayton Christensen published his famous book The Innovator’s Dilemma, warning that successful companies often fail because their profitable businesses blind them to future disruptions. Intel became the perfect real-life embodiment of that theory.
Not long after Apple began using Intel chips in Macs, Steve Jobs made another proposal: could Intel build a processor for the upcoming iPhone? Jobs explained that the phone would be extraordinarily powerful, almost a small computer in itself, but its cost had to stay within strict limits. Otellini ran the numbers and declined. That was the moment Apple turned to Arm. Together with Samsung, Apple built its first Arm-based chip for the iPhone.
When the iPhone went on to conquer the world, Intel realized its mistake—but it was too late. Even after pouring billions of dollars into mobile chips, Intel failed to enter the market. By then, Apple’s Arm-based ecosystem had become an unassailable fortress. Within a few years, the iPhone’s profits surpassed Intel’s entire PC chip revenue. The company that once powered the computer revolution had missed the next one.
Intel’s decline was not a technological failure—it still had the world’s best engineers and factories. The real issue was fear of losing profit margins. Otellini, who served as CEO from 2005 to 2013, later admitted that rejecting the iPhone proposal was primarily due to worries about lower profitability.
The engineers wanted innovation; management wanted profit. That is the heart of the Innovator’s Dilemma—success becomes so great that the company loses the courage to take risks. Intel’s fortress remained standing, but the world had already moved on.
(Adapted and abridged from Chapter 33, “The Innovator’s Dilemma,” of Chris Miller’s acclaimed book Chip War.)
Author: Mahmud Hossain, a graduate of BUET, has been a leading figure in Bangladesh’s telecom and ICT sectors for over three decades. He played a pivotal role in the introduction of mobile technology in the country. Currently serving as a Commissioner of the BTRC, he has previously held senior leadership positions in top local and international organizations.