
Inside the history of Intel's rise and its moment of disruption


Image: FastCompany

We have just witnessed one of the most bizarre developments in the history of technology, involving a company that defined an era. As artificial intelligence challenges Intel's grip on the industry, Fast Company is documenting the chipmaker's rise and its impact on society. Here is our first look at that history, as told by Harry McCracken.

On August 22, President Donald Trump announced via Truth Social that the U.S. federal government had acquired 10% of Intel. The chipmaker’s news release trumpeted the deal as “historic.” That it was. But it was also ignominious.

In Trump’s own account, he had humbled an iconic American company. Maybe even shaken it down. Fifteen days earlier, he had posted to Truth Social that Intel CEO Lip-Bu Tan’s ties to China left the executive “highly CONFLICTED,” and demanded his immediate resignation. On August 11, Tan visited the White House. On August 22, the deal was official.

The U.S.’s ownership stake in Intel doesn’t involve any new funds. Instead, it’s a retroactive quid pro quo for $8.9 billion the company had already been awarded but not yet received under the U.S. CHIPS and Science Act. Joe Biden signed that bill three years ago as a $280 billion gambit to reverse the decades-long flight of chip manufacturing to Asia.

As Trump put it, Tan “walked in wanting to keep his job, and he ended up giving us $10 billion [the approximate value of the 10% stake] for the United States.” Notably, The Wall Street Journal’s Robbie Whelan, Yang Jie, and Amrith Ramkumar reported that Taiwan’s TSMC, the world’s largest chip manufacturer, pushed back against forking over any equity to the U.S.—even if declining to do so would require it to give up CHIPS Act money it was getting to help expand its production capacity in Arizona.

For anyone who lived through the PC’s heyday in the 1990s, seeing Intel run out of options is a stunning development. It’s not just that it dominated the market for PC processors so utterly that after years of shrinkage it still has close to a 75% share. More than any other company, Intel once controlled the moving parts that kept the technology business booming. In a sense, the entire PC industry became a front for it, to a degree that wasn’t obvious then and has since faded into history.

For instance, Intel engineers invented USB, maybe the most significant new PC technology of the 1990s. The company didn’t pioneer Wi-Fi, but its decision to build wireless networking into the Centrino mobile platform in 2003 helped make it standard equipment on every laptop. When Apple’s thin-and-light MacBook Air became a hit and Intel was concerned that Windows portables were clunky by comparison, it came up with the Air-like Ultrabook; PC makers merely followed its lead.

Intel also supplied the motherboards many manufacturers used, allowing it to define a computer’s feature set and even its shape. In such cases, building a PC amounted to little more than filling out its platform with components such as memory and storage and wrapping a case around it.

The company’s grip on the industry wasn’t just technological. Starting in 1991, its “Intel Inside” campaign convinced millions of people to pay attention to the chips that powered computers. But Intel didn’t just buy ads on its own. It also established a co-op fund that paid up to half the cost of ads placed by PC companies. More than 500 of them participated in the program, including brand names such as Dell, HP, IBM, Sony, and Toshiba.

The co-op dollars manufacturers received were tied to the quantity of Intel processors they purchased. Naturally, ads subsidized by the fund were required to highlight the “Intel Inside” message. They also couldn’t mention models using chips from other companies—a stipulation that put longtime Intel rival AMD at a massive disadvantage, regardless of the quality of its products.

Intel’s co-op dollars indirectly paid for a disproportionate share of the entire PC business’s marketing budget. At first, magazines such as my former employer PC World benefited from the company’s largesse. Later, when Intel instructed manufacturers to divert ad dollars to the web, the magazines felt it in the pocketbook.

For a technology company, being all-powerful has its downsides. Andy Grove, Intel’s third employee and its CEO from 1987 to 1998, summed up his fear of resting on one’s laurels in the title of his 1996 bestseller, Only the Paranoid Survive. By the early years of this century, however, Intel began to radiate not paranoia but complacency. As the world changed, it didn’t. The results were disastrous.

In the late 1990s, for example, Intel began building basic graphics into its chipsets, and eventually its CPUs, rather than designing more powerful discrete graphics processors of the kind sold by smaller, specialized chipmakers such as Nvidia. Then Nvidia proved its graphics chips were also adept at running AI algorithms. Now it’s the world’s most valuable public company, with a market cap many times the size of Intel’s.

And then there were phones. Intel talked to Apple about providing the processor for the first iPhone, but concluded it couldn’t turn a profit on the deal. Smartphones went on to be bigger than the PC ever was—and almost none of them ever had Intel Inside.

For years, Intel had been synonymous with Moore’s law, its cofounder Gordon Moore’s observation that the number of transistors that could fit on an integrated circuit doubled every two years. As tech’s engines of progress moved beyond the PC, even Intel’s ability to stay on the cutting edge of chip manufacturing faltered. When longtime executive Pat Gelsinger rejoined the company as CEO in 2021, it was in desperate need of a turnaround.

Gelsinger’s ambitious strategy involved getting Intel’s manufacturing advances back on track and turning the company into a contract manufacturer (“foundry”) for chips designed by others. But Intel’s board ousted him after less than four years, before his vision could play out. That led to Tan’s appointment and, five months later, the Trump deal.

BY HARRY MCCRACKEN

Harry McCracken is the global technology editor for Fast Company, based in San Francisco. He writes about topics ranging from gadgets and services from tech giants to the startup economy to how artificial intelligence and other breakthroughs are changing life at work, home, and beyond.
