Why Intel is betting its chips on microprocessor mastermind Jim Keller
On a bucolic hillside in Hudson, Mass., about 30 miles west of Boston, on an early spring day last year, the semiconductor giant Intel threw a party for more than 30 of its designers, engineers, and other employees.
It was a pretty mature group, by tech industry standards: lots of polo shirts, cardigans, and button-downs, very few hoodies. Indeed, virtually everyone in the crowd had been in tech since the mid-1990s or earlier. Before their Intel days, the partygoers had worked for Digital Equipment Corp., or DEC, a pioneering but long-gone tech giant that had once owned the very building that was hosting the party.
At the center of attention, mingling easily with old friends, was a man with a graying beard, beach-appropriate longish hair, and a muscular build—someone who looked more like a retired linebacker or professional surfer than the architect of some of the most important microchip designs of the past 30 years. The man was Jim Keller, the person Intel is counting on to revitalize its own struggling chip-design enterprise.
Though he is little known outside the computer industry, Keller is a chipmaking superstar, on the level of Frank Lloyd Wright in building architecture or of Phil Jackson among NBA coaches. In a career that began at DEC in the 1980s, Keller has racked up remarkable achievements at stop after stop. His designs helped turn Advanced Micro Devices from a microchip also-ran into a respected contender. The Autopilot chip in new Teslas that can recognize red lights and stop signs? That’s a Keller design. And the chips in everything from iPhones to Google’s cloud servers to an Xbox gaming console have some of Keller’s work at the core. Along the way, he has studied the management styles of tech’s biggest legends up close, among them Steve Jobs, Elon Musk, and AMD cofounder Jerry Sanders.
“He’s the Forrest Gump of our industry,” says Fred Weber, former chief technology officer of AMD. “He keeps being in the middle of the interesting stuff and making a difference.” And in April 2018, Forrest ran again, so to speak, when Keller left Tesla and joined Intel, the 800-pound gorilla of the chip industry—a company against which he’s been competing for most of his career.
In Hudson, Keller was essentially the guest of honor; he even chose the location. As he circulated, what was striking was how little he talked about chips. Deb Bernstein, an Intel server-computer architect who also worked with Keller long ago at DEC, recalls Keller engaging her with observations about Buddhism, about physics. (In separate interactions with Fortune, Keller referenced lessons he’s gleaned from the biography of jet-fighter pilot John Boyd and from a book of spiritual teachings called The Untethered Soul.) “It’s like he has been getting Ph.D. after Ph.D., just through his constant quest to learn,” Bernstein says.
Keller himself uses an education analogy when describing his career. He divides his various stops into two categories: Some had lessons for him; some were more in need of lessons from him. “I’ve worked at really different places, and I think I learned valuable lessons at each one,” he tells Fortune. “When I went to Apple and Tesla, I didn’t go there to change anything. I went to change me. They don’t do things like anyone else in the world.”
Intel falls mainly in Keller’s second camp—those that need his lessons. Keller, who has the title of senior vice president in the technology, systems architecture, and client group, now manages about 10,000 employees in Intel’s semiconductor engineering arm. His efforts will help determine whether Intel, whose market share and profits have been sliding in the wake of some high-profile chip disappointments, can regain its preeminence in the eyes of the next generation of data wizards and device designers.
“With the transformation we are going through, Jim is absolutely the right engineering lead to be at the helm,” says Sailesh Kottapalli, Intel’s platform engineering group director, who has been with the company for almost 25 years. “I’ve never had a senior manager that had such an in-depth background and even an intuition about technology.”
To learn more about where that intuition has taken Keller, and where it might take Intel, Fortune interviewed more than 30 of Keller’s current and former colleagues and rivals. (Some asked to speak anonymously because they did not have permission from their employers to talk with us.) Together their accounts painted a picture of an unusually talented problem-solver—one whose career tracks some of the most important changes in computing over the past three decades.
From “Intel inside” to Intel in trouble
The field of microprocessor design is as critical as it is constantly evolving. As a result of the relentless advances in manufacturing first enunciated by Intel cofounder Gordon Moore in 1965 (the famous “Moore’s Law”), computer chips can fit an ever-increasing number of transistors on the same size piece of silicon. The A13 Bionic chip in the latest iPhone packs 8.5 billion transistors onto a sliver of silicon smaller than a dime. That has enabled chips to perform more and more functions more quickly, while spreading from computers to our phones and cars and now to lampposts and garden sprinklers.
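For a sense of the compounding involved, here’s a back-of-envelope sketch in Python. The transistor counts for Intel’s 1971-era 4004 and Apple’s A13 are public figures; the steady two-year doubling is Moore’s rule of thumb, and the projection is illustrative only, not a measurement.

```python
# Back-of-envelope only: Moore's Law as a doubling rule of thumb.

def transistors(start_count, start_year, year, doubling_years=2):
    """Project a transistor count forward under steady doubling."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# Intel's first microprocessor, the 4004 (1971), had about 2,300 transistors.
print(f"{transistors(2_300, 1971, 2019):,.0f}")  # ~39 billion projected
# Apple's A13 Bionic (2019) actually holds about 8.5 billion; in practice
# the doubling period has stretched closer to 2.2 years over those decades.
```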
Thanks to Moore’s Law, the chip-design and manufacturing process is analogous to adding new floors on top of an existing office building. Suddenly, there’s more real estate available at the same price, on the same lot. But it’s not so obvious exactly what to add or improve in the new space. Architects like Keller have to stay on top of trends in hardware and software that could improve a processor’s performance. They also have to keep an eye on what computer users are doing, to anticipate what they’ll need. And just to complicate matters further, sometimes the manufacturing advances necessary to build a new chip design don’t arrive on time.
Over the past decade, several elements of this complicated formula have played out badly for Intel. The company continues to make lots of money—it earned $21 billion in profits on $72 billion in revenue in 2019—but growth has slowed, and market share has slipped. Key products have been late. Some of the features Intel has chosen to focus on didn’t turn out to be what the market needed, and as a result, expensive forays into chips for tablets and 5G phones have had to be abandoned. AMD and Nvidia have outpaced Intel in building chips for big cloud data centers, one of the fastest-growing and most lucrative chip markets. Perhaps most critical for Intel’s future, a recent key A.I. acquisition, Nervana, didn’t gain traction. (Intel all but shut down the Nervana chip line after it bought Habana Labs last December for $2 billion, essentially admitting defeat and starting over.)
Intel has been grappling with a cultural problem, too. Its huge size and dominance in its sector, some believe, created a bureaucratic culture that moved slowly, communicated poorly, and got caught up in “everything-itis”—trying to put too many features in every new chip. What the company needed was a new focus and smarter direction from the top. And no one in the chip industry is better at focus and direction—at simplifying a problem, the better to solve it—than Jim Keller.
A digital wizard’s analog bookshelf
Jim Keller’s colleagues describe him as someone who reads incredibly widely, the better to approach chip design from fresh angles. Here are some of Keller’s top reads, with his comments.
Lost in Math, by Sabine Hossenfelder
“It’s surprising how much of what we think is true is not actually true. And that includes substantial areas of physics.”
The Untethered Soul, by Michael A. Singer
“This book takes apart what you think. When you take apart what you think and listen to it, who’s listening? And then what do you do?”
Boyd: The Fighter Pilot Who Changed the Art of War, by Robert Coram
U.S. Air Force pilot John Boyd “confronted many different problems and every time came up with something novel and interesting. A lot of it is useful in technology and working with teams.”
A digital education
In an industry now dominated by Stanford grads, triple-degree MIT alumni, and a few famous Harvard dropouts, Keller’s origin story stands out. Growing up in a postwar tract development in the Philadelphia suburbs, Keller couldn’t read until fourth grade because of his dyslexia. He credits his supportive parents (his dad a mechanical engineer at General Electric’s aerospace division, his mom at home) for encouraging his curiosity without pressure.
Keller went to Penn State for college (MIT seemed “way too hard,” he recalls). He knew he wanted to do something “science-y, but also make money.” Early on, he found a list of the average starting salaries of various college majors. Biology and physics, his two top interests, fell below the halfway mark. Electrical engineering was near the top, so that was his choice (combined with a second major in philosophy). Keller’s adviser, fortuitously, ran the semiconductor lab, giving Keller an early exposure to one of the most important fields in tech.
After college and a bit of wandering, Keller wound up at DEC, a tech titan of its era. (Perhaps a bit prematurely, Fortune in 1986 declared cofounder Ken Olsen the “entrepreneur of the century.”) In the early 1980s, when Keller started, the fast-growing company was gobbling up office space all around Boston; one of Keller’s first workspaces was in a converted supermarket.
Unlike a lot of its rivals, DEC built its own chips rather than contracting with outside vendors. That meant Keller was exposed early on to all parts of the computer development process, from design through manufacturing and marketing. He learned about how chips were laid out and helped build CAD software tools used to map them. It was perhaps his first self-taught Ph.D. “I had a pretty big toolkit,” he says.
Among DEC’s most impressive products was a line of chips called Alpha. Alpha chips ran so-called workstation computers, the mighty calculating engines used by Wall Street traders, rocket scientists, and climate modelers—machines that stood above PCs on the food chain. Alphas were the jackrabbits of their class. The company submitted several of the chips to the Guinness Book of World Records, where they earned the title of fastest microprocessor in the world. Keller himself coheaded the design of a chip called the Alpha 21264, parts of which ran as fast as one gigahertz, an unheard-of speed for the time.
But for all that muscle, by the mid-1990s, DEC was suffering as a company. It had failed to anticipate how PCs and servers would eventually become powerful and fast enough to make DEC’s minicomputers obsolete. “It was literally true that we were building the world’s fastest computers and going out of business at the same time,” Keller recalls. And the experience taught him an indelible lesson: If you aren’t making what the market wants, it doesn’t matter how well you’re making it.
Eventually, Keller realized that DEC’s Alpha chips faced an extinction-level challenge—from Intel. At one point in the mid-1990s, Keller obtained a photo of the inner workings of Intel’s then-brand-new Pentium Pro processor. In the same room in Hudson where Intel would later throw Keller’s party, Keller explained Intel’s evolutionary leap to coworkers. Intel’s long-running x86 chip designs previously had run extremely complicated sets of instructions from software programs: In the Pentium Pro, Intel had figured out a way to quickly translate a program’s instructions into much simpler chunks. Sooner than anyone expected, Intel had all but eliminated Alpha’s main advantage.
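The concept behind that leap can be sketched in a few lines of Python. This toy decoder is only an illustration of the translate-into-simpler-chunks idea; it is not Intel’s actual decode logic, and the instruction format is invented.

```python
# Toy illustration (not Intel's real decoder) of the Pentium Pro idea:
# translate a complex x86-style instruction into simpler micro-ops.

def decode(instruction):
    """Split a memory-to-register add into simple load/add micro-ops."""
    op, dest, src = instruction
    if op == "ADD" and src.startswith("["):  # the operand lives in memory
        return [("LOAD", "tmp0", src),       # micro-op 1: fetch the operand
                ("ADD", dest, "tmp0")]       # micro-op 2: register-only add
    return [instruction]                     # simple ops pass straight through

print(decode(("ADD", "eax", "[0x1000]")))
# [('LOAD', 'tmp0', '[0x1000]'), ('ADD', 'eax', 'tmp0')]
```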
Keller had connections at Intel’s second-fiddle rival in x86 chipmaking, AMD, and in 1998 he jumped ship, leaving DEC after 14 years. It was by far the longest he’d stay at any workplace in his career.
A breakthrough at an upstart
AMD at the time was run by its charismatic cofounder, CEO Jerry Sanders. Sanders’ great salesmanship skills weren’t enough to keep the company from being pounded in the marketplace by Intel. Keller helped change that, however, with a chip called the K8.
As chips became steadily more powerful, Keller recognized that a bottleneck was developing between the computing done on each chip and other parts of the computer, like the memory and storage. He also realized that as all the various components became smaller, it would be possible to simplify a chip system by integrating the processor with what had been separate chips that controlled memory and data transfers. And he had yet another simple insight: He could put two processor chips close together so they shared a single slot on a computer’s motherboard, a move that would speed up communications with memory.
Innovations like these would make the K8 ideal not just for regular PCs but also for the emerging market for server computers, then just catching on with businesses; Keller’s integrations would also greatly simplify the internal setup of every server, saving customers big bucks.
Up until then, Sanders had shot down proposals for AMD to work on chips for servers, arguing that AMD didn’t have the funds to develop the supporting ecosystem of compatible chipsets that those server chips would need. But Keller’s design for the K8 sneakily addressed those objections by integrating the functions of those chipsets on board the main chip. (Keller still exercised discretion, he recalls: “I think we got, like, halfway through the project before we told him it was also good for servers.”) Last but not least, Keller cowrote a specification known as HyperTransport, to allow data to flow between his new K8 chip and the rest of the system.
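The payoff of pulling the memory controller onto the chip can be sketched with a crude latency model. Every number below is invented for illustration; these are not AMD’s measurements.

```python
# Crude, invented-numbers model of why K8-style integration helps:
# an on-die memory controller removes a chip-to-chip hop.

def memory_latency_ns(on_die_controller):
    core = 10                                # request leaves the CPU core
    hop = 0 if on_die_controller else 40     # trip through an external chip
    dram = 50                                # the DRAM access itself
    return core + hop + dram

print(memory_latency_ns(False))  # separate controller chip: 100 ns
print(memory_latency_ns(True))   # K8-style on-die controller: 60 ns
```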
The K8 was destined to become a huge success, helping turbocharge the growth of the server industry. Updated several times since, the HyperTransport standard is still widely used in servers, including on the chips running the cloud platforms at Amazon and Google.
But Keller didn’t stick around long enough to celebrate the victory. The K8 was still in the early phases of development when Keller was invited by a group of former DEC colleagues, led by Alpha engineer Dan Dobberpuhl, to join a startup called SiByte. Keller left after barely a year at AMD, the first of several relatively short stints that would become a hallmark of his career.
It was unusual for a designer to leave so early in a chip’s development, and some members of Keller’s AMD team were disappointed with his decision. Some months later, the team held a confidence-boosting dinner at a fancy French bistro. Afterward, the crew returned to their office and decided to remove a wall separating engineers from others on the team, on the spot, that night, with sledgehammers.
It was a team-building exercise, but also an expression of frustration at Keller’s departure, says Fred Weber, AMD’s chief technology officer at the time. “I would never say he doesn’t finish things—he absolutely does—but he leaves earlier than most,” Weber tells Fortune. “He is more of a front-end-of-a-project guy. The good news is his front ends do so much and set such a good direction.”
Keller describes his itinerant ways slightly less diplomatically. “I’m an engineer’s engineer,” he says. “Engineers like to work. I want to get the bullshit out of the way and get clear, interesting problems to solve.”
The K8 eventually came out in 2003. The chip was officially named Opteron—but AMD marketed one version as Sledgehammer.
Earth, sea, sky
The search for new problems to solve became a major feature of Keller’s leisure time, too. At DEC, as his seniority and income rose, he began buying a series of muscle cars—and using that muscle at high speeds on the roads. He also began pursuing an athletic avocation: windsurfing—making long trips from Boston to Hawaii with DEC colleagues to hit the best surfing sites.
One weekend, Dan Leibholz, who had worked for Keller at DEC, went to visit his boss at home. Keller had already given him some sailing lessons, but that day he pulled one of his windsurfing rigs out of the garage and gifted it to Leibholz, who’s now the chief technology officer of Analog Devices. “He wanted to make sure I’d actually go out and do it,” Leibholz recalls.
The pursuit of waves and highway thrills continued for Keller after he relocated to California in the late 1990s. Some former colleagues recall him regularly bidding for airline tickets to Hawaii on the “name your price” travel website Priceline. When his low bids won, he’d spend the weekend in Maui. Apple, where Keller worked for four years, encouraged employees to rent environmentally friendly cars on business trips, but Keller pushed the pokey rentals to the edge. On one trip, he racked up a speeding ticket while driving a rented hybrid; a coworker promptly mocked up an official-looking nameplate for Keller’s office door that identified his title as “Prius Racer.”
By the mid-2010s, Keller had added the more technically challenging sport of kitesurfing to his oceanic repertoire. He had also learned how to pilot stunt airplanes. “I had the utmost respect, but I wasn’t going to jump into one of those two-seater jets with him,” recalls John Byrne, a former top marketing and sales officer at AMD and now head of North America for Dell. Others did, and they tried not to lose the contents of their stomachs as Keller took them through barrel rolls and steep vertical climbs in the sky over Northern California.
Keller declined to discuss his high-adrenaline hobbies in detail. But Leibholz sees a connection between kitesurfing and Keller’s strength at finding seemingly simple solutions to thorny problems in chip design. Kitesurfing is “a fun thing to do, but it’s also incredibly technical and intense and really hard,” he says. “You’re riding above the waves, but right underneath that is an incredible amount of depth and complexity to what you’re doing.”
Lessons learned at Apple
Moving to Silicon Valley kept Keller close to the center of the action in chip design. SiByte, which focused on network processors, was bought in November 2000 by Broadcom, where Keller took on the title of chief architect. There, Keller pioneered so-called dual-core designs, which represented another leap in simplification: They essentially included the guts of two computing chips side by side on the same piece of silicon, which made the resulting chip faster and more energy efficient. Broadcom incorporated the chip into routers that moved huge amounts of data around the world; later in the decade, dual-core chips would make their way into PCs.
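A rough way to see the appeal of a second core is Amdahl’s law, which bounds the speedup when only part of a workload can run in parallel. The 95% parallel fraction below is an assumption chosen to resemble packet routing, not a Broadcom figure.

```python
# Amdahl's law: speedup is limited by the serial share of the work.

def amdahl_speedup(parallel_fraction, cores):
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

# Routing is mostly parallel work on independent packets (assumed 95%).
print(f"{amdahl_speedup(0.95, 2):.2f}x")  # ~1.90x from the second core
```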
Keller remained restless for the next interesting problem. In 2004, he joined another Dobberpuhl startup, P.A. Semi, which focused on high-performance, low-power processors. And in 2008 he jumped to Apple (just before Apple acquired P.A. Semi). For Keller, the attraction of the Cupertino juggernaut was twofold: to learn from one of the world’s toughest and most successful CEOs, Steve Jobs, and to immerse himself in the emerging field of smartphones.
The first three versions of Apple’s iconic iPhone ran on chips from Samsung; Keller joined a team tasked with designing Apple’s own line of chips. Starting with the iPhone 4, the phones used designs Keller had worked on. Jobs called the first chip the A4, though it was only the third chip used in the iPhone line, hoping the name might mislead any competitors who caught wind of it, Keller recalls. Keller had his greatest impact on Apple’s A6 and A7 chips, which powered the iPhone 5 and 5s. The designs weren’t just faster than those of competitors: Apple optimized the chips for smoother graphics, making iPhone rivals seem janky by comparison. The chips also accelerated the iPhone’s speech processing, just in time to power Apple’s new Siri digital assistant.
Keller didn’t work directly for Jobs: His bosses, Bob Mansfield and Mike Culbert, heard from the demanding and sometimes impetuous leader. But Keller says he learned a lot about “intense engineering” from Jobs and Mansfield. “Their idea of focus is so much narrower than anything I’d ever seen,” Keller recalls. “You would die to hit your schedule.” Keller also absorbed a Jobs aphorism that would resonate with many of Keller’s most successful projects: “Once you know what’s the right thing to do, that’s all you should ever work on.”
By 2012, Keller was ready to put his new insights to work at an old employer—AMD. By then, AMD had lost the technical lead it got from the K8, and its leading-edge chips were far behind Intel’s best offerings. Keller could see why. He found AMD’s designs convoluted and difficult to improve—an example of something he’d witnessed before in the industry, when good engineers could get caught up optimizing their older chip designs for too long.
Keller saw an opportunity to start over with a clean slate. Chip manufacturers’ continual advances were making chips faster and more powerful, but they were also creating new problems: The fastest chips now often came very close to overheating, putting a ceiling on how much faster they could perform. But Keller had spotted a new technical advance that could help address the problem—chiplets.
Chiplets are essentially the Lego bricks of chip design: They’re smaller, separately manufactured blocks of silicon that can be snapped together to assemble a bigger, more complex chip. Keller realized that he could build new chips for highly computing-intensive activities—deep learning, for example, or graphics-rich video games—by snapping together a few chiplets. The resulting design would be less expensive than a single integrated chip, but still powerful—and the modular setup allowed Keller to add computing power without generating too much heat. The chiplets could also work in larger configurations for the huge servers that cloud computing data centers needed in growing numbers.
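One standard back-of-envelope argument for why chiplets come out cheaper is a Poisson defect-yield model: the chance a die escapes fabrication defects falls off exponentially with its area. The die sizes and defect density below are assumptions for illustration, not Intel or AMD figures.

```python
import math

# Poisson yield model with invented numbers: small dies survive
# fabrication defects far more often than one giant die does.

def die_yield(area_cm2, defects_per_cm2=0.2):
    """Probability a die of the given area has zero defects."""
    return math.exp(-area_cm2 * defects_per_cm2)

print(f"monolithic 6 cm^2 die: {die_yield(6.0):.0%}")   # ~30% usable
print(f"0.75 cm^2 chiplet:     {die_yield(0.75):.0%}")  # ~86% usable
# A defect ruins much less silicon when dies are small, so eight good
# chiplets snapped together can cost less than one flawless giant die.
```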
I’m an engineer’s engineer. Engineers like to work. I want to get the bullshit out of the way and get clear, interesting problems to solve.
Jim Keller
Implementing Keller’s idea would mean starting from scratch, and that fact provoked substantial internal opposition at AMD. Keller recalls people telling him to his face that he was going to flop; he responded by channeling his inner Steve Jobs. At one town hall meeting, says John Byrne, Keller became agitated enough to face down critics directly. “He told them, ‘This is what we’re doing. I’ve built the framework. Wait and you’ll see the outcome,’” Byrne recalls. “He had that maniacal focus.”
The first chips from Keller’s chiplet design, known as the Ryzen line, didn’t hit the market until 2017. They immediately created a stir by undercutting Intel chips on price and, in some cases, beating Intel on performance. By 2019, the third generation of Ryzen chips, still drawn from Keller’s design, was stomping the competition on nearly all measures. Not coincidentally, AMD’s stock price rose 2,303% in the five years through the end of April—a return almost 30 times as great as Intel’s 78%.
Characteristically, Keller was long gone by the time the Ryzen hit the market.
Computing on four wheels
Keller spent plenty of time pushing the speed limit in cars over the years. But he hadn’t really thought of cars as a computing challenge until 2015—when he heard an overture from some former Apple colleagues who had moved on to Tesla. Tesla founder Elon Musk wanted to build self-driving cars—a goal that required each car to carry substantial internal computing power. Musk had tried chips from Intel’s Mobileye and Nvidia, but still wasn’t satisfied.
In what amounted to his job interview, Keller convinced Musk that he could design a proprietary chip to run Tesla’s Autopilot software 10 times as fast as the competition’s. At the same time, Musk convinced Keller that he was the kind of talented leader who could help the chip designer add another “Ph.D.” to his metaphorical résumé. Keller started in January 2016.
At Tesla, Keller’s key to success, once again, was simplification. Once he understood how Tesla’s software would operate, Keller found that he could leave out or minimize components that Nvidia included in its chips but that weren’t as relevant to Tesla’s software. Keller’s chips began shipping in Tesla’s Model 3 and other models in 2019. By the company’s benchmarks, the new chip delivered a 20x performance jump, double what Keller had promised. Although regulators haven’t yet unleashed self-driving cars, the Tesla technology is impressive: A recent addition running on Keller’s chip design helps the Model 3 automatically stop at red lights and stop signs.
Keller also found himself fascinated by Tesla’s manufacturing operation; he liked to wander the factory floor in Fremont, Calif., and watch the cars being assembled. Observing the process led Keller to a revelation: While many parts of a car are meant to last five or 10 years, the electronic chips that powered the software would need to be updated more frequently, maybe every two or three years. Keller persuaded Tesla to reengineer how the computing components connected to the rest of the car, enabling the company to more easily swap out a chip board and update it. This new modularity now enables Tesla to promise free hardware upgrades for customers who pay for its self-driving feature.
Intel, at last
By the beginning of 2018, Intel was growing desperate for engineering help, amid delays in bringing new chips to market and the struggles of its tablet and 5G efforts. It was the very depth of those struggles that attracted Keller, who left Tesla and joined Intel, his on-and-off rival for decades, in April 2018.
“Intel reminds me of Digital [DEC],” Keller says. “It has the technical excellence and [culture of] collaboration, but sometimes the collaboration goes way too far.” One meeting he attended early on had 50 participants debating what he considered a simple topic. “At Tesla if that ever happened, Elon would just kill everybody,” he says. Drawing on insights he gained from both Musk and Jobs, Keller has been seeking to streamline procedures, reduce the size of teams, and cut back on meetings. He has also replaced all the nontechnical managers overseeing engineers in his division. “If you take a nontechnical manager a problem you’re having, [the manager] just has one more problem he can’t solve,” Keller explains.
Nearly 40 years into his career, Keller isn’t just relying on old relationships with the DEC gang at Intel. He’s been constantly expanding his network at every stop. Sundari Mitra, cofounder of a chip-design startup called NetSpeed, recalls visiting Tesla in February 2016 to present the company’s processor-design methodology to Keller. When the team’s first slide went up, Keller grimaced. Mitra sensed a fellow chip intellectual who didn’t need or appreciate the marketing chatter in her deck, and the two quickly began dissecting the problem in a deeper way on a whiteboard. “Jim doesn’t care about the high level; he gets that intuitively,” Mitra recalls. “He wanted the third level and the fourth and the fifth level.” Keller soon became Mitra’s mentor, and in September 2018, Intel acquired NetSpeed, reuniting mentor and pupil.
Keller won’t talk much about the massive chip redesign he’s overseeing—chip designers seldom do—and Intel’s new chip probably won’t be ready for another year or two. Still, both Intel and Keller have scattered some clues about how the chips might work. The new chips will cleanly separate major functions, to make it easier for the company to improve one section at a time—an approach that evokes the chiplet model Keller used at AMD. Keller also hints that Intel’s low-power Atom line of chips may figure more prominently in his future designs for PCs and servers. Artificial intelligence capabilities are clearly on the agenda: Keller has been haunting A.I. symposia and reading prodigiously, learning everything he can about where the field of A.I. applications is likely to go for the next five or 10 years.
He’s the Forrest Gump of our industry. He keeps being in the middle of the interesting stuff and making a difference.
Fred Weber, former chief technology officer of AMD
Intel may also turn to outside vendors’ technology for some features, instead of always inventing its own solutions. “Prior to Jim coming, I don’t think that’s a transformation that we would have undertaken,” says Boyd Phelps, a vice president in Keller’s engineering group.
Analysts are split over whether Keller will be as successful at Intel as he’s been at so many of his past stops. “They need something big, that’s for sure, and he’s a big name,” says Bernstein Research analyst Stacy Rasgon. “I don’t know how he’s going to deal with all the legacy baggage. Check back in a few years.”
Of course, Keller has a history of converting legacy baggage into something lighter, simpler, and sleeker. After most of the country went into lockdown, Keller spoke with Fortune by phone from his home in Silicon Valley. His two teenage daughters were busy grappling with online classes, but Keller wasn’t worried about them. “These things normalize so fast,” he said. “Humans figure everything out.” And some humans do that much faster than others.
An all-star chip lineup
Over a nearly 40-year career, Jim Keller has designed some of the tech industry’s most impressive and groundbreaking microchips. Here are a few, along with the year they reached the market.
Digital Alpha 21264 (1996)
The first chip to run at 500 MHz, with a memory cache that hit 1 GHz, unheard-of speeds for its time. The 21264 was also among the first chips to run software instructions out of order to increase performance.
AMD Opteron (2003)
One of the first 64-bit x86 processors, the Opteron was built for servers and pioneered a data communications standard called HyperTransport that’s still widely used in cloud computing today.
Apple A4 (2010)
Keller focused on improving graphics capability in the chip, which powered the original iPad and the iPhone 4 and enabled Apple’s first high-resolution “retina” displays.
Tesla Autopilot (2019)
Tesla says that its first in-house A.I. chip, designed to enable autonomous driving, performs 20 times as fast as the Nvidia chip it replaced.
Intel Tremont (2020)
One of the first Keller-influenced designs from Intel, the low-power chip is intended to run small portable devices but could scale up to PCs.