
Seeking Big A.I. Advances, a Startup Turns to a Huge Computer Chip

By Tiernan Ray
August 19, 2019, 8:30 AM ET
Courtesy of Cerebras Systems

Tucked in the Los Altos hills near the Stanford University campus, in a low-slung bunker of offices across from a coffee shop, is a lab overflowing with blinking machines putting circuits through their paces to test for speed, the silicon equivalent of a tool and die shop. Most chips you can balance on the tip of your finger, measuring just a centimeter on a side. Something very different is emerging here. 

Andrew Feldman, 50, chief executive of startup Cerebras Systems, holds up both hands, bracing between them a gleaming slab the size of a large mouse pad, an exquisite array of interconnecting lines etched in silicon that shines a deep amber under the dull fluorescent lights. At eight and a half inches on each side, it is the biggest computer chip the world has ever seen. With chips like this, Feldman expects that artificial intelligence will be reinvented, as they provide the parallel-processing speed that Google and others will need to build neural networks of unprecedented size. 

Four hundred thousand little computers, known as “cores,” cover the chip’s surface. Ordinarily, they would each be cut into separate chips to yield multiple finished parts from a round silicon wafer. In Cerebras’s case, the entire wafer is used to make a multi-chip computer, a supercomputer on a slab. 

Companies have tried for decades to build a single chip the size of a silicon wafer, but Cerebras’s appears to be the first ever to make it out of the lab and into a commercially viable product. The company is calling the chip the “Wafer-Scale Engine,” or WSE—pronounced “wise,” for short (and for branding purposes).

“Companies will come and tell you, we have some little knob that makes us faster,” says Feldman of the raft of A.I. chip companies in the Valley. “For the most part it’s true, it’s just that they’re dealing with the wrong order of magnitude.” Cerebras’s chip is 57 times the size of the leading chip from Nvidia, the “V100,” which dominates today’s A.I. And it has more memory circuits than have ever been put on a chip: 18 gigabytes, which is 3,000 times the on-chip memory of the Nvidia part. 
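For a rough sense of that size comparison, here is a back-of-envelope check in Python. The 8.5-inch side comes from the article; the V100 die area of roughly 815 square millimeters is an assumption drawn from public chip specifications, not a figure from this piece.

```python
# Back-of-envelope check of the chip-size comparison above (illustrative only).
wse_side_mm = 8.5 * 25.4           # ~8.5 inches per side, converted to millimeters
wse_area_mm2 = wse_side_mm ** 2    # roughly 46,600 mm^2 of silicon
v100_area_mm2 = 815                # assumed die area of Nvidia's V100 GPU

print(f"WSE area:  {wse_area_mm2:,.0f} mm^2")
print(f"V100 area: {v100_area_mm2:,} mm^2")
print(f"Ratio:     {wse_area_mm2 / v100_area_mm2:.0f}x")   # comes out to roughly 57x
```

Under those assumptions, the ratio lands at about 57, consistent with the figure cited above.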

“It’s obviously very different from what anyone else is doing,” says Linley Gwennap, a longtime chip observer who publishes a distinguished chip newsletter, Microprocessor Report. “I hesitate to call it a chip.” Gwennap is “blown away” by what Cerebras has done, he says. “No one in their right mind would have even tried it.”

Seeking a bigger chip for greater speeds

A.I. has become a ferocious consumer of chip technology, constantly demanding faster parts. That demand has led to a nearly $3 billion business almost overnight for the industry heavyweight, $91 billion Nvidia. But even with machines filled with dozens of Nvidia’s graphics chips, or GPUs, it can take weeks to “train” a neural network—the process of tuning it until it can solve a given problem. Bundling multiple GPUs together in a computer starts to show diminishing returns once more than eight of the chips are combined, says Gwennap. The industry simply cannot build machines powerful enough with existing parts. 
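Gwennap’s point about diminishing returns can be illustrated with a toy scaling model; the per-GPU synchronization overhead below is an assumption chosen for illustration, not a measured number.

```python
# Toy scaling model (illustrative only): each training step costs
# compute_time / n on n GPUs, plus a synchronization overhead that grows
# with every GPU added. The constants are assumptions, not measurements.
compute_time = 1.0        # hypothetical per-step compute time on one GPU
comm_per_gpu = 0.02       # hypothetical overhead added per additional GPU

for n in (1, 2, 4, 8, 16, 32):
    step_time = compute_time / n + comm_per_gpu * (n - 1)
    print(f"{n:2d} GPUs -> speedup {compute_time / step_time:4.1f}x")
```

In this toy model the speedup peaks around eight GPUs and then falls off, which is the qualitative behavior Gwennap describes.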

“The hard part is moving data,” explains Feldman. Training a neural network requires thousands of operations to happen in parallel at each moment in time, and chips must constantly share data as they crunch those parallel operations. But computers with multiple chips get bogged down trying to pass data back and forth between the chips over the slower wires that link them on a circuit board. Something was needed that could move data at the speed of the chip itself. The solution was to “take the biggest wafer you can find and cut the biggest chip out of it that you can,” as Feldman describes it. 
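A second toy calculation shows why keeping the data on one piece of silicon helps. Both bandwidth figures below are illustrative assumptions, not specifications from Cerebras or Nvidia.

```python
# Illustrative only: time to move the same amount of activation data over
# an on-chip fabric versus a board-level link between separate chips.
data_gb = 2.0                  # hypothetical data exchanged per training step
on_chip_bw_gb_s = 9_000.0      # assumed aggregate on-chip fabric bandwidth (GB/s)
board_link_bw_gb_s = 32.0      # assumed chip-to-chip link bandwidth (GB/s)

t_on_chip = data_gb / on_chip_bw_gb_s
t_board = data_gb / board_link_bw_gb_s
print(f"on-chip transfer:   {t_on_chip * 1e3:.2f} ms")
print(f"chip-to-chip links: {t_board * 1e3:.2f} ms ({t_board / t_on_chip:.0f}x slower)")
```

However the specific numbers shake out in a real system, the gap between on-chip wires and board-level links is what a wafer-sized chip is designed to sidestep.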

To do that, Cerebras had to break a lot of rules. Chip designers use software programs from companies such as Cadence Design Systems and Synopsys to lay out a “floor plan”—the arrangement of transistors, the individual units on a chip that move electrons to represent bits. But conventional chips have only billions of transistors, whereas Cerebras has put 1.2 trillion of them in a single part. The Cadence and Synopsys tools couldn’t even lay out such a floor plan: It would be like using a napkin to sketch the ceiling of the Sistine Chapel. So Cerebras built its own design software. 

Wafers incur defects when circuits are burned into them, and those areas become unusable. Nvidia, Intel, and other makers of “normal” smaller chips can get around that by cutting the good chips out of a wafer and scrapping the rest. You can’t do that if the entire wafer is the chip. So Cerebras had to build in redundant circuits that route around defects and still deliver 400,000 working cores, like a miniature internet that keeps going when individual server computers go down. The wafers were produced in partnership with Taiwan Semiconductor Manufacturing, the world’s largest contract chip manufacturer, but Cerebras has exclusive rights to the intellectual property that makes the process possible.
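The logic behind that redundancy can be sketched with a standard Poisson defect-yield model. The core area, defect density, and spare-core count below are illustrative assumptions, not Cerebras figures.

```python
import math

# Illustrative Poisson yield model: a block of area A (cm^2) on a wafer with
# defect density D (defects per cm^2) is defect-free with probability exp(-A*D).
# All numbers here are assumptions chosen for illustration.
core_area_cm2 = 0.001      # hypothetical area of one small core
defect_density = 0.1       # hypothetical defects per square centimeter
total_cores = 420_000      # hypothetical cores fabricated, including spares

p_core_good = math.exp(-core_area_cm2 * defect_density)
print(f"P(one core is defect-free):   {p_core_good:.6f}")
print(f"Expected working cores:       {total_cores * p_core_good:,.0f}")

# Without redundancy, every single core would have to be perfect:
p_all_good = p_core_good ** total_cores
print(f"P(all cores good, no spares): {p_all_good:.1e}")
```

Even with a tiny per-core failure rate, the odds of a spare-free wafer coming out perfect are effectively zero; a modest reserve of spare cores plus a fabric that routes around bad ones is what makes 400,000 working cores deliverable.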

A side-by-side comparison of the Wafer-Scale Engine and one of the largest currently available A.I. chips.
Courtesy of Cerebras Systems

In another break with industry practice, the chip won’t be sold on its own, but will be packaged into a computer “appliance” that Cerebras has designed. One reason is the need for a complex system of water-cooling, a kind of irrigation network to counteract the extreme heat generated by a chip running at 15 kilowatts of power. 
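To put 15 kilowatts in perspective, here is a rough comparison; the 300-watt figure is an assumption based on typical data-center GPU power ratings, not a number from the article.

```python
wse_power_w = 15_000       # power draw cited in the article
gpu_power_w = 300          # assumed typical power of one data-center GPU

print(f"Roughly {wse_power_w / gpu_power_w:.0f} GPUs' worth of heat "
      f"concentrated on a single slab of silicon")
```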

“You can’t do that with a chip designed to plug into any old Dell server,” says Feldman. “If you build a Ferrari engine, you want to build the entire Ferrari,” is his philosophy. He estimates his computer will be 150 times as powerful as a server with multiple Nvidia chips, at a fraction of the power consumption and a fraction of the physical space required in a server rack. As a result, he predicts, A.I. training tasks that cost tens of thousands of dollars to run in cloud computing facilities can be an order of magnitude less costly.

Feldman’s partner in crime, co-founder Gary Lauterbach, 63, has been working on chips for 37 years and has 50 patents on the techniques and tricks of the art of design. He and Feldman are on their second venture together, having sold their last company, SeaMicro, to AMD. “It’s like a marriage,” says Feldman of the twelve years they’ve been collaborating.

Adding depth to ‘deep learning’

Feldman and Lauterbach have raised just over $200 million from prominent venture capitalists on the belief that size matters in making A.I. move forward. Backers include Benchmark, which funded Twitter, Snap, and WeWork, as well as angel investors such as Fred Weber, a legendary chip designer and former chief technology officer at Advanced Micro Devices, and Ilya Sutskever, an A.I. scientist at the well-known not-for-profit lab OpenAI and co-creator of AlexNet, one of the most famous programs for recognizing objects in pictures. 

Today’s dominant form of artificial intelligence is called “deep learning” because scientists keep adding more layers of calculations that need to be performed in parallel. Much of the field still relies on neural nets designed 30 years ago, Feldman maintains, because until now what could fit on a chip has been limited. Still, even such puny networks improve as the chips they run on get faster. Feldman expects to contribute to a speed-up that will yield not just a quantitative improvement in A.I. but a qualitative leap in the deep networks that can be built. “We are just at the beginning of this,” he says. 

“What’s really interesting here is that they have done not one but two really important things, which is very unusual, because startups usually do only one thing,” says Weber, the former AMD executive. There’s the novel A.I. machine, but also the creation of a platform for making wafer-scale chips. The latter achievement could be worth a billion dollars in its own right, Weber believes, if Cerebras ever wanted to design chips for other companies; in his view, “They have created two Silicon Valley ‘unicorns’ in one company.”

A road-test awaits

It remains to be seen, however, whether Cerebras can keep all of the 400,000 cores humming in harmony, because no one has ever programmed such a large device before. “That’s the real problem with this huge number of cores,” says analyst Gwennap. “You have to divide up a task to fit across them all and use them all effectively.”
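What “dividing up a task” might look like can be sketched in miniature. The layer names, their relative compute weights, and the proportional allocation rule below are all hypothetical; real wafer-scale compilers face a far harder placement-and-routing problem.

```python
# Toy sketch (illustrative only): split a network's layers across the cores
# in proportion to how much compute each layer needs.
total_cores = 400_000
layer_work = {"conv1": 5, "conv2": 20, "conv3": 40, "fc1": 25, "fc2": 10}  # hypothetical weights

total_work = sum(layer_work.values())
allocation = {name: round(total_cores * w / total_work) for name, w in layer_work.items()}

for name, cores in allocation.items():
    print(f"{name}: {cores:,} cores")
print(f"allocated {sum(allocation.values()):,} of {total_cores:,} cores")
```

Keeping all of those partitions busy at once, and feeding them data fast enough, is the programming challenge Gwennap is pointing to.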

“Until we see benchmarks, it’s hard to assess how good the design is for A.I.,” says Gwennap. 

Cerebras is not disclosing performance statistics yet, but Feldman promises those will follow once the first systems ship to customers in September. Some customers have already received prototypes, and the results are competitive, Feldman asserts, though he is not yet disclosing their names. “We are reducing training time from months to minutes,” he says. “We are seeing improvements that are not a little bit better but a lot.”

What gives Feldman conviction is not merely test results, but also the long view of a Valley graybeard. “Every time there has been a shift in the computing workload, the underlying machine has had to change,” he observes, “and that presents opportunity.” 

The amount of data to be processed has grown vastly larger in the “Big Data” era, but the pace of performance improvements at Nvidia and Intel has slowed dramatically. Feldman expects that in the years to come, A.I. will account for a third of all computing activity. If he’s right, just like in the movie “Jaws,” the whole world is going to need a bigger boat.
